Apr 17 17:05:08.304121 ip-10-0-132-98 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory Apr 17 17:05:08.304132 ip-10-0-132-98 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory Apr 17 17:05:08.304140 ip-10-0-132-98 systemd[1]: kubelet.service: Failed with result 'resources'. Apr 17 17:05:08.304368 ip-10-0-132-98 systemd[1]: Failed to start Kubernetes Kubelet. Apr 17 17:05:19.646247 ip-10-0-132-98 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found. Apr 17 17:05:19.646259 ip-10-0-132-98 systemd[1]: kubelet.service: Failed with result 'resources'. -- Boot 9cd898077a314f25a5d7e8658f8e9821 -- Apr 17 17:07:41.823883 ip-10-0-132-98 systemd[1]: Starting Kubernetes Kubelet... Apr 17 17:07:42.207080 ip-10-0-132-98 kubenswrapper[2574]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 17 17:07:42.207080 ip-10-0-132-98 kubenswrapper[2574]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Apr 17 17:07:42.207080 ip-10-0-132-98 kubenswrapper[2574]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 17 17:07:42.207080 ip-10-0-132-98 kubenswrapper[2574]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Apr 17 17:07:42.207080 ip-10-0-132-98 kubenswrapper[2574]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 17 17:07:42.207881 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.207801 2574 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 17 17:07:42.211522 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211500 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:07:42.211522 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211520 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:07:42.211522 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211524 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:07:42.211522 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211527 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:07:42.211522 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211530 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:07:42.211726 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211533 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:07:42.211726 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211536 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:07:42.211726 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211539 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:07:42.211726 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211541 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:07:42.211726 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211544 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:07:42.211726 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211547 2574 
feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:07:42.211726 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211549 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 17:07:42.211726 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211553 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 17:07:42.211726 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211557 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:07:42.211726 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211560 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 17:07:42.211726 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211563 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:07:42.211726 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211566 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:07:42.211726 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211569 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:07:42.211726 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211572 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 17:07:42.211726 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211574 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:07:42.211726 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211577 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:07:42.211726 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211580 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:07:42.211726 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211583 2574 feature_gate.go:328] unrecognized feature gate: NewOLM 
Apr 17 17:07:42.211726 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211585 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:07:42.212200 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211588 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:07:42.212200 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211590 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:07:42.212200 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211593 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 17:07:42.212200 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211596 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 17:07:42.212200 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211599 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:07:42.212200 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211616 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:07:42.212200 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211619 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:07:42.212200 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211625 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:07:42.212200 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211627 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 17:07:42.212200 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211630 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 17:07:42.212200 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211633 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:07:42.212200 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211636 2574 
feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:07:42.212200 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211638 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 17:07:42.212200 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211641 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 17:07:42.212200 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211643 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 17:07:42.212200 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211647 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 17:07:42.212200 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211650 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:07:42.212200 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211653 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 17:07:42.212200 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211656 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 17:07:42.212200 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211659 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:07:42.212697 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211661 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:07:42.212697 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211664 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:07:42.212697 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211667 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 17 17:07:42.212697 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211669 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 
17:07:42.212697 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211672 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:07:42.212697 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211674 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:07:42.212697 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211677 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:07:42.212697 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211679 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:07:42.212697 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211682 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 17:07:42.212697 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211684 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:07:42.212697 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211687 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:07:42.212697 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211689 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:07:42.212697 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211692 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:07:42.212697 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211695 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 17:07:42.212697 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211697 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:07:42.212697 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211700 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:07:42.212697 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211705 2574 
feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 17:07:42.212697 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211707 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:07:42.212697 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211710 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:07:42.213150 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211713 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:07:42.213150 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211715 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:07:42.213150 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211718 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 17:07:42.213150 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211721 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:07:42.213150 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211724 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:07:42.213150 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211726 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:07:42.213150 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211729 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:07:42.213150 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211731 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:07:42.213150 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211734 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 17:07:42.213150 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211736 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 17:07:42.213150 ip-10-0-132-98 kubenswrapper[2574]: 
W0417 17:07:42.211740 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:07:42.213150 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211742 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 17:07:42.213150 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211745 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:07:42.213150 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211747 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:07:42.213150 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211751 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:07:42.213150 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211759 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:07:42.213150 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211762 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:07:42.213150 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211764 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:07:42.213150 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211767 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:07:42.213150 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211769 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:07:42.213624 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211772 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:07:42.213624 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211775 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:07:42.213624 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.211777 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:07:42.213624 
ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212196 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:07:42.213624 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212202 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:07:42.213624 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212205 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:07:42.213624 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212208 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:07:42.213624 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212210 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:07:42.213624 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212213 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:07:42.213624 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212215 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 17:07:42.213624 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212218 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 17:07:42.213624 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212220 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:07:42.213624 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212223 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:07:42.213624 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212225 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:07:42.213624 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212228 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:07:42.213624 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212231 2574 
feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:07:42.213624 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212233 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 17:07:42.213624 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212236 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:07:42.213624 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212238 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:07:42.214074 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212241 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:07:42.214074 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212243 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:07:42.214074 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212246 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 17:07:42.214074 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212248 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 17:07:42.214074 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212251 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:07:42.214074 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212253 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 17:07:42.214074 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212255 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:07:42.214074 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212258 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:07:42.214074 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212266 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:07:42.214074 ip-10-0-132-98 
kubenswrapper[2574]: W0417 17:07:42.212268 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:07:42.214074 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212271 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:07:42.214074 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212273 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:07:42.214074 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212276 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:07:42.214074 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212278 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 17:07:42.214074 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212281 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:07:42.214074 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212283 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:07:42.214074 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212286 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:07:42.214074 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212288 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:07:42.214074 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212291 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:07:42.214074 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212293 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 17:07:42.214586 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212295 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 17:07:42.214586 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212298 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:07:42.214586 ip-10-0-132-98 
kubenswrapper[2574]: W0417 17:07:42.212300 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:07:42.214586 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212303 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 17:07:42.214586 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212305 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:07:42.214586 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212307 2574 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 17:07:42.214586 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212310 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:07:42.214586 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212312 2574 feature_gate.go:328] unrecognized feature gate: Example Apr 17 17:07:42.214586 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212315 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:07:42.214586 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212318 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:07:42.214586 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212320 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 17:07:42.214586 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212322 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:07:42.214586 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212325 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:07:42.214586 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212327 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:07:42.214586 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212329 2574 feature_gate.go:328] unrecognized feature gate: 
VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 17:07:42.214586 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212332 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 17:07:42.214586 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212334 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:07:42.214586 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212337 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:07:42.214586 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212339 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 17:07:42.214586 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212341 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:07:42.215070 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212344 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:07:42.215070 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212352 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 17:07:42.215070 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212354 2574 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 17:07:42.215070 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212357 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:07:42.215070 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212359 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:07:42.215070 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212362 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 17:07:42.215070 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212367 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. 
It will be removed in a future release. Apr 17 17:07:42.215070 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212370 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:07:42.215070 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212374 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:07:42.215070 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212376 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 17:07:42.215070 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212379 2574 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:07:42.215070 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212381 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:07:42.215070 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212384 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:07:42.215070 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212386 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:07:42.215070 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212389 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:07:42.215070 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212392 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:07:42.215070 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212394 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 17:07:42.215070 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212397 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:07:42.215070 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212399 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:07:42.215590 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212402 2574 
feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:07:42.215590 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212405 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:07:42.215590 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212408 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:07:42.215590 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212410 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:07:42.215590 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212413 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:07:42.215590 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212417 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:07:42.215590 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212419 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:07:42.215590 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212423 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 17:07:42.215590 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212426 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:07:42.215590 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212428 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:07:42.215590 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.212431 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:07:42.215590 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212555 2574 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 17:07:42.215590 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212567 2574 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 17:07:42.215590 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212576 2574 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 17:07:42.215590 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212581 2574 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 17:07:42.215590 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212591 2574 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 17:07:42.215590 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212594 2574 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 17:07:42.215590 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212598 2574 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 17:07:42.215590 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212615 2574 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 17:07:42.215590 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212619 2574 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 17:07:42.215590 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212622 2574 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 17:07:42.216106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212626 2574 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 17:07:42.216106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212629 2574 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 17:07:42.216106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212632 2574 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 17:07:42.216106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212634 2574 flags.go:64] FLAG: --cgroup-root=""
Apr 17 17:07:42.216106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212637 2574 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 17:07:42.216106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212640 2574 flags.go:64] FLAG: --client-ca-file=""
Apr 17 17:07:42.216106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212643 2574 flags.go:64] FLAG: --cloud-config=""
Apr 17 17:07:42.216106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212646 2574 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 17:07:42.216106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212648 2574 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 17:07:42.216106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212655 2574 flags.go:64] FLAG: --cluster-domain=""
Apr 17 17:07:42.216106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212658 2574 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 17:07:42.216106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212661 2574 flags.go:64] FLAG: --config-dir=""
Apr 17 17:07:42.216106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212663 2574 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 17:07:42.216106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212667 2574 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 17:07:42.216106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212671 2574 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 17:07:42.216106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212674 2574 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 17:07:42.216106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212677 2574 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 17:07:42.216106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212680 2574 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 17:07:42.216106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212683 2574 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 17:07:42.216106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212685 2574 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 17:07:42.216106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212688 2574 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 17:07:42.216106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212691 2574 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 17:07:42.216106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212694 2574 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 17:07:42.216106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212698 2574 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 17:07:42.216106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212701 2574 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 17:07:42.216703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212704 2574 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 17:07:42.216703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212707 2574 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 17:07:42.216703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212715 2574 flags.go:64] FLAG: --enable-server="true"
Apr 17 17:07:42.216703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212718 2574 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 17:07:42.216703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212725 2574 flags.go:64] FLAG:
--event-burst="100"
Apr 17 17:07:42.216703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212728 2574 flags.go:64] FLAG: --event-qps="50"
Apr 17 17:07:42.216703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212731 2574 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 17:07:42.216703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212734 2574 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 17:07:42.216703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212736 2574 flags.go:64] FLAG: --eviction-hard=""
Apr 17 17:07:42.216703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212740 2574 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 17:07:42.216703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212743 2574 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 17:07:42.216703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212746 2574 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 17:07:42.216703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212748 2574 flags.go:64] FLAG: --eviction-soft=""
Apr 17 17:07:42.216703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212751 2574 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 17:07:42.216703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212754 2574 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 17:07:42.216703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212756 2574 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 17:07:42.216703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212759 2574 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 17:07:42.216703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212762 2574 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 17:07:42.216703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212765 2574 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 17:07:42.216703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212768 2574 flags.go:64] FLAG: --feature-gates=""
Apr 17 17:07:42.216703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212772 2574 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 17 17:07:42.216703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212775 2574 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 17 17:07:42.216703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212778 2574 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 17 17:07:42.216703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212781 2574 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 17 17:07:42.216703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212784 2574 flags.go:64] FLAG: --healthz-port="10248"
Apr 17 17:07:42.216703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212787 2574 flags.go:64] FLAG: --help="false"
Apr 17 17:07:42.217319 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212789 2574 flags.go:64] FLAG: --hostname-override="ip-10-0-132-98.ec2.internal"
Apr 17 17:07:42.217319 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212792 2574 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 17 17:07:42.217319 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212795 2574 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 17 17:07:42.217319 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212798 2574 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 17 17:07:42.217319 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212801 2574 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 17 17:07:42.217319 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212804 2574 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 17 17:07:42.217319 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212806 2574 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 17 17:07:42.217319 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212809 2574 flags.go:64] FLAG: --image-service-endpoint=""
Apr 17 17:07:42.217319 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212812 2574 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 17 17:07:42.217319 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212821 2574 flags.go:64] FLAG: --kube-api-burst="100"
Apr 17 17:07:42.217319 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212824 2574 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 17 17:07:42.217319 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212827 2574 flags.go:64] FLAG: --kube-api-qps="50"
Apr 17 17:07:42.217319 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212830 2574 flags.go:64] FLAG: --kube-reserved=""
Apr 17 17:07:42.217319 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212833 2574 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 17 17:07:42.217319 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212835 2574 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 17 17:07:42.217319 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212838 2574 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 17 17:07:42.217319 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212841 2574 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 17 17:07:42.217319 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212843 2574 flags.go:64] FLAG: --lock-file=""
Apr 17 17:07:42.217319 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212846 2574 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 17 17:07:42.217319 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212848 2574 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 17 17:07:42.217319 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212851 2574 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 17 17:07:42.217319 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212856 2574 flags.go:64] FLAG:
--log-json-split-stream="false"
Apr 17 17:07:42.217319 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212858 2574 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 17 17:07:42.217866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212861 2574 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 17 17:07:42.217866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212864 2574 flags.go:64] FLAG: --logging-format="text"
Apr 17 17:07:42.217866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212867 2574 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 17 17:07:42.217866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212870 2574 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 17 17:07:42.217866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212873 2574 flags.go:64] FLAG: --manifest-url=""
Apr 17 17:07:42.217866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212876 2574 flags.go:64] FLAG: --manifest-url-header=""
Apr 17 17:07:42.217866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212880 2574 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 17 17:07:42.217866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212883 2574 flags.go:64] FLAG: --max-open-files="1000000"
Apr 17 17:07:42.217866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212887 2574 flags.go:64] FLAG: --max-pods="110"
Apr 17 17:07:42.217866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212889 2574 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 17 17:07:42.217866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212892 2574 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 17 17:07:42.217866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212895 2574 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 17 17:07:42.217866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212898 2574 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 17 17:07:42.217866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212900 2574 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 17 17:07:42.217866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212903 2574 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 17 17:07:42.217866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212906 2574 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 17 17:07:42.217866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212913 2574 flags.go:64] FLAG: --node-status-max-images="50"
Apr 17 17:07:42.217866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212916 2574 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 17 17:07:42.217866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212919 2574 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 17 17:07:42.217866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212930 2574 flags.go:64] FLAG: --pod-cidr=""
Apr 17 17:07:42.217866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212933 2574 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 17 17:07:42.217866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212939 2574 flags.go:64] FLAG: --pod-manifest-path=""
Apr 17 17:07:42.217866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212941 2574 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 17 17:07:42.217866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212945 2574 flags.go:64] FLAG: --pods-per-core="0"
Apr 17 17:07:42.218414 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212947 2574 flags.go:64] FLAG: --port="10250"
Apr 17 17:07:42.218414 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212950 2574 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 17 17:07:42.218414 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212953 2574 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0652afbf6374420c8"
Apr 17 17:07:42.218414 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212956 2574 flags.go:64] FLAG: --qos-reserved=""
Apr 17 17:07:42.218414 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212959 2574 flags.go:64] FLAG: --read-only-port="10255"
Apr 17 17:07:42.218414 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212962 2574 flags.go:64] FLAG: --register-node="true"
Apr 17 17:07:42.218414 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212964 2574 flags.go:64] FLAG: --register-schedulable="true"
Apr 17 17:07:42.218414 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212967 2574 flags.go:64] FLAG: --register-with-taints=""
Apr 17 17:07:42.218414 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212971 2574 flags.go:64] FLAG: --registry-burst="10"
Apr 17 17:07:42.218414 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212974 2574 flags.go:64] FLAG: --registry-qps="5"
Apr 17 17:07:42.218414 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212977 2574 flags.go:64] FLAG: --reserved-cpus=""
Apr 17 17:07:42.218414 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212980 2574 flags.go:64] FLAG: --reserved-memory=""
Apr 17 17:07:42.218414 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212983 2574 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 17 17:07:42.218414 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212986 2574 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 17 17:07:42.218414 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212989 2574 flags.go:64] FLAG: --rotate-certificates="false"
Apr 17 17:07:42.218414 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212992 2574 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 17 17:07:42.218414 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212995 2574 flags.go:64] FLAG: --runonce="false"
Apr 17 17:07:42.218414 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.212997 2574 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 17 17:07:42.218414
ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.213000 2574 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 17 17:07:42.218414 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.213003 2574 flags.go:64] FLAG: --seccomp-default="false"
Apr 17 17:07:42.218414 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.213019 2574 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 17 17:07:42.218414 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.213023 2574 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 17 17:07:42.218414 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.213026 2574 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 17 17:07:42.218414 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.213040 2574 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 17 17:07:42.218414 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.213044 2574 flags.go:64] FLAG: --storage-driver-password="root"
Apr 17 17:07:42.218414 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.213047 2574 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 17 17:07:42.219021 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.213050 2574 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 17 17:07:42.219021 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.213054 2574 flags.go:64] FLAG: --storage-driver-user="root"
Apr 17 17:07:42.219021 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.213063 2574 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 17 17:07:42.219021 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.213066 2574 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 17 17:07:42.219021 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.213068 2574 flags.go:64] FLAG: --system-cgroups=""
Apr 17 17:07:42.219021 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.213071 2574 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 17 17:07:42.219021 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.213077 2574 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 17 17:07:42.219021 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.213079 2574 flags.go:64] FLAG: --tls-cert-file=""
Apr 17 17:07:42.219021 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.213082 2574 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 17 17:07:42.219021 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.213088 2574 flags.go:64] FLAG: --tls-min-version=""
Apr 17 17:07:42.219021 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.213091 2574 flags.go:64] FLAG: --tls-private-key-file=""
Apr 17 17:07:42.219021 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.213094 2574 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 17 17:07:42.219021 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.213096 2574 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 17 17:07:42.219021 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.213099 2574 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 17 17:07:42.219021 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.213102 2574 flags.go:64] FLAG: --v="2"
Apr 17 17:07:42.219021 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.213107 2574 flags.go:64] FLAG: --version="false"
Apr 17 17:07:42.219021 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.213110 2574 flags.go:64] FLAG: --vmodule=""
Apr 17 17:07:42.219021 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.213115 2574 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 17 17:07:42.219021 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.213118 2574 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 17 17:07:42.219021 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213234 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:07:42.219021 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213238 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:07:42.219021 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213241 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:07:42.219021 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213244 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:07:42.219021 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213247 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:07:42.219574 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213250 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:07:42.219574 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213252 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:07:42.219574 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213254 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:07:42.219574 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213257 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:07:42.219574 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213260 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:07:42.219574 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213262 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:07:42.219574 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213265 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:07:42.219574 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213267 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:07:42.219574 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213270 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:07:42.219574 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213277 2574 feature_gate.go:328]
unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:07:42.219574 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213280 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:07:42.219574 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213290 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:07:42.219574 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213293 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:07:42.219574 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213296 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:07:42.219574 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213298 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:07:42.219574 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213301 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:07:42.219574 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213304 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:07:42.219574 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213306 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:07:42.219574 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213308 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:07:42.220073 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213311 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:07:42.220073 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213313 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:07:42.220073 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213316 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:07:42.220073 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213318 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:07:42.220073 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213320 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:07:42.220073 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213323 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:07:42.220073 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213325 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:07:42.220073 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213328 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:07:42.220073 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213330 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:07:42.220073 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213333 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:07:42.220073 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213335 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:07:42.220073 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213338 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:07:42.220073 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213340 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:07:42.220073 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213343 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:07:42.220073 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213345 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:07:42.220073 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213347 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:07:42.220073 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213350 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:07:42.220073 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213352 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:07:42.220073 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213355 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:07:42.220073 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213357 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:07:42.220553 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213359 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:07:42.220553 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213362 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:07:42.220553 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213367 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:07:42.220553 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213369 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:07:42.220553 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213372 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:07:42.220553 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213379 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:07:42.220553 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213382 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:07:42.220553 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213384 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:07:42.220553 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213387 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr
17 17:07:42.220553 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213390 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:07:42.220553 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213392 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:07:42.220553 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213395 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:07:42.220553 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213397 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:07:42.220553 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213399 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:07:42.220553 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213402 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:07:42.220553 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213405 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:07:42.220553 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213407 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:07:42.220553 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213410 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:07:42.220553 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213413 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:07:42.220553 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213415 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:07:42.221095 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213418 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:07:42.221095 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213420 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:07:42.221095 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213423 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:07:42.221095 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213425 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:07:42.221095 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213428 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:07:42.221095 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213430 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:07:42.221095 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213432 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:07:42.221095 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213435 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:07:42.221095 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213437 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:07:42.221095 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213440 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:07:42.221095 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213442 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:07:42.221095 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213445 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:07:42.221095 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213447 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:07:42.221095 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213449 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:07:42.221095 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213454 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:07:42.221095 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213457 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:07:42.221095 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213461 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:07:42.221095 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213464 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:07:42.221095 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213472 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:07:42.221095 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213475 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:07:42.221572 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213478 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:07:42.221572 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.213482 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:07:42.221572 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.214064 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:07:42.221669 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.221566 2574 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 17 17:07:42.221669 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.221582 2574 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 17 17:07:42.221669 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221644 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:07:42.221669 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221650 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:07:42.221669 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221654 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:07:42.221669 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221658 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:07:42.221669 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221661 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:07:42.221669 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221663 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:07:42.221669 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221666 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:07:42.221669 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221668 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:07:42.221669 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221671 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:07:42.221669 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221674 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:07:42.221669 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221676 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:07:42.221982 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221679 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:07:42.221982 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221682 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:07:42.221982 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221684 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:07:42.221982 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221687 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:07:42.221982 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221690 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:07:42.221982 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221692 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:07:42.221982 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221695 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:07:42.221982 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221697 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:07:42.221982 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221701 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:07:42.221982 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221703 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:07:42.221982 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221706 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:07:42.221982 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221708 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:07:42.221982 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221711 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:07:42.221982 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221713 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:07:42.221982 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221716 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:07:42.221982 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221718 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:07:42.221982 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221721 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:07:42.221982 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221723 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:07:42.221982 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221725 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:07:42.221982 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221728 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:07:42.222450 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221731 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:07:42.222450 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221733 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:07:42.222450 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221736 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:07:42.222450 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221739 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:07:42.222450 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221741 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:07:42.222450 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221744 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:07:42.222450 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221746 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:07:42.222450 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221749 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:07:42.222450 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221751 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:07:42.222450 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221753 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:07:42.222450 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221756 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:07:42.222450 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221758 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:07:42.222450 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221761 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:07:42.222450 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221764 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:07:42.222450 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221766 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:07:42.222450 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221769 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:07:42.222450 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221771 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:07:42.222450 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221774 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:07:42.222450 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221776 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:07:42.222450 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221779 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:07:42.222938 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221781 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:07:42.222938 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221784 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:07:42.222938 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221786 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:07:42.222938 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221789 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:07:42.222938 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221792 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:07:42.222938 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221794 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:07:42.222938 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221796 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:07:42.222938 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221799 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:07:42.222938 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221801 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:07:42.222938 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221804 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:07:42.222938 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221806 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:07:42.222938 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221809 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:07:42.222938 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221813 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:07:42.222938 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221818 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:07:42.222938 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221821 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:07:42.222938 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221824 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:07:42.222938 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221828 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:07:42.222938 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221832 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:07:42.222938 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221835 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:07:42.223463 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221837 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:07:42.223463 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221840 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:07:42.223463 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221843 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:07:42.223463 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221845 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:07:42.223463 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221847 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:07:42.223463 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221850 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:07:42.223463 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221853 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:07:42.223463 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221856 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:07:42.223463 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221859 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:07:42.223463 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221861 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:07:42.223463 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221864 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:07:42.223463 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221866 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:07:42.223463 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221869 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:07:42.223463 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221872 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:07:42.223463 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221874 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:07:42.223463 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221877 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:07:42.223889 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.221882 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:07:42.223889 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221986 2574 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:07:42.223889 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221991 2574 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:07:42.223889 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221994 2574 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:07:42.223889 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.221997 2574 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:07:42.223889 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222000 2574 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:07:42.223889 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222003 2574 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:07:42.223889 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222006 2574 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:07:42.223889 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222008 2574 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:07:42.223889 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222011 2574 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:07:42.223889 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222013 2574 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:07:42.223889 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222017 2574 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:07:42.223889 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222019 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:07:42.223889 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222022 2574 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:07:42.223889 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222024 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:07:42.224258 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222027 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:07:42.224258 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222029 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:07:42.224258 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222031 2574 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:07:42.224258 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222034 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:07:42.224258 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222036 2574 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:07:42.224258 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222038 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:07:42.224258 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222041 2574 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:07:42.224258 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222044 2574 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:07:42.224258 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222047 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:07:42.224258 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222050 2574 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:07:42.224258 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222053 2574 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:07:42.224258 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222055 2574 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:07:42.224258 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222057 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:07:42.224258 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222060 2574 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:07:42.224258 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222063 2574 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:07:42.224258 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222066 2574 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:07:42.224258 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222068 2574 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:07:42.224258 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222072 2574 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:07:42.224258 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222075 2574 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:07:42.224783 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222077 2574 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:07:42.224783 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222080 2574 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:07:42.224783 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222083 2574 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:07:42.224783 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222086 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:07:42.224783 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222088 2574 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:07:42.224783 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222091 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:07:42.224783 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222093 2574 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:07:42.224783 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222095 2574 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:07:42.224783 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222098 2574 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:07:42.224783 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222100 2574 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:07:42.224783 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222103 2574 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:07:42.224783 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222105 2574 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:07:42.224783 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222108 2574 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:07:42.224783 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222110 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:07:42.224783 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222113 2574 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:07:42.224783 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222115 2574 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:07:42.224783 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222118 2574 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:07:42.224783 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222120 2574 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:07:42.224783 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222123 2574 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:07:42.224783 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222125 2574 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:07:42.225252 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222128 2574 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:07:42.225252 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222131 2574 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:07:42.225252 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222134 2574 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:07:42.225252 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222136 2574 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:07:42.225252 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222138 2574 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:07:42.225252 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222141 2574 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:07:42.225252 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222143 2574 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:07:42.225252 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222146 2574 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:07:42.225252 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222148 2574 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:07:42.225252 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222150 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:07:42.225252 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222153 2574 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:07:42.225252 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222155 2574 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:07:42.225252 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222157 2574 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:07:42.225252 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222160 2574 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:07:42.225252 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222162 2574 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:07:42.225252 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222165 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:07:42.225252 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222167 2574 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:07:42.225252 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222170 2574 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:07:42.225252 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222173 2574 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:07:42.225252 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222175 2574 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:07:42.225727 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222177 2574 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:07:42.225727 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222180 2574 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:07:42.225727 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222182 2574 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:07:42.225727 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222185 2574 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:07:42.225727 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222188 2574 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:07:42.225727 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222190 2574 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:07:42.225727 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222192 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:07:42.225727 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222195 2574 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:07:42.225727 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222198 2574 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:07:42.225727 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222201 2574 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:07:42.225727 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222204 2574 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:07:42.225727 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222207 2574 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:07:42.225727 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:42.222210 2574 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:07:42.225727 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.222215 2574 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 17 17:07:42.225727 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.223063 2574 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 17 17:07:42.226092 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.225180 2574 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 17 17:07:42.226150 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.226137 2574 server.go:1019] "Starting client certificate rotation"
Apr 17 17:07:42.226252 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.226238 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 17:07:42.226280 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.226272 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 17 17:07:42.252215 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.252192 2574 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 17:07:42.254345 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.254330 2574 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 17 17:07:42.272924 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.272900 2574 log.go:25] "Validated CRI v1 runtime API"
Apr 17 17:07:42.278737 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.278720 2574 log.go:25] "Validated CRI v1 image API"
Apr 17 17:07:42.279376 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.279359 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 17:07:42.280019 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.280003 2574 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 17 17:07:42.282113 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.282088 2574 fs.go:135] Filesystem UUIDs: map[05363687-04ac-42b1-aee0-772ebcb30dbb:/dev/nvme0n1p3 2fb18963-5932-4e06-b2ba-0bcd981f8f92:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 17 17:07:42.282176 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.282113 2574 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 17 17:07:42.287071 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.286964 2574 manager.go:217] Machine: {Timestamp:2026-04-17 17:07:42.285810041 +0000 UTC m=+0.358397225 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3199920 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec241ce4306c87868fd586aa53464ea1 SystemUUID:ec241ce4-306c-8786-8fd5-86aa53464ea1 BootID:9cd89807-7a31-4f25-a5d7-e8658f8e9821 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:25:4a:a2:84:43 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:25:4a:a2:84:43 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:be:87:e4:5b:c9:a2 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 17 17:07:42.287071 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.287068 2574 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 17 17:07:42.287189 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.287179 2574 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 17:07:42.288199 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.288177 2574 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 17:07:42.288337 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.288201 2574 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-98.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 17:07:42.288378 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.288346 2574 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 17:07:42.288378 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.288354 2574 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 17:07:42.288378 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.288367 2574 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 17:07:42.289019 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.289009 2574 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 17:07:42.290285 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.290276 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 17 17:07:42.290554 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.290545 2574 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 17:07:42.292942 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.292933 2574 kubelet.go:491] "Attempting to sync node with API server" Apr 17 17:07:42.292979 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.292950 2574 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 17:07:42.292979 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.292961 2574 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 17:07:42.292979 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.292970 2574 kubelet.go:397] "Adding apiserver pod source" Apr 17 17:07:42.292979 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.292979 2574 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 17 17:07:42.293951 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.293939 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 17:07:42.293999 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.293957 2574 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 17:07:42.296686 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.296668 2574 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 17:07:42.298996 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.298981 2574 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 17:07:42.300286 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.300272 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 17:07:42.300377 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.300293 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 17:07:42.300377 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.300302 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 17:07:42.300377 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.300307 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 17:07:42.300377 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.300313 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 17:07:42.300377 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.300319 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 17:07:42.300377 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.300325 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 
17:07:42.300377 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.300330 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 17:07:42.300377 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.300336 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 17:07:42.300377 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.300342 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 17:07:42.300377 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.300359 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 17:07:42.300377 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.300368 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 17:07:42.301101 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.301083 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 17:07:42.301101 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.301097 2574 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 17:07:42.304745 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.304732 2574 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 17:07:42.304823 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.304772 2574 server.go:1295] "Started kubelet" Apr 17 17:07:42.304900 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.304874 2574 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 17:07:42.304935 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.304875 2574 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 17:07:42.304966 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.304946 2574 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 17:07:42.305646 ip-10-0-132-98 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 17:07:42.306871 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.306699 2574 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 17:07:42.307345 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.307329 2574 server.go:317] "Adding debug handlers to kubelet server" Apr 17 17:07:42.309755 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:42.309711 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-98.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 17:07:42.309887 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.309799 2574 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-98.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 17:07:42.310078 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:42.310053 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 17:07:42.311836 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.311802 2574 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 17:07:42.312260 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.312243 2574 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 17:07:42.312903 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.312872 2574 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 17:07:42.312903 ip-10-0-132-98 
kubenswrapper[2574]: I0417 17:07:42.312873 2574 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 17:07:42.312903 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.312904 2574 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 17:07:42.313089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.313046 2574 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 17:07:42.313089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.313061 2574 factory.go:55] Registering systemd factory Apr 17 17:07:42.313089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.313069 2574 reconstruct.go:97] "Volume reconstruction finished" Apr 17 17:07:42.313089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.313079 2574 reconciler.go:26] "Reconciler: start to sync state" Apr 17 17:07:42.313089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.313070 2574 factory.go:223] Registration of the systemd container factory successfully Apr 17 17:07:42.313311 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:42.313232 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-98.ec2.internal\" not found" Apr 17 17:07:42.313311 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.313283 2574 factory.go:153] Registering CRI-O factory Apr 17 17:07:42.313311 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.313296 2574 factory.go:223] Registration of the crio container factory successfully Apr 17 17:07:42.313438 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.313317 2574 factory.go:103] Registering Raw factory Apr 17 17:07:42.313438 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.313331 2574 manager.go:1196] Started watching for new ooms in manager Apr 17 17:07:42.313714 ip-10-0-132-98 kubenswrapper[2574]: I0417 
17:07:42.313701 2574 manager.go:319] Starting recovery of all containers Apr 17 17:07:42.314725 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:42.314705 2574 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 17:07:42.318212 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:42.318098 2574 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-132-98.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 17 17:07:42.318212 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:42.318170 2574 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 17 17:07:42.319206 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:42.318256 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-98.ec2.internal.18a733eaf875318e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-98.ec2.internal,UID:ip-10-0-132-98.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-132-98.ec2.internal,},FirstTimestamp:2026-04-17 17:07:42.304743822 +0000 UTC m=+0.377331006,LastTimestamp:2026-04-17 17:07:42.304743822 +0000 UTC m=+0.377331006,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-98.ec2.internal,}" Apr 17 17:07:42.321697 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.321679 2574 manager.go:324] Recovery completed Apr 17 17:07:42.324257 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.324232 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8pjpv" Apr 17 17:07:42.327030 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.326892 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:07:42.329446 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.329432 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-98.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:07:42.329526 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.329464 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-98.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:07:42.329526 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.329479 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-98.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:07:42.329971 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.329957 2574 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 17:07:42.330003 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.329973 2574 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 17:07:42.330003 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.329989 2574 state_mem.go:36] "Initialized new in-memory state store" Apr 17 17:07:42.331624 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:42.331548 2574 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" 
event="&Event{ObjectMeta:{ip-10-0-132-98.ec2.internal.18a733eaf9ee202a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-98.ec2.internal,UID:ip-10-0-132-98.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-132-98.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-132-98.ec2.internal,},FirstTimestamp:2026-04-17 17:07:42.329446442 +0000 UTC m=+0.402033630,LastTimestamp:2026-04-17 17:07:42.329446442 +0000 UTC m=+0.402033630,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-98.ec2.internal,}" Apr 17 17:07:42.333365 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.333351 2574 policy_none.go:49] "None policy: Start" Apr 17 17:07:42.333417 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.333372 2574 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 17:07:42.333417 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.333387 2574 state_mem.go:35] "Initializing new in-memory state store" Apr 17 17:07:42.333479 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.333466 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8pjpv" Apr 17 17:07:42.381182 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.381164 2574 manager.go:341] "Starting Device Plugin manager" Apr 17 17:07:42.407410 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:42.381233 2574 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 17:07:42.407410 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.381247 2574 server.go:85] "Starting device plugin registration server" Apr 17 17:07:42.407410 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.381491 2574 
eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 17:07:42.407410 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.381501 2574 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 17:07:42.407410 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.381629 2574 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 17:07:42.407410 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.381712 2574 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 17:07:42.407410 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.381722 2574 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 17:07:42.407410 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:42.382164 2574 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 17:07:42.407410 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:42.382196 2574 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-98.ec2.internal\" not found" Apr 17 17:07:42.430020 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.429993 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 17:07:42.431278 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.431259 2574 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 17:07:42.431357 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.431282 2574 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 17:07:42.431357 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.431303 2574 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 17:07:42.431357 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.431312 2574 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 17:07:42.431357 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:42.431341 2574 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 17:07:42.434488 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.434465 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:07:42.482618 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.482535 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:07:42.483703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.483686 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-98.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:07:42.483776 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.483717 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-98.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:07:42.483776 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.483727 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-98.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:07:42.483776 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.483753 2574 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-98.ec2.internal" Apr 17 17:07:42.491654 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.491639 2574 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-98.ec2.internal" Apr 17 17:07:42.491695 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:42.491662 2574 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-98.ec2.internal\": node \"ip-10-0-132-98.ec2.internal\" not found" Apr 17 17:07:42.502394 
ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:42.502379 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-98.ec2.internal\" not found" Apr 17 17:07:42.532125 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.532101 2574 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-98.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-98.ec2.internal"] Apr 17 17:07:42.532200 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.532188 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:07:42.533673 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.533656 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-98.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:07:42.533738 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.533684 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-98.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:07:42.533738 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.533695 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-98.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:07:42.535850 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.535838 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:07:42.535996 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.535981 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-98.ec2.internal" Apr 17 17:07:42.536037 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.536011 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:07:42.536944 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.536929 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-98.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:07:42.537024 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.536948 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-98.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:07:42.537024 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.536957 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-98.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:07:42.537024 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.536971 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-98.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:07:42.537024 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.536982 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-98.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:07:42.537149 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.536972 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-98.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:07:42.539250 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.539236 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-98.ec2.internal" Apr 17 17:07:42.539315 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.539259 2574 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:07:42.539925 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.539915 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-98.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:07:42.540006 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.539935 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-98.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:07:42.540006 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.539944 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-98.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:07:42.563163 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:42.563146 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-98.ec2.internal\" not found" node="ip-10-0-132-98.ec2.internal" Apr 17 17:07:42.567531 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:42.567513 2574 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-98.ec2.internal\" not found" node="ip-10-0-132-98.ec2.internal" Apr 17 17:07:42.603002 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:42.602983 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-98.ec2.internal\" not found" Apr 17 17:07:42.615381 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.615362 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed5ab418bd0178840310111ee0f0df93-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-ip-10-0-132-98.ec2.internal\" (UID: \"ed5ab418bd0178840310111ee0f0df93\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-98.ec2.internal" Apr 17 17:07:42.615432 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.615390 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1953ee34347c78a8ab12e5fc6254beb6-config\") pod \"kube-apiserver-proxy-ip-10-0-132-98.ec2.internal\" (UID: \"1953ee34347c78a8ab12e5fc6254beb6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-98.ec2.internal" Apr 17 17:07:42.615432 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.615414 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ed5ab418bd0178840310111ee0f0df93-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-98.ec2.internal\" (UID: \"ed5ab418bd0178840310111ee0f0df93\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-98.ec2.internal" Apr 17 17:07:42.704039 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:42.704012 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-98.ec2.internal\" not found" Apr 17 17:07:42.716139 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.716117 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed5ab418bd0178840310111ee0f0df93-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-98.ec2.internal\" (UID: \"ed5ab418bd0178840310111ee0f0df93\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-98.ec2.internal" Apr 17 17:07:42.716201 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.716148 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/host-path/1953ee34347c78a8ab12e5fc6254beb6-config\") pod \"kube-apiserver-proxy-ip-10-0-132-98.ec2.internal\" (UID: \"1953ee34347c78a8ab12e5fc6254beb6\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-98.ec2.internal" Apr 17 17:07:42.716201 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.716164 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ed5ab418bd0178840310111ee0f0df93-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-98.ec2.internal\" (UID: \"ed5ab418bd0178840310111ee0f0df93\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-98.ec2.internal" Apr 17 17:07:42.716294 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.716205 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/ed5ab418bd0178840310111ee0f0df93-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-98.ec2.internal\" (UID: \"ed5ab418bd0178840310111ee0f0df93\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-98.ec2.internal" Apr 17 17:07:42.716294 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.716216 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ed5ab418bd0178840310111ee0f0df93-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-98.ec2.internal\" (UID: \"ed5ab418bd0178840310111ee0f0df93\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-98.ec2.internal" Apr 17 17:07:42.716294 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.716244 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/1953ee34347c78a8ab12e5fc6254beb6-config\") pod \"kube-apiserver-proxy-ip-10-0-132-98.ec2.internal\" (UID: \"1953ee34347c78a8ab12e5fc6254beb6\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-132-98.ec2.internal" Apr 17 17:07:42.804328 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:42.804238 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-98.ec2.internal\" not found" Apr 17 17:07:42.865737 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.865704 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-98.ec2.internal" Apr 17 17:07:42.870244 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:42.870223 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-98.ec2.internal" Apr 17 17:07:42.904995 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:42.904964 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-98.ec2.internal\" not found" Apr 17 17:07:43.005484 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:43.005459 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-98.ec2.internal\" not found" Apr 17 17:07:43.106099 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:43.106022 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-98.ec2.internal\" not found" Apr 17 17:07:43.206534 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:43.206499 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-98.ec2.internal\" not found" Apr 17 17:07:43.227064 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:43.227040 2574 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 17 17:07:43.227582 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:43.227177 2574 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" 
type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 17 17:07:43.306597 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:43.306565 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-98.ec2.internal\" not found" Apr 17 17:07:43.312546 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:43.312524 2574 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 17 17:07:43.330166 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:43.330137 2574 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 17:07:43.335712 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:43.335663 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 17:02:42 +0000 UTC" deadline="2027-10-14 09:40:01.367794055 +0000 UTC" Apr 17 17:07:43.335790 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:43.335714 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13072h32m18.032086476s" Apr 17 17:07:43.346368 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:43.346347 2574 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:07:43.375364 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:43.375330 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1953ee34347c78a8ab12e5fc6254beb6.slice/crio-c2162218a0fc73f8521f95853f4e375ba9a81d10fb636711360fb738ba4d9f52 WatchSource:0}: Error finding container c2162218a0fc73f8521f95853f4e375ba9a81d10fb636711360fb738ba4d9f52: Status 
404 returned error can't find the container with id c2162218a0fc73f8521f95853f4e375ba9a81d10fb636711360fb738ba4d9f52 Apr 17 17:07:43.375587 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:43.375567 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded5ab418bd0178840310111ee0f0df93.slice/crio-b5c0303344d033222837254a1ccfcb7021e008e63f45514249e0625f81c95d87 WatchSource:0}: Error finding container b5c0303344d033222837254a1ccfcb7021e008e63f45514249e0625f81c95d87: Status 404 returned error can't find the container with id b5c0303344d033222837254a1ccfcb7021e008e63f45514249e0625f81c95d87 Apr 17 17:07:43.380338 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:43.380324 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:07:43.383438 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:43.383422 2574 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-gkxdl" Apr 17 17:07:43.390282 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:43.390266 2574 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-gkxdl" Apr 17 17:07:43.406980 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:43.406953 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-98.ec2.internal\" not found" Apr 17 17:07:43.433985 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:43.433943 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-98.ec2.internal" event={"ID":"ed5ab418bd0178840310111ee0f0df93","Type":"ContainerStarted","Data":"b5c0303344d033222837254a1ccfcb7021e008e63f45514249e0625f81c95d87"} Apr 17 17:07:43.434806 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:43.434782 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-132-98.ec2.internal" event={"ID":"1953ee34347c78a8ab12e5fc6254beb6","Type":"ContainerStarted","Data":"c2162218a0fc73f8521f95853f4e375ba9a81d10fb636711360fb738ba4d9f52"} Apr 17 17:07:43.508021 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:43.507994 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-98.ec2.internal\" not found" Apr 17 17:07:43.608469 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:43.608444 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-98.ec2.internal\" not found" Apr 17 17:07:43.708965 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:43.708928 2574 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-98.ec2.internal\" not found" Apr 17 17:07:43.771939 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:43.771910 2574 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:07:43.795529 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:43.795502 2574 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:07:43.813462 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:43.813436 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-98.ec2.internal" Apr 17 17:07:43.825396 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:43.825369 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 17:07:43.826348 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:43.826324 2574 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-98.ec2.internal" Apr 17 17:07:43.837657 ip-10-0-132-98 kubenswrapper[2574]: 
I0417 17:07:43.837631 2574 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 17 17:07:44.294369 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.294336 2574 apiserver.go:52] "Watching apiserver" Apr 17 17:07:44.302308 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.302284 2574 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 17 17:07:44.304010 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.303977 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-4flxx","openshift-dns/node-resolver-52mt7","openshift-image-registry/node-ca-2f5xx","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-98.ec2.internal","openshift-multus/multus-additional-cni-plugins-9c486","kube-system/konnectivity-agent-hrljp","openshift-multus/multus-bhrfj","openshift-multus/network-metrics-daemon-8zxs7","openshift-network-diagnostics/network-check-target-55bgp","openshift-network-operator/iptables-alerter-qkmhj","openshift-ovn-kubernetes/ovnkube-node-vbndw","kube-system/kube-apiserver-proxy-ip-10-0-132-98.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl"] Apr 17 17:07:44.306654 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.306631 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qkmhj" Apr 17 17:07:44.308796 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.308777 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-52mt7" Apr 17 17:07:44.309288 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.309273 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 17 17:07:44.309711 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.309472 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:07:44.309711 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.309520 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 17 17:07:44.309711 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.309475 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-s2gp7\"" Apr 17 17:07:44.311155 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.311133 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 17 17:07:44.311251 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.311208 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 17 17:07:44.311251 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.311213 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-mnfwx\"" Apr 17 17:07:44.311890 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.311862 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2f5xx" Apr 17 17:07:44.314262 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.314192 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9c486" Apr 17 17:07:44.314389 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.314368 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 17 17:07:44.314599 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.314581 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-2thgw\"" Apr 17 17:07:44.314709 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.314659 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 17 17:07:44.314967 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.314952 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 17 17:07:44.316497 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.316481 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 17 17:07:44.316813 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.316797 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-hrljp" Apr 17 17:07:44.317001 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.316982 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 17 17:07:44.317254 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.317241 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 17 17:07:44.317418 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.317398 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 17 17:07:44.317418 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.317418 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 17 17:07:44.317566 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.317552 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-2br7m\"" Apr 17 17:07:44.319119 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.319098 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.319477 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.319386 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-xdltp\"" Apr 17 17:07:44.319477 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.319403 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 17 17:07:44.320160 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.320144 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 17 17:07:44.321367 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.321346 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 17 17:07:44.321367 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.321360 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-9qw6x\"" Apr 17 17:07:44.321508 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.321444 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zxs7" Apr 17 17:07:44.321560 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:44.321515 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8zxs7" podUID="1dd8584b-d217-441a-a0d1-e1b86328dfe2" Apr 17 17:07:44.323600 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.323578 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-55bgp" Apr 17 17:07:44.323704 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:44.323665 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-55bgp" podUID="94d54b6e-1e0b-4ee4-b498-c5e45fb87940" Apr 17 17:07:44.325861 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.325840 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/36e5ce0d-a66e-4ad4-bf76-a0b0c1e1890b-serviceca\") pod \"node-ca-2f5xx\" (UID: \"36e5ce0d-a66e-4ad4-bf76-a0b0c1e1890b\") " pod="openshift-image-registry/node-ca-2f5xx" Apr 17 17:07:44.325975 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.325873 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aed051f5-a966-45e4-867e-3841c1814af1-cnibin\") pod \"multus-additional-cni-plugins-9c486\" (UID: \"aed051f5-a966-45e4-867e-3841c1814af1\") " pod="openshift-multus/multus-additional-cni-plugins-9c486" Apr 17 17:07:44.325975 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.325888 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aed051f5-a966-45e4-867e-3841c1814af1-os-release\") pod \"multus-additional-cni-plugins-9c486\" (UID: \"aed051f5-a966-45e4-867e-3841c1814af1\") " pod="openshift-multus/multus-additional-cni-plugins-9c486" Apr 17 17:07:44.325975 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.325914 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aed051f5-a966-45e4-867e-3841c1814af1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9c486\" (UID: \"aed051f5-a966-45e4-867e-3841c1814af1\") " pod="openshift-multus/multus-additional-cni-plugins-9c486" Apr 17 17:07:44.325975 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.325939 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/10abc3c7-cda3-4034-9da8-0c4935e396af-iptables-alerter-script\") pod \"iptables-alerter-qkmhj\" (UID: \"10abc3c7-cda3-4034-9da8-0c4935e396af\") " pod="openshift-network-operator/iptables-alerter-qkmhj" Apr 17 17:07:44.325975 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.325942 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4flxx" Apr 17 17:07:44.325975 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.325961 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/10abc3c7-cda3-4034-9da8-0c4935e396af-host-slash\") pod \"iptables-alerter-qkmhj\" (UID: \"10abc3c7-cda3-4034-9da8-0c4935e396af\") " pod="openshift-network-operator/iptables-alerter-qkmhj" Apr 17 17:07:44.326260 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.326007 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5fpk\" (UniqueName: \"kubernetes.io/projected/10abc3c7-cda3-4034-9da8-0c4935e396af-kube-api-access-d5fpk\") pod \"iptables-alerter-qkmhj\" (UID: \"10abc3c7-cda3-4034-9da8-0c4935e396af\") " pod="openshift-network-operator/iptables-alerter-qkmhj" Apr 17 17:07:44.326260 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.326058 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnjx9\" (UniqueName: \"kubernetes.io/projected/36e5ce0d-a66e-4ad4-bf76-a0b0c1e1890b-kube-api-access-vnjx9\") pod \"node-ca-2f5xx\" (UID: \"36e5ce0d-a66e-4ad4-bf76-a0b0c1e1890b\") " pod="openshift-image-registry/node-ca-2f5xx" Apr 17 17:07:44.326260 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.326099 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aed051f5-a966-45e4-867e-3841c1814af1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9c486\" (UID: \"aed051f5-a966-45e4-867e-3841c1814af1\") " pod="openshift-multus/multus-additional-cni-plugins-9c486" Apr 17 17:07:44.326260 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.326136 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/36e5ce0d-a66e-4ad4-bf76-a0b0c1e1890b-host\") pod \"node-ca-2f5xx\" (UID: \"36e5ce0d-a66e-4ad4-bf76-a0b0c1e1890b\") " pod="openshift-image-registry/node-ca-2f5xx" Apr 17 17:07:44.326260 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.326169 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aed051f5-a966-45e4-867e-3841c1814af1-cni-binary-copy\") pod \"multus-additional-cni-plugins-9c486\" (UID: \"aed051f5-a966-45e4-867e-3841c1814af1\") " pod="openshift-multus/multus-additional-cni-plugins-9c486" Apr 17 17:07:44.326260 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.326211 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b5a9dd27-c914-41cc-88fc-5a64c1169c04-hosts-file\") pod \"node-resolver-52mt7\" (UID: \"b5a9dd27-c914-41cc-88fc-5a64c1169c04\") " 
pod="openshift-dns/node-resolver-52mt7" Apr 17 17:07:44.326260 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.326251 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b5a9dd27-c914-41cc-88fc-5a64c1169c04-tmp-dir\") pod \"node-resolver-52mt7\" (UID: \"b5a9dd27-c914-41cc-88fc-5a64c1169c04\") " pod="openshift-dns/node-resolver-52mt7" Apr 17 17:07:44.326526 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.326277 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr6gj\" (UniqueName: \"kubernetes.io/projected/b5a9dd27-c914-41cc-88fc-5a64c1169c04-kube-api-access-xr6gj\") pod \"node-resolver-52mt7\" (UID: \"b5a9dd27-c914-41cc-88fc-5a64c1169c04\") " pod="openshift-dns/node-resolver-52mt7" Apr 17 17:07:44.326526 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.326318 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aed051f5-a966-45e4-867e-3841c1814af1-system-cni-dir\") pod \"multus-additional-cni-plugins-9c486\" (UID: \"aed051f5-a966-45e4-867e-3841c1814af1\") " pod="openshift-multus/multus-additional-cni-plugins-9c486" Apr 17 17:07:44.326526 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.326372 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/aed051f5-a966-45e4-867e-3841c1814af1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9c486\" (UID: \"aed051f5-a966-45e4-867e-3841c1814af1\") " pod="openshift-multus/multus-additional-cni-plugins-9c486" Apr 17 17:07:44.326526 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.326441 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpjnc\" 
(UniqueName: \"kubernetes.io/projected/aed051f5-a966-45e4-867e-3841c1814af1-kube-api-access-fpjnc\") pod \"multus-additional-cni-plugins-9c486\" (UID: \"aed051f5-a966-45e4-867e-3841c1814af1\") " pod="openshift-multus/multus-additional-cni-plugins-9c486" Apr 17 17:07:44.328826 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.328211 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:07:44.328826 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.328263 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.329012 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.328990 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-kcdpt\"" Apr 17 17:07:44.329083 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.329047 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 17 17:07:44.330489 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.330470 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl" Apr 17 17:07:44.332549 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.332365 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 17 17:07:44.332698 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.332621 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 17 17:07:44.332940 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.332921 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 17 17:07:44.334052 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.334033 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 17 17:07:44.334178 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.334161 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 17 17:07:44.334382 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.334366 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 17 17:07:44.334791 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.334538 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 17 17:07:44.334896 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.334567 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-kln2x\"" Apr 17 17:07:44.334896 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.334580 2574 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 17 17:07:44.334896 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.334692 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-vzwnp\"" Apr 17 17:07:44.334896 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.334692 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 17 17:07:44.391049 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.391017 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:02:43 +0000 UTC" deadline="2027-09-24 01:36:39.371003462 +0000 UTC" Apr 17 17:07:44.391049 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.391047 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12584h28m54.979958411s" Apr 17 17:07:44.414217 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.414192 2574 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 17 17:07:44.421379 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.421353 2574 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:07:44.427482 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.427462 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-multus-socket-dir-parent\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.427589 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.427494 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ac8a025c-d2a3-45bf-abc9-c3470d14e41a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gh4nl\" (UID: \"ac8a025c-d2a3-45bf-abc9-c3470d14e41a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl" Apr 17 17:07:44.427589 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.427517 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ac8a025c-d2a3-45bf-abc9-c3470d14e41a-etc-selinux\") pod \"aws-ebs-csi-driver-node-gh4nl\" (UID: \"ac8a025c-d2a3-45bf-abc9-c3470d14e41a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl" Apr 17 17:07:44.427589 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.427582 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aed051f5-a966-45e4-867e-3841c1814af1-cnibin\") pod \"multus-additional-cni-plugins-9c486\" (UID: \"aed051f5-a966-45e4-867e-3841c1814af1\") " pod="openshift-multus/multus-additional-cni-plugins-9c486" Apr 17 17:07:44.427763 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.427622 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aed051f5-a966-45e4-867e-3841c1814af1-os-release\") pod \"multus-additional-cni-plugins-9c486\" (UID: \"aed051f5-a966-45e4-867e-3841c1814af1\") " pod="openshift-multus/multus-additional-cni-plugins-9c486" Apr 17 17:07:44.427763 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.427649 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aed051f5-a966-45e4-867e-3841c1814af1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9c486\" (UID: \"aed051f5-a966-45e4-867e-3841c1814af1\") " 
pod="openshift-multus/multus-additional-cni-plugins-9c486" Apr 17 17:07:44.427763 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.427661 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/aed051f5-a966-45e4-867e-3841c1814af1-cnibin\") pod \"multus-additional-cni-plugins-9c486\" (UID: \"aed051f5-a966-45e4-867e-3841c1814af1\") " pod="openshift-multus/multus-additional-cni-plugins-9c486" Apr 17 17:07:44.427763 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.427678 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3be1e497-a7c7-4b5d-be1e-1fd8df6acc62-agent-certs\") pod \"konnectivity-agent-hrljp\" (UID: \"3be1e497-a7c7-4b5d-be1e-1fd8df6acc62\") " pod="kube-system/konnectivity-agent-hrljp" Apr 17 17:07:44.427763 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.427704 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-var-lib-kubelet\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx" Apr 17 17:07:44.427763 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.427728 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-run-openvswitch\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.427763 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.427740 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/aed051f5-a966-45e4-867e-3841c1814af1-os-release\") pod 
\"multus-additional-cni-plugins-9c486\" (UID: \"aed051f5-a966-45e4-867e-3841c1814af1\") " pod="openshift-multus/multus-additional-cni-plugins-9c486" Apr 17 17:07:44.427763 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.427752 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.428100 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.427776 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-host-var-lib-cni-multus\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.428100 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.427800 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-ovnkube-config\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.428100 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.427839 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-ovnkube-script-lib\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.428100 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.427881 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-hostroot\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.428100 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.427921 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ac8a025c-d2a3-45bf-abc9-c3470d14e41a-sys-fs\") pod \"aws-ebs-csi-driver-node-gh4nl\" (UID: \"ac8a025c-d2a3-45bf-abc9-c3470d14e41a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl" Apr 17 17:07:44.428100 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.427954 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-run\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx" Apr 17 17:07:44.428100 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.427978 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-lib-modules\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx" Apr 17 17:07:44.428100 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428003 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-host-run-ovn-kubernetes\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 
17:07:44.428100 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428033 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b5a9dd27-c914-41cc-88fc-5a64c1169c04-hosts-file\") pod \"node-resolver-52mt7\" (UID: \"b5a9dd27-c914-41cc-88fc-5a64c1169c04\") " pod="openshift-dns/node-resolver-52mt7" Apr 17 17:07:44.428100 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428058 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-etc-kubernetes\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx" Apr 17 17:07:44.428100 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428082 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-var-lib-openvswitch\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.428100 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428105 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ac8a025c-d2a3-45bf-abc9-c3470d14e41a-socket-dir\") pod \"aws-ebs-csi-driver-node-gh4nl\" (UID: \"ac8a025c-d2a3-45bf-abc9-c3470d14e41a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl" Apr 17 17:07:44.428665 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428095 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b5a9dd27-c914-41cc-88fc-5a64c1169c04-hosts-file\") pod \"node-resolver-52mt7\" (UID: 
\"b5a9dd27-c914-41cc-88fc-5a64c1169c04\") " pod="openshift-dns/node-resolver-52mt7" Apr 17 17:07:44.428665 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428134 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/10abc3c7-cda3-4034-9da8-0c4935e396af-iptables-alerter-script\") pod \"iptables-alerter-qkmhj\" (UID: \"10abc3c7-cda3-4034-9da8-0c4935e396af\") " pod="openshift-network-operator/iptables-alerter-qkmhj" Apr 17 17:07:44.428665 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428241 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-etc-sysconfig\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx" Apr 17 17:07:44.428665 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428272 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/aed051f5-a966-45e4-867e-3841c1814af1-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-9c486\" (UID: \"aed051f5-a966-45e4-867e-3841c1814af1\") " pod="openshift-multus/multus-additional-cni-plugins-9c486" Apr 17 17:07:44.428665 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428278 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-host\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx" Apr 17 17:07:44.428665 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428322 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-host-run-k8s-cni-cncf-io\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.428665 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428355 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vnjx9\" (UniqueName: \"kubernetes.io/projected/36e5ce0d-a66e-4ad4-bf76-a0b0c1e1890b-kube-api-access-vnjx9\") pod \"node-ca-2f5xx\" (UID: \"36e5ce0d-a66e-4ad4-bf76-a0b0c1e1890b\") " pod="openshift-image-registry/node-ca-2f5xx" Apr 17 17:07:44.428665 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428398 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aed051f5-a966-45e4-867e-3841c1814af1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9c486\" (UID: \"aed051f5-a966-45e4-867e-3841c1814af1\") " pod="openshift-multus/multus-additional-cni-plugins-9c486" Apr 17 17:07:44.428665 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428442 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqcmq\" (UniqueName: \"kubernetes.io/projected/1dd8584b-d217-441a-a0d1-e1b86328dfe2-kube-api-access-rqcmq\") pod \"network-metrics-daemon-8zxs7\" (UID: \"1dd8584b-d217-441a-a0d1-e1b86328dfe2\") " pod="openshift-multus/network-metrics-daemon-8zxs7" Apr 17 17:07:44.428665 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428476 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-host-kubelet\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.428665 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428519 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-systemd-units\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.428665 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428522 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/aed051f5-a966-45e4-867e-3841c1814af1-tuning-conf-dir\") pod \"multus-additional-cni-plugins-9c486\" (UID: \"aed051f5-a966-45e4-867e-3841c1814af1\") " pod="openshift-multus/multus-additional-cni-plugins-9c486" Apr 17 17:07:44.428665 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428545 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f40d551d-7b2b-4e50-afe2-fa8be6462803-multus-daemon-config\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.428665 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428586 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/10abc3c7-cda3-4034-9da8-0c4935e396af-iptables-alerter-script\") pod \"iptables-alerter-qkmhj\" (UID: \"10abc3c7-cda3-4034-9da8-0c4935e396af\") " pod="openshift-network-operator/iptables-alerter-qkmhj" Apr 17 17:07:44.428665 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428572 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aed051f5-a966-45e4-867e-3841c1814af1-cni-binary-copy\") pod \"multus-additional-cni-plugins-9c486\" (UID: \"aed051f5-a966-45e4-867e-3841c1814af1\") " 
pod="openshift-multus/multus-additional-cni-plugins-9c486" Apr 17 17:07:44.428665 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428642 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3be1e497-a7c7-4b5d-be1e-1fd8df6acc62-konnectivity-ca\") pod \"konnectivity-agent-hrljp\" (UID: \"3be1e497-a7c7-4b5d-be1e-1fd8df6acc62\") " pod="kube-system/konnectivity-agent-hrljp" Apr 17 17:07:44.429348 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428682 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-run-ovn\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.429348 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428708 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xr6gj\" (UniqueName: \"kubernetes.io/projected/b5a9dd27-c914-41cc-88fc-5a64c1169c04-kube-api-access-xr6gj\") pod \"node-resolver-52mt7\" (UID: \"b5a9dd27-c914-41cc-88fc-5a64c1169c04\") " pod="openshift-dns/node-resolver-52mt7" Apr 17 17:07:44.429348 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428737 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aed051f5-a966-45e4-867e-3841c1814af1-system-cni-dir\") pod \"multus-additional-cni-plugins-9c486\" (UID: \"aed051f5-a966-45e4-867e-3841c1814af1\") " pod="openshift-multus/multus-additional-cni-plugins-9c486" Apr 17 17:07:44.429348 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428762 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: 
\"kubernetes.io/configmap/aed051f5-a966-45e4-867e-3841c1814af1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9c486\" (UID: \"aed051f5-a966-45e4-867e-3841c1814af1\") " pod="openshift-multus/multus-additional-cni-plugins-9c486" Apr 17 17:07:44.429348 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428794 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/aed051f5-a966-45e4-867e-3841c1814af1-system-cni-dir\") pod \"multus-additional-cni-plugins-9c486\" (UID: \"aed051f5-a966-45e4-867e-3841c1814af1\") " pod="openshift-multus/multus-additional-cni-plugins-9c486" Apr 17 17:07:44.429348 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428786 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-etc-sysctl-conf\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx" Apr 17 17:07:44.429348 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428845 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4gm9\" (UniqueName: \"kubernetes.io/projected/26531bc0-743e-4e55-9c40-fe448eea598c-kube-api-access-z4gm9\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx" Apr 17 17:07:44.429348 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428871 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-multus-cni-dir\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.429348 ip-10-0-132-98 kubenswrapper[2574]: I0417 
17:07:44.428895 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-host-var-lib-cni-bin\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.429348 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428931 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/36e5ce0d-a66e-4ad4-bf76-a0b0c1e1890b-serviceca\") pod \"node-ca-2f5xx\" (UID: \"36e5ce0d-a66e-4ad4-bf76-a0b0c1e1890b\") " pod="openshift-image-registry/node-ca-2f5xx" Apr 17 17:07:44.429348 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428948 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-host-var-lib-kubelet\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.429348 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428962 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-host-run-multus-certs\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.429348 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428976 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-log-socket\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.429348 
ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.428994 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f40d551d-7b2b-4e50-afe2-fa8be6462803-cni-binary-copy\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.429348 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429013 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-etc-kubernetes\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.429348 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429039 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/36e5ce0d-a66e-4ad4-bf76-a0b0c1e1890b-host\") pod \"node-ca-2f5xx\" (UID: \"36e5ce0d-a66e-4ad4-bf76-a0b0c1e1890b\") " pod="openshift-image-registry/node-ca-2f5xx" Apr 17 17:07:44.429348 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429047 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/aed051f5-a966-45e4-867e-3841c1814af1-cni-binary-copy\") pod \"multus-additional-cni-plugins-9c486\" (UID: \"aed051f5-a966-45e4-867e-3841c1814af1\") " pod="openshift-multus/multus-additional-cni-plugins-9c486" Apr 17 17:07:44.430010 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429057 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkm7c\" (UniqueName: \"kubernetes.io/projected/94d54b6e-1e0b-4ee4-b498-c5e45fb87940-kube-api-access-tkm7c\") pod \"network-check-target-55bgp\" (UID: \"94d54b6e-1e0b-4ee4-b498-c5e45fb87940\") " 
pod="openshift-network-diagnostics/network-check-target-55bgp" Apr 17 17:07:44.430010 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429084 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-etc-sysctl-d\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx" Apr 17 17:07:44.430010 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429100 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-host-cni-netd\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.430010 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429112 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/36e5ce0d-a66e-4ad4-bf76-a0b0c1e1890b-host\") pod \"node-ca-2f5xx\" (UID: \"36e5ce0d-a66e-4ad4-bf76-a0b0c1e1890b\") " pod="openshift-image-registry/node-ca-2f5xx" Apr 17 17:07:44.430010 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429115 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftv5p\" (UniqueName: \"kubernetes.io/projected/ac8a025c-d2a3-45bf-abc9-c3470d14e41a-kube-api-access-ftv5p\") pod \"aws-ebs-csi-driver-node-gh4nl\" (UID: \"ac8a025c-d2a3-45bf-abc9-c3470d14e41a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl" Apr 17 17:07:44.430010 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429140 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fpjnc\" (UniqueName: 
\"kubernetes.io/projected/aed051f5-a966-45e4-867e-3841c1814af1-kube-api-access-fpjnc\") pod \"multus-additional-cni-plugins-9c486\" (UID: \"aed051f5-a966-45e4-867e-3841c1814af1\") " pod="openshift-multus/multus-additional-cni-plugins-9c486" Apr 17 17:07:44.430010 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429167 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-sys\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx" Apr 17 17:07:44.430010 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429183 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-etc-openvswitch\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.430010 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429197 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-host-cni-bin\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.430010 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429210 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-os-release\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.430010 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429229 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/10abc3c7-cda3-4034-9da8-0c4935e396af-host-slash\") pod \"iptables-alerter-qkmhj\" (UID: \"10abc3c7-cda3-4034-9da8-0c4935e396af\") " pod="openshift-network-operator/iptables-alerter-qkmhj" Apr 17 17:07:44.430010 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429250 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d5fpk\" (UniqueName: \"kubernetes.io/projected/10abc3c7-cda3-4034-9da8-0c4935e396af-kube-api-access-d5fpk\") pod \"iptables-alerter-qkmhj\" (UID: \"10abc3c7-cda3-4034-9da8-0c4935e396af\") " pod="openshift-network-operator/iptables-alerter-qkmhj" Apr 17 17:07:44.430010 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429266 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-env-overrides\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.430010 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429279 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-host-run-netns\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.430010 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429285 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/10abc3c7-cda3-4034-9da8-0c4935e396af-host-slash\") pod \"iptables-alerter-qkmhj\" (UID: \"10abc3c7-cda3-4034-9da8-0c4935e396af\") " pod="openshift-network-operator/iptables-alerter-qkmhj" Apr 17 17:07:44.430010 ip-10-0-132-98 kubenswrapper[2574]: I0417 
17:07:44.429294 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ac8a025c-d2a3-45bf-abc9-c3470d14e41a-registration-dir\") pod \"aws-ebs-csi-driver-node-gh4nl\" (UID: \"ac8a025c-d2a3-45bf-abc9-c3470d14e41a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl" Apr 17 17:07:44.430570 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429249 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/aed051f5-a966-45e4-867e-3841c1814af1-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-9c486\" (UID: \"aed051f5-a966-45e4-867e-3841c1814af1\") " pod="openshift-multus/multus-additional-cni-plugins-9c486" Apr 17 17:07:44.430570 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429309 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-etc-systemd\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx" Apr 17 17:07:44.430570 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429329 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/26531bc0-743e-4e55-9c40-fe448eea598c-tmp\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx" Apr 17 17:07:44.430570 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429342 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-host-slash\") pod \"ovnkube-node-vbndw\" (UID: 
\"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.430570 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429355 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-cnibin\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.430570 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429370 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ac8a025c-d2a3-45bf-abc9-c3470d14e41a-device-dir\") pod \"aws-ebs-csi-driver-node-gh4nl\" (UID: \"ac8a025c-d2a3-45bf-abc9-c3470d14e41a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl" Apr 17 17:07:44.430570 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429394 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/36e5ce0d-a66e-4ad4-bf76-a0b0c1e1890b-serviceca\") pod \"node-ca-2f5xx\" (UID: \"36e5ce0d-a66e-4ad4-bf76-a0b0c1e1890b\") " pod="openshift-image-registry/node-ca-2f5xx" Apr 17 17:07:44.430570 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429405 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1dd8584b-d217-441a-a0d1-e1b86328dfe2-metrics-certs\") pod \"network-metrics-daemon-8zxs7\" (UID: \"1dd8584b-d217-441a-a0d1-e1b86328dfe2\") " pod="openshift-multus/network-metrics-daemon-8zxs7" Apr 17 17:07:44.430570 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429427 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-host-run-netns\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.430570 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429461 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-run-systemd\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.430570 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429502 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-ovn-node-metrics-cert\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.430570 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429532 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9dcfc\" (UniqueName: \"kubernetes.io/projected/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-kube-api-access-9dcfc\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.430570 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429565 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-multus-conf-dir\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.430570 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429637 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xp45r\" (UniqueName: \"kubernetes.io/projected/f40d551d-7b2b-4e50-afe2-fa8be6462803-kube-api-access-xp45r\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.430570 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429672 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b5a9dd27-c914-41cc-88fc-5a64c1169c04-tmp-dir\") pod \"node-resolver-52mt7\" (UID: \"b5a9dd27-c914-41cc-88fc-5a64c1169c04\") " pod="openshift-dns/node-resolver-52mt7" Apr 17 17:07:44.430570 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429697 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-etc-modprobe-d\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx" Apr 17 17:07:44.430570 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429716 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/26531bc0-743e-4e55-9c40-fe448eea598c-etc-tuned\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx" Apr 17 17:07:44.431097 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.429737 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-node-log\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.431097 ip-10-0-132-98 
kubenswrapper[2574]: I0417 17:07:44.429758 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-system-cni-dir\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj"
Apr 17 17:07:44.431097 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.430034 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b5a9dd27-c914-41cc-88fc-5a64c1169c04-tmp-dir\") pod \"node-resolver-52mt7\" (UID: \"b5a9dd27-c914-41cc-88fc-5a64c1169c04\") " pod="openshift-dns/node-resolver-52mt7"
Apr 17 17:07:44.443553 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.443524 2574 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 17 17:07:44.446770 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.446738 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpjnc\" (UniqueName: \"kubernetes.io/projected/aed051f5-a966-45e4-867e-3841c1814af1-kube-api-access-fpjnc\") pod \"multus-additional-cni-plugins-9c486\" (UID: \"aed051f5-a966-45e4-867e-3841c1814af1\") " pod="openshift-multus/multus-additional-cni-plugins-9c486"
Apr 17 17:07:44.446770 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.446758 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr6gj\" (UniqueName: \"kubernetes.io/projected/b5a9dd27-c914-41cc-88fc-5a64c1169c04-kube-api-access-xr6gj\") pod \"node-resolver-52mt7\" (UID: \"b5a9dd27-c914-41cc-88fc-5a64c1169c04\") " pod="openshift-dns/node-resolver-52mt7"
Apr 17 17:07:44.446872 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.446829 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5fpk\" (UniqueName: \"kubernetes.io/projected/10abc3c7-cda3-4034-9da8-0c4935e396af-kube-api-access-d5fpk\") pod \"iptables-alerter-qkmhj\" (UID: \"10abc3c7-cda3-4034-9da8-0c4935e396af\") " pod="openshift-network-operator/iptables-alerter-qkmhj"
Apr 17 17:07:44.447447 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.447426 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnjx9\" (UniqueName: \"kubernetes.io/projected/36e5ce0d-a66e-4ad4-bf76-a0b0c1e1890b-kube-api-access-vnjx9\") pod \"node-ca-2f5xx\" (UID: \"36e5ce0d-a66e-4ad4-bf76-a0b0c1e1890b\") " pod="openshift-image-registry/node-ca-2f5xx"
Apr 17 17:07:44.530125 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530092 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tkm7c\" (UniqueName: \"kubernetes.io/projected/94d54b6e-1e0b-4ee4-b498-c5e45fb87940-kube-api-access-tkm7c\") pod \"network-check-target-55bgp\" (UID: \"94d54b6e-1e0b-4ee4-b498-c5e45fb87940\") " pod="openshift-network-diagnostics/network-check-target-55bgp"
Apr 17 17:07:44.530125 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530129 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-etc-sysctl-d\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx"
Apr 17 17:07:44.530327 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530150 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-host-cni-netd\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.530327 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530302 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-etc-sysctl-d\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx"
Apr 17 17:07:44.530415 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530326 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ftv5p\" (UniqueName: \"kubernetes.io/projected/ac8a025c-d2a3-45bf-abc9-c3470d14e41a-kube-api-access-ftv5p\") pod \"aws-ebs-csi-driver-node-gh4nl\" (UID: \"ac8a025c-d2a3-45bf-abc9-c3470d14e41a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl"
Apr 17 17:07:44.530415 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530300 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-host-cni-netd\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.530506 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530452 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-sys\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx"
Apr 17 17:07:44.530506 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530493 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-etc-openvswitch\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.530583 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530512 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-host-cni-bin\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.530583 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530537 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-os-release\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj"
Apr 17 17:07:44.530583 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530545 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-sys\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx"
Apr 17 17:07:44.530583 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530564 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-env-overrides\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.530583 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530555 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-etc-openvswitch\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.530583 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530568 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-host-cni-bin\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.530809 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530589 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-host-run-netns\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj"
Apr 17 17:07:44.530809 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530629 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ac8a025c-d2a3-45bf-abc9-c3470d14e41a-registration-dir\") pod \"aws-ebs-csi-driver-node-gh4nl\" (UID: \"ac8a025c-d2a3-45bf-abc9-c3470d14e41a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl"
Apr 17 17:07:44.530809 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530651 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-etc-systemd\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx"
Apr 17 17:07:44.530809 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530673 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/26531bc0-743e-4e55-9c40-fe448eea598c-tmp\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx"
Apr 17 17:07:44.530809 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530670 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-host-run-netns\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj"
Apr 17 17:07:44.530809 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530723 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ac8a025c-d2a3-45bf-abc9-c3470d14e41a-registration-dir\") pod \"aws-ebs-csi-driver-node-gh4nl\" (UID: \"ac8a025c-d2a3-45bf-abc9-c3470d14e41a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl"
Apr 17 17:07:44.530809 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530730 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-etc-systemd\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx"
Apr 17 17:07:44.530809 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530758 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-host-slash\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.530809 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530786 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-cnibin\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj"
Apr 17 17:07:44.531164 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530810 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ac8a025c-d2a3-45bf-abc9-c3470d14e41a-device-dir\") pod \"aws-ebs-csi-driver-node-gh4nl\" (UID: \"ac8a025c-d2a3-45bf-abc9-c3470d14e41a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl"
Apr 17 17:07:44.531164 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530826 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-os-release\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj"
Apr 17 17:07:44.531164 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530838 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1dd8584b-d217-441a-a0d1-e1b86328dfe2-metrics-certs\") pod \"network-metrics-daemon-8zxs7\" (UID: \"1dd8584b-d217-441a-a0d1-e1b86328dfe2\") " pod="openshift-multus/network-metrics-daemon-8zxs7"
Apr 17 17:07:44.531164 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530864 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-host-run-netns\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.531164 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530893 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-run-systemd\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.531164 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530915 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-ovn-node-metrics-cert\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.531164 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530939 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9dcfc\" (UniqueName: \"kubernetes.io/projected/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-kube-api-access-9dcfc\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.531164 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.531034 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-env-overrides\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.531164 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.531089 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-host-run-netns\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.531164 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.531129 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/ac8a025c-d2a3-45bf-abc9-c3470d14e41a-device-dir\") pod \"aws-ebs-csi-driver-node-gh4nl\" (UID: \"ac8a025c-d2a3-45bf-abc9-c3470d14e41a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl"
Apr 17 17:07:44.531164 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.531162 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-run-systemd\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.531653 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:44.531208 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:07:44.531653 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:44.531299 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1dd8584b-d217-441a-a0d1-e1b86328dfe2-metrics-certs podName:1dd8584b-d217-441a-a0d1-e1b86328dfe2 nodeName:}" failed. No retries permitted until 2026-04-17 17:07:45.031253886 +0000 UTC m=+3.103841082 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1dd8584b-d217-441a-a0d1-e1b86328dfe2-metrics-certs") pod "network-metrics-daemon-8zxs7" (UID: "1dd8584b-d217-441a-a0d1-e1b86328dfe2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:07:44.531653 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.531320 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-multus-conf-dir\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj"
Apr 17 17:07:44.531653 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.531352 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xp45r\" (UniqueName: \"kubernetes.io/projected/f40d551d-7b2b-4e50-afe2-fa8be6462803-kube-api-access-xp45r\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj"
Apr 17 17:07:44.531653 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.531391 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-multus-conf-dir\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj"
Apr 17 17:07:44.531653 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530901 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-cnibin\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj"
Apr 17 17:07:44.531653 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.530990 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-host-slash\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.532937 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.531974 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-etc-modprobe-d\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx"
Apr 17 17:07:44.532937 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.532162 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/26531bc0-743e-4e55-9c40-fe448eea598c-etc-tuned\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx"
Apr 17 17:07:44.532937 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.532208 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-node-log\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.532937 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.532243 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-system-cni-dir\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj"
Apr 17 17:07:44.532937 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.532299 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-node-log\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.532937 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.532379 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-multus-socket-dir-parent\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj"
Apr 17 17:07:44.532937 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.532413 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ac8a025c-d2a3-45bf-abc9-c3470d14e41a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gh4nl\" (UID: \"ac8a025c-d2a3-45bf-abc9-c3470d14e41a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl"
Apr 17 17:07:44.532937 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.532434 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ac8a025c-d2a3-45bf-abc9-c3470d14e41a-etc-selinux\") pod \"aws-ebs-csi-driver-node-gh4nl\" (UID: \"ac8a025c-d2a3-45bf-abc9-c3470d14e41a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl"
Apr 17 17:07:44.532937 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.532464 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3be1e497-a7c7-4b5d-be1e-1fd8df6acc62-agent-certs\") pod \"konnectivity-agent-hrljp\" (UID: \"3be1e497-a7c7-4b5d-be1e-1fd8df6acc62\") " pod="kube-system/konnectivity-agent-hrljp"
Apr 17 17:07:44.532937 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.532471 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-system-cni-dir\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj"
Apr 17 17:07:44.532937 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.532485 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-var-lib-kubelet\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx"
Apr 17 17:07:44.532937 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.532506 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-run-openvswitch\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.532937 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.532529 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.532937 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.532537 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ac8a025c-d2a3-45bf-abc9-c3470d14e41a-kubelet-dir\") pod \"aws-ebs-csi-driver-node-gh4nl\" (UID: \"ac8a025c-d2a3-45bf-abc9-c3470d14e41a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl"
Apr 17 17:07:44.532937 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.532588 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-multus-socket-dir-parent\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj"
Apr 17 17:07:44.532937 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.532653 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-host-var-lib-cni-multus\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj"
Apr 17 17:07:44.532937 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.532704 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/ac8a025c-d2a3-45bf-abc9-c3470d14e41a-etc-selinux\") pod \"aws-ebs-csi-driver-node-gh4nl\" (UID: \"ac8a025c-d2a3-45bf-abc9-c3470d14e41a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl"
Apr 17 17:07:44.533736 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.532798 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-etc-modprobe-d\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx"
Apr 17 17:07:44.533736 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.532842 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-run-openvswitch\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.533736 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.532894 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-var-lib-kubelet\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx"
Apr 17 17:07:44.533736 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.532954 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.533916 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.532556 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-host-var-lib-cni-multus\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj"
Apr 17 17:07:44.534024 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.533950 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-ovnkube-config\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.534024 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.533985 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-ovnkube-script-lib\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.534024 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534015 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-hostroot\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj"
Apr 17 17:07:44.534156 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534043 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ac8a025c-d2a3-45bf-abc9-c3470d14e41a-sys-fs\") pod \"aws-ebs-csi-driver-node-gh4nl\" (UID: \"ac8a025c-d2a3-45bf-abc9-c3470d14e41a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl"
Apr 17 17:07:44.534156 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534067 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-run\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx"
Apr 17 17:07:44.534156 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534094 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-lib-modules\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx"
Apr 17 17:07:44.534156 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534124 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-host-run-ovn-kubernetes\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.534156 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534154 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-etc-kubernetes\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx"
Apr 17 17:07:44.534356 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534184 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-var-lib-openvswitch\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.534356 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534212 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ac8a025c-d2a3-45bf-abc9-c3470d14e41a-socket-dir\") pod \"aws-ebs-csi-driver-node-gh4nl\" (UID: \"ac8a025c-d2a3-45bf-abc9-c3470d14e41a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl"
Apr 17 17:07:44.534356 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534237 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-etc-sysconfig\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx"
Apr 17 17:07:44.534356 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534265 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-host\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx"
Apr 17 17:07:44.534356 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534292 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-host-run-k8s-cni-cncf-io\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj"
Apr 17 17:07:44.534356 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534323 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqcmq\" (UniqueName: \"kubernetes.io/projected/1dd8584b-d217-441a-a0d1-e1b86328dfe2-kube-api-access-rqcmq\") pod \"network-metrics-daemon-8zxs7\" (UID: \"1dd8584b-d217-441a-a0d1-e1b86328dfe2\") " pod="openshift-multus/network-metrics-daemon-8zxs7"
Apr 17 17:07:44.534356 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534350 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-host-kubelet\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.534662 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534374 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-systemd-units\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.534662 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534402 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f40d551d-7b2b-4e50-afe2-fa8be6462803-multus-daemon-config\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj"
Apr 17 17:07:44.534662 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534432 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3be1e497-a7c7-4b5d-be1e-1fd8df6acc62-konnectivity-ca\") pod \"konnectivity-agent-hrljp\" (UID: \"3be1e497-a7c7-4b5d-be1e-1fd8df6acc62\") " pod="kube-system/konnectivity-agent-hrljp"
Apr 17 17:07:44.534662 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534461 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-run-ovn\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.534662 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534493 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-etc-sysctl-conf\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx"
Apr 17 17:07:44.534662 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534497 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-ovn-node-metrics-cert\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.534662 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534522 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4gm9\" (UniqueName: \"kubernetes.io/projected/26531bc0-743e-4e55-9c40-fe448eea598c-kube-api-access-z4gm9\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx"
Apr 17 17:07:44.534662 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534554 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-multus-cni-dir\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj"
Apr 17 17:07:44.534662 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534583 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-host-var-lib-cni-bin\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj"
Apr 17 17:07:44.534662 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534634 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-host-var-lib-kubelet\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj"
Apr 17 17:07:44.534662 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534664 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-host-run-multus-certs\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj"
Apr 17 17:07:44.535109 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534688 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-log-socket\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:07:44.535109 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534724 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f40d551d-7b2b-4e50-afe2-fa8be6462803-cni-binary-copy\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj"
Apr 17 17:07:44.535109 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534749 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-etc-kubernetes\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj"
Apr 17 17:07:44.535109 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534842 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-etc-kubernetes\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") "
pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.535109 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534902 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-host\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx" Apr 17 17:07:44.535109 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.534944 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-host-run-k8s-cni-cncf-io\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.535109 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.535037 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-ovnkube-script-lib\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.535109 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.535102 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-hostroot\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.535410 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.535161 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/ac8a025c-d2a3-45bf-abc9-c3470d14e41a-sys-fs\") pod \"aws-ebs-csi-driver-node-gh4nl\" (UID: \"ac8a025c-d2a3-45bf-abc9-c3470d14e41a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl" Apr 17 17:07:44.535410 
ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.535164 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/26531bc0-743e-4e55-9c40-fe448eea598c-etc-tuned\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx" Apr 17 17:07:44.535410 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.535181 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-host-kubelet\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.535410 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.535212 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-run\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx" Apr 17 17:07:44.535410 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.535227 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-systemd-units\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.535410 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.535257 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-multus-cni-dir\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.535410 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.535301 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-host-var-lib-cni-bin\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.535410 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.535311 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-lib-modules\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx" Apr 17 17:07:44.535410 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.535363 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-var-lib-openvswitch\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.535410 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.535407 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-host-var-lib-kubelet\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.535834 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.535445 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/f40d551d-7b2b-4e50-afe2-fa8be6462803-host-run-multus-certs\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.535834 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.535737 2574 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/3be1e497-a7c7-4b5d-be1e-1fd8df6acc62-konnectivity-ca\") pod \"konnectivity-agent-hrljp\" (UID: \"3be1e497-a7c7-4b5d-be1e-1fd8df6acc62\") " pod="kube-system/konnectivity-agent-hrljp" Apr 17 17:07:44.535834 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.535803 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-host-run-ovn-kubernetes\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.535834 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.535818 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/26531bc0-743e-4e55-9c40-fe448eea598c-tmp\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx" Apr 17 17:07:44.536070 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.535859 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f40d551d-7b2b-4e50-afe2-fa8be6462803-cni-binary-copy\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.536070 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.535864 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-etc-kubernetes\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx" Apr 17 17:07:44.536070 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.535919 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" 
(UniqueName: \"kubernetes.io/configmap/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-ovnkube-config\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.536070 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.535934 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-log-socket\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.536070 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.535969 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-etc-sysctl-conf\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx" Apr 17 17:07:44.536070 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.535977 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-run-ovn\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.536070 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.535980 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/26531bc0-743e-4e55-9c40-fe448eea598c-etc-sysconfig\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx" Apr 17 17:07:44.536410 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.536147 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/ac8a025c-d2a3-45bf-abc9-c3470d14e41a-socket-dir\") pod \"aws-ebs-csi-driver-node-gh4nl\" (UID: \"ac8a025c-d2a3-45bf-abc9-c3470d14e41a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl" Apr 17 17:07:44.536469 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.536441 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/f40d551d-7b2b-4e50-afe2-fa8be6462803-multus-daemon-config\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.536518 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.536464 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/3be1e497-a7c7-4b5d-be1e-1fd8df6acc62-agent-certs\") pod \"konnectivity-agent-hrljp\" (UID: \"3be1e497-a7c7-4b5d-be1e-1fd8df6acc62\") " pod="kube-system/konnectivity-agent-hrljp" Apr 17 17:07:44.536834 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:44.536814 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:07:44.536907 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:44.536846 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:07:44.536907 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:44.536860 2574 projected.go:194] Error preparing data for projected volume kube-api-access-tkm7c for pod openshift-network-diagnostics/network-check-target-55bgp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:07:44.536993 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:44.536923 2574 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/94d54b6e-1e0b-4ee4-b498-c5e45fb87940-kube-api-access-tkm7c podName:94d54b6e-1e0b-4ee4-b498-c5e45fb87940 nodeName:}" failed. No retries permitted until 2026-04-17 17:07:45.0369056 +0000 UTC m=+3.109492776 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-tkm7c" (UniqueName: "kubernetes.io/projected/94d54b6e-1e0b-4ee4-b498-c5e45fb87940-kube-api-access-tkm7c") pod "network-check-target-55bgp" (UID: "94d54b6e-1e0b-4ee4-b498-c5e45fb87940") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:07:44.538976 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.538949 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9dcfc\" (UniqueName: \"kubernetes.io/projected/fb032f94-bfb7-47c8-b2bb-9e3c7a412058-kube-api-access-9dcfc\") pod \"ovnkube-node-vbndw\" (UID: \"fb032f94-bfb7-47c8-b2bb-9e3c7a412058\") " pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.539429 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.539409 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftv5p\" (UniqueName: \"kubernetes.io/projected/ac8a025c-d2a3-45bf-abc9-c3470d14e41a-kube-api-access-ftv5p\") pod \"aws-ebs-csi-driver-node-gh4nl\" (UID: \"ac8a025c-d2a3-45bf-abc9-c3470d14e41a\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl" Apr 17 17:07:44.541450 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.541431 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xp45r\" (UniqueName: \"kubernetes.io/projected/f40d551d-7b2b-4e50-afe2-fa8be6462803-kube-api-access-xp45r\") pod \"multus-bhrfj\" (UID: \"f40d551d-7b2b-4e50-afe2-fa8be6462803\") " pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.543341 ip-10-0-132-98 
kubenswrapper[2574]: I0417 17:07:44.543314 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqcmq\" (UniqueName: \"kubernetes.io/projected/1dd8584b-d217-441a-a0d1-e1b86328dfe2-kube-api-access-rqcmq\") pod \"network-metrics-daemon-8zxs7\" (UID: \"1dd8584b-d217-441a-a0d1-e1b86328dfe2\") " pod="openshift-multus/network-metrics-daemon-8zxs7" Apr 17 17:07:44.543448 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.543435 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4gm9\" (UniqueName: \"kubernetes.io/projected/26531bc0-743e-4e55-9c40-fe448eea598c-kube-api-access-z4gm9\") pod \"tuned-4flxx\" (UID: \"26531bc0-743e-4e55-9c40-fe448eea598c\") " pod="openshift-cluster-node-tuning-operator/tuned-4flxx" Apr 17 17:07:44.618791 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.618706 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qkmhj" Apr 17 17:07:44.627281 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.627256 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-52mt7" Apr 17 17:07:44.634753 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.634731 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-2f5xx" Apr 17 17:07:44.643295 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.643272 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-9c486" Apr 17 17:07:44.648762 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.648744 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-hrljp" Apr 17 17:07:44.655250 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.655228 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-bhrfj" Apr 17 17:07:44.661840 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.661824 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-4flxx" Apr 17 17:07:44.666415 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.666397 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:07:44.672020 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:44.671993 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl" Apr 17 17:07:45.028962 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:45.028777 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3be1e497_a7c7_4b5d_be1e_1fd8df6acc62.slice/crio-4e272ac05e1855805f725263a5eb6206a8f3263426416727a6939cf801df51f9 WatchSource:0}: Error finding container 4e272ac05e1855805f725263a5eb6206a8f3263426416727a6939cf801df51f9: Status 404 returned error can't find the container with id 4e272ac05e1855805f725263a5eb6206a8f3263426416727a6939cf801df51f9 Apr 17 17:07:45.031115 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:45.031080 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26531bc0_743e_4e55_9c40_fe448eea598c.slice/crio-b55e0a532989979d7b755e755a5fb9c8267d9577f6740310225bccad591e985f WatchSource:0}: Error finding container b55e0a532989979d7b755e755a5fb9c8267d9577f6740310225bccad591e985f: Status 404 returned error can't find the container with id b55e0a532989979d7b755e755a5fb9c8267d9577f6740310225bccad591e985f Apr 17 17:07:45.034047 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:45.034024 2574 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac8a025c_d2a3_45bf_abc9_c3470d14e41a.slice/crio-0dea1f1b08cee541cf9a1349155ec7aa6e0085b6e07be85155466742f05a33b6 WatchSource:0}: Error finding container 0dea1f1b08cee541cf9a1349155ec7aa6e0085b6e07be85155466742f05a33b6: Status 404 returned error can't find the container with id 0dea1f1b08cee541cf9a1349155ec7aa6e0085b6e07be85155466742f05a33b6 Apr 17 17:07:45.035084 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:45.035066 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb032f94_bfb7_47c8_b2bb_9e3c7a412058.slice/crio-8c78374a6b3c2d771fc55582986ac292620f823ab8191ad993ea1ea6e4e82f70 WatchSource:0}: Error finding container 8c78374a6b3c2d771fc55582986ac292620f823ab8191ad993ea1ea6e4e82f70: Status 404 returned error can't find the container with id 8c78374a6b3c2d771fc55582986ac292620f823ab8191ad993ea1ea6e4e82f70 Apr 17 17:07:45.038138 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:45.036634 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36e5ce0d_a66e_4ad4_bf76_a0b0c1e1890b.slice/crio-253295f3e9c2e6e9833511bd8ddedaf134909553edacc00b006e3189f11fc132 WatchSource:0}: Error finding container 253295f3e9c2e6e9833511bd8ddedaf134909553edacc00b006e3189f11fc132: Status 404 returned error can't find the container with id 253295f3e9c2e6e9833511bd8ddedaf134909553edacc00b006e3189f11fc132 Apr 17 17:07:45.038138 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:45.037295 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tkm7c\" (UniqueName: \"kubernetes.io/projected/94d54b6e-1e0b-4ee4-b498-c5e45fb87940-kube-api-access-tkm7c\") pod \"network-check-target-55bgp\" (UID: \"94d54b6e-1e0b-4ee4-b498-c5e45fb87940\") " pod="openshift-network-diagnostics/network-check-target-55bgp" Apr 17 17:07:45.038138 ip-10-0-132-98 
kubenswrapper[2574]: I0417 17:07:45.037342 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1dd8584b-d217-441a-a0d1-e1b86328dfe2-metrics-certs\") pod \"network-metrics-daemon-8zxs7\" (UID: \"1dd8584b-d217-441a-a0d1-e1b86328dfe2\") " pod="openshift-multus/network-metrics-daemon-8zxs7" Apr 17 17:07:45.038138 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:45.037450 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:07:45.038138 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:45.037498 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1dd8584b-d217-441a-a0d1-e1b86328dfe2-metrics-certs podName:1dd8584b-d217-441a-a0d1-e1b86328dfe2 nodeName:}" failed. No retries permitted until 2026-04-17 17:07:46.037480692 +0000 UTC m=+4.110067877 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1dd8584b-d217-441a-a0d1-e1b86328dfe2-metrics-certs") pod "network-metrics-daemon-8zxs7" (UID: "1dd8584b-d217-441a-a0d1-e1b86328dfe2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:07:45.038138 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:45.037569 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:07:45.038138 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:45.037581 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:07:45.038138 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:45.037592 2574 projected.go:194] Error preparing data for projected volume kube-api-access-tkm7c for pod 
openshift-network-diagnostics/network-check-target-55bgp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:07:45.038138 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:45.037669 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/94d54b6e-1e0b-4ee4-b498-c5e45fb87940-kube-api-access-tkm7c podName:94d54b6e-1e0b-4ee4-b498-c5e45fb87940 nodeName:}" failed. No retries permitted until 2026-04-17 17:07:46.037656937 +0000 UTC m=+4.110244123 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-tkm7c" (UniqueName: "kubernetes.io/projected/94d54b6e-1e0b-4ee4-b498-c5e45fb87940-kube-api-access-tkm7c") pod "network-check-target-55bgp" (UID: "94d54b6e-1e0b-4ee4-b498-c5e45fb87940") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:07:45.040442 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:45.040415 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10abc3c7_cda3_4034_9da8_0c4935e396af.slice/crio-974159c2078a07670cfd3052f080b8ce5689d8503d4c6edd3612b76b17d7bf50 WatchSource:0}: Error finding container 974159c2078a07670cfd3052f080b8ce5689d8503d4c6edd3612b76b17d7bf50: Status 404 returned error can't find the container with id 974159c2078a07670cfd3052f080b8ce5689d8503d4c6edd3612b76b17d7bf50 Apr 17 17:07:45.040834 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:45.040803 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5a9dd27_c914_41cc_88fc_5a64c1169c04.slice/crio-c717dbca5b94b001937daca4dce59dc949c24a9a2198f4929dd3a818b5b6d612 WatchSource:0}: Error finding container 
c717dbca5b94b001937daca4dce59dc949c24a9a2198f4929dd3a818b5b6d612: Status 404 returned error can't find the container with id c717dbca5b94b001937daca4dce59dc949c24a9a2198f4929dd3a818b5b6d612 Apr 17 17:07:45.042050 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:45.042030 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf40d551d_7b2b_4e50_afe2_fa8be6462803.slice/crio-8e3684d8540900f63c3e0d6cc20007be03359f4b31d68e4cd625d6e60d14070a WatchSource:0}: Error finding container 8e3684d8540900f63c3e0d6cc20007be03359f4b31d68e4cd625d6e60d14070a: Status 404 returned error can't find the container with id 8e3684d8540900f63c3e0d6cc20007be03359f4b31d68e4cd625d6e60d14070a Apr 17 17:07:45.043723 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:07:45.043701 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaed051f5_a966_45e4_867e_3841c1814af1.slice/crio-1eee7ff750d09c678477fa2f5a24351d5583bead11ec63382ef20eb4b9cef84e WatchSource:0}: Error finding container 1eee7ff750d09c678477fa2f5a24351d5583bead11ec63382ef20eb4b9cef84e: Status 404 returned error can't find the container with id 1eee7ff750d09c678477fa2f5a24351d5583bead11ec63382ef20eb4b9cef84e Apr 17 17:07:45.391322 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:45.391223 2574 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:02:43 +0000 UTC" deadline="2027-10-16 11:47:48.372129469 +0000 UTC" Apr 17 17:07:45.391322 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:45.391257 2574 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13122h40m2.980875558s" Apr 17 17:07:45.439388 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:45.439349 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/iptables-alerter-qkmhj" event={"ID":"10abc3c7-cda3-4034-9da8-0c4935e396af","Type":"ContainerStarted","Data":"974159c2078a07670cfd3052f080b8ce5689d8503d4c6edd3612b76b17d7bf50"} Apr 17 17:07:45.440477 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:45.440449 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-52mt7" event={"ID":"b5a9dd27-c914-41cc-88fc-5a64c1169c04","Type":"ContainerStarted","Data":"c717dbca5b94b001937daca4dce59dc949c24a9a2198f4929dd3a818b5b6d612"} Apr 17 17:07:45.443839 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:45.443817 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-98.ec2.internal" event={"ID":"1953ee34347c78a8ab12e5fc6254beb6","Type":"ContainerStarted","Data":"053d45a5abd77912e19b3c8310db4bf6baba479098a5210cf5b7b5881d9428f7"} Apr 17 17:07:45.448424 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:45.448394 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bhrfj" event={"ID":"f40d551d-7b2b-4e50-afe2-fa8be6462803","Type":"ContainerStarted","Data":"8e3684d8540900f63c3e0d6cc20007be03359f4b31d68e4cd625d6e60d14070a"} Apr 17 17:07:45.453528 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:45.452721 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2f5xx" event={"ID":"36e5ce0d-a66e-4ad4-bf76-a0b0c1e1890b","Type":"ContainerStarted","Data":"253295f3e9c2e6e9833511bd8ddedaf134909553edacc00b006e3189f11fc132"} Apr 17 17:07:45.458296 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:45.458271 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" event={"ID":"fb032f94-bfb7-47c8-b2bb-9e3c7a412058","Type":"ContainerStarted","Data":"8c78374a6b3c2d771fc55582986ac292620f823ab8191ad993ea1ea6e4e82f70"} Apr 17 17:07:45.458296 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:45.458250 2574 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-98.ec2.internal" podStartSLOduration=2.458236468 podStartE2EDuration="2.458236468s" podCreationTimestamp="2026-04-17 17:07:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:07:45.457881099 +0000 UTC m=+3.530468285" watchObservedRunningTime="2026-04-17 17:07:45.458236468 +0000 UTC m=+3.530823657" Apr 17 17:07:45.459499 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:45.459475 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl" event={"ID":"ac8a025c-d2a3-45bf-abc9-c3470d14e41a","Type":"ContainerStarted","Data":"0dea1f1b08cee541cf9a1349155ec7aa6e0085b6e07be85155466742f05a33b6"} Apr 17 17:07:45.460968 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:45.460942 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4flxx" event={"ID":"26531bc0-743e-4e55-9c40-fe448eea598c","Type":"ContainerStarted","Data":"b55e0a532989979d7b755e755a5fb9c8267d9577f6740310225bccad591e985f"} Apr 17 17:07:45.462391 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:45.462364 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-hrljp" event={"ID":"3be1e497-a7c7-4b5d-be1e-1fd8df6acc62","Type":"ContainerStarted","Data":"4e272ac05e1855805f725263a5eb6206a8f3263426416727a6939cf801df51f9"} Apr 17 17:07:45.463707 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:45.463600 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9c486" event={"ID":"aed051f5-a966-45e4-867e-3841c1814af1","Type":"ContainerStarted","Data":"1eee7ff750d09c678477fa2f5a24351d5583bead11ec63382ef20eb4b9cef84e"} Apr 17 17:07:46.051698 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:46.050953 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tkm7c\" (UniqueName: \"kubernetes.io/projected/94d54b6e-1e0b-4ee4-b498-c5e45fb87940-kube-api-access-tkm7c\") pod \"network-check-target-55bgp\" (UID: \"94d54b6e-1e0b-4ee4-b498-c5e45fb87940\") " pod="openshift-network-diagnostics/network-check-target-55bgp" Apr 17 17:07:46.051698 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:46.051010 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1dd8584b-d217-441a-a0d1-e1b86328dfe2-metrics-certs\") pod \"network-metrics-daemon-8zxs7\" (UID: \"1dd8584b-d217-441a-a0d1-e1b86328dfe2\") " pod="openshift-multus/network-metrics-daemon-8zxs7" Apr 17 17:07:46.051698 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:46.051122 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:07:46.051698 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:46.051180 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1dd8584b-d217-441a-a0d1-e1b86328dfe2-metrics-certs podName:1dd8584b-d217-441a-a0d1-e1b86328dfe2 nodeName:}" failed. No retries permitted until 2026-04-17 17:07:48.05116287 +0000 UTC m=+6.123750049 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1dd8584b-d217-441a-a0d1-e1b86328dfe2-metrics-certs") pod "network-metrics-daemon-8zxs7" (UID: "1dd8584b-d217-441a-a0d1-e1b86328dfe2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:07:46.051698 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:46.051559 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:07:46.051698 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:46.051578 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:07:46.051698 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:46.051590 2574 projected.go:194] Error preparing data for projected volume kube-api-access-tkm7c for pod openshift-network-diagnostics/network-check-target-55bgp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:07:46.051698 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:46.051652 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/94d54b6e-1e0b-4ee4-b498-c5e45fb87940-kube-api-access-tkm7c podName:94d54b6e-1e0b-4ee4-b498-c5e45fb87940 nodeName:}" failed. No retries permitted until 2026-04-17 17:07:48.051638254 +0000 UTC m=+6.124225431 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tkm7c" (UniqueName: "kubernetes.io/projected/94d54b6e-1e0b-4ee4-b498-c5e45fb87940-kube-api-access-tkm7c") pod "network-check-target-55bgp" (UID: "94d54b6e-1e0b-4ee4-b498-c5e45fb87940") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:07:46.432700 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:46.432657 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zxs7" Apr 17 17:07:46.433154 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:46.432806 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8zxs7" podUID="1dd8584b-d217-441a-a0d1-e1b86328dfe2" Apr 17 17:07:46.433213 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:46.433177 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-55bgp" Apr 17 17:07:46.433279 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:46.433260 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-55bgp" podUID="94d54b6e-1e0b-4ee4-b498-c5e45fb87940" Apr 17 17:07:46.487794 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:46.487726 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-98.ec2.internal" event={"ID":"ed5ab418bd0178840310111ee0f0df93","Type":"ContainerDied","Data":"22a8b00ccb389265cdeb67ad944bda146e4cce99c3df1bd3bbb8c109c9ce0981"} Apr 17 17:07:46.488284 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:46.487596 2574 generic.go:358] "Generic (PLEG): container finished" podID="ed5ab418bd0178840310111ee0f0df93" containerID="22a8b00ccb389265cdeb67ad944bda146e4cce99c3df1bd3bbb8c109c9ce0981" exitCode=0 Apr 17 17:07:47.503891 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:47.503856 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-98.ec2.internal" event={"ID":"ed5ab418bd0178840310111ee0f0df93","Type":"ContainerStarted","Data":"e2235a28ffbd4e92a4dfb0c96347fc211425a33eeca7fc866b646c37e82b0c53"} Apr 17 17:07:47.524063 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:47.524007 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-98.ec2.internal" podStartSLOduration=4.523986906 podStartE2EDuration="4.523986906s" podCreationTimestamp="2026-04-17 17:07:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:07:47.52387845 +0000 UTC m=+5.596465657" watchObservedRunningTime="2026-04-17 17:07:47.523986906 +0000 UTC m=+5.596574084" Apr 17 17:07:48.067160 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:48.066313 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/1dd8584b-d217-441a-a0d1-e1b86328dfe2-metrics-certs\") pod \"network-metrics-daemon-8zxs7\" (UID: \"1dd8584b-d217-441a-a0d1-e1b86328dfe2\") " pod="openshift-multus/network-metrics-daemon-8zxs7" Apr 17 17:07:48.067160 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:48.066394 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tkm7c\" (UniqueName: \"kubernetes.io/projected/94d54b6e-1e0b-4ee4-b498-c5e45fb87940-kube-api-access-tkm7c\") pod \"network-check-target-55bgp\" (UID: \"94d54b6e-1e0b-4ee4-b498-c5e45fb87940\") " pod="openshift-network-diagnostics/network-check-target-55bgp" Apr 17 17:07:48.067160 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:48.066529 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:07:48.067160 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:48.066545 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:07:48.067160 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:48.066560 2574 projected.go:194] Error preparing data for projected volume kube-api-access-tkm7c for pod openshift-network-diagnostics/network-check-target-55bgp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:07:48.067160 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:48.066628 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/94d54b6e-1e0b-4ee4-b498-c5e45fb87940-kube-api-access-tkm7c podName:94d54b6e-1e0b-4ee4-b498-c5e45fb87940 nodeName:}" failed. No retries permitted until 2026-04-17 17:07:52.066593589 +0000 UTC m=+10.139180764 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tkm7c" (UniqueName: "kubernetes.io/projected/94d54b6e-1e0b-4ee4-b498-c5e45fb87940-kube-api-access-tkm7c") pod "network-check-target-55bgp" (UID: "94d54b6e-1e0b-4ee4-b498-c5e45fb87940") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:07:48.067160 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:48.067043 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:07:48.067160 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:48.067101 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1dd8584b-d217-441a-a0d1-e1b86328dfe2-metrics-certs podName:1dd8584b-d217-441a-a0d1-e1b86328dfe2 nodeName:}" failed. No retries permitted until 2026-04-17 17:07:52.067085258 +0000 UTC m=+10.139672434 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1dd8584b-d217-441a-a0d1-e1b86328dfe2-metrics-certs") pod "network-metrics-daemon-8zxs7" (UID: "1dd8584b-d217-441a-a0d1-e1b86328dfe2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:07:48.434195 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:48.434161 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zxs7" Apr 17 17:07:48.434381 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:48.434300 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8zxs7" podUID="1dd8584b-d217-441a-a0d1-e1b86328dfe2" Apr 17 17:07:48.434769 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:48.434644 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-55bgp" Apr 17 17:07:48.434769 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:48.434721 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-55bgp" podUID="94d54b6e-1e0b-4ee4-b498-c5e45fb87940" Apr 17 17:07:50.432296 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:50.432251 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zxs7" Apr 17 17:07:50.432792 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:50.432303 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-55bgp" Apr 17 17:07:50.432792 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:50.432383 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8zxs7" podUID="1dd8584b-d217-441a-a0d1-e1b86328dfe2" Apr 17 17:07:50.432792 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:50.432484 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-55bgp" podUID="94d54b6e-1e0b-4ee4-b498-c5e45fb87940" Apr 17 17:07:52.098912 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:52.098870 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tkm7c\" (UniqueName: \"kubernetes.io/projected/94d54b6e-1e0b-4ee4-b498-c5e45fb87940-kube-api-access-tkm7c\") pod \"network-check-target-55bgp\" (UID: \"94d54b6e-1e0b-4ee4-b498-c5e45fb87940\") " pod="openshift-network-diagnostics/network-check-target-55bgp" Apr 17 17:07:52.098912 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:52.098920 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1dd8584b-d217-441a-a0d1-e1b86328dfe2-metrics-certs\") pod \"network-metrics-daemon-8zxs7\" (UID: \"1dd8584b-d217-441a-a0d1-e1b86328dfe2\") " pod="openshift-multus/network-metrics-daemon-8zxs7" Apr 17 17:07:52.099381 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:52.099018 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:07:52.099381 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:52.099066 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1dd8584b-d217-441a-a0d1-e1b86328dfe2-metrics-certs podName:1dd8584b-d217-441a-a0d1-e1b86328dfe2 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:08:00.099052602 +0000 UTC m=+18.171639774 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1dd8584b-d217-441a-a0d1-e1b86328dfe2-metrics-certs") pod "network-metrics-daemon-8zxs7" (UID: "1dd8584b-d217-441a-a0d1-e1b86328dfe2") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:07:52.099381 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:52.099375 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:07:52.099483 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:52.099394 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:07:52.099483 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:52.099406 2574 projected.go:194] Error preparing data for projected volume kube-api-access-tkm7c for pod openshift-network-diagnostics/network-check-target-55bgp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:07:52.099483 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:52.099441 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/94d54b6e-1e0b-4ee4-b498-c5e45fb87940-kube-api-access-tkm7c podName:94d54b6e-1e0b-4ee4-b498-c5e45fb87940 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:00.09943079 +0000 UTC m=+18.172017962 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-tkm7c" (UniqueName: "kubernetes.io/projected/94d54b6e-1e0b-4ee4-b498-c5e45fb87940-kube-api-access-tkm7c") pod "network-check-target-55bgp" (UID: "94d54b6e-1e0b-4ee4-b498-c5e45fb87940") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:07:52.434597 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:52.434561 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-55bgp" Apr 17 17:07:52.434786 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:52.434676 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-55bgp" podUID="94d54b6e-1e0b-4ee4-b498-c5e45fb87940" Apr 17 17:07:52.434786 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:52.434732 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zxs7" Apr 17 17:07:52.434901 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:52.434804 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8zxs7" podUID="1dd8584b-d217-441a-a0d1-e1b86328dfe2" Apr 17 17:07:53.166584 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:53.166550 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-mqsj4"] Apr 17 17:07:53.169383 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:53.169364 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqsj4" Apr 17 17:07:53.169510 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:53.169438 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mqsj4" podUID="3f7f5ebc-482a-4d47-b381-626eaf721f89" Apr 17 17:07:53.207874 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:53.207838 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3f7f5ebc-482a-4d47-b381-626eaf721f89-dbus\") pod \"global-pull-secret-syncer-mqsj4\" (UID: \"3f7f5ebc-482a-4d47-b381-626eaf721f89\") " pod="kube-system/global-pull-secret-syncer-mqsj4" Apr 17 17:07:53.208030 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:53.207893 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3f7f5ebc-482a-4d47-b381-626eaf721f89-original-pull-secret\") pod \"global-pull-secret-syncer-mqsj4\" (UID: \"3f7f5ebc-482a-4d47-b381-626eaf721f89\") " pod="kube-system/global-pull-secret-syncer-mqsj4" Apr 17 17:07:53.208030 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:53.207916 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3f7f5ebc-482a-4d47-b381-626eaf721f89-kubelet-config\") pod \"global-pull-secret-syncer-mqsj4\" (UID: \"3f7f5ebc-482a-4d47-b381-626eaf721f89\") " pod="kube-system/global-pull-secret-syncer-mqsj4" Apr 17 17:07:53.308246 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:53.308208 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3f7f5ebc-482a-4d47-b381-626eaf721f89-original-pull-secret\") pod \"global-pull-secret-syncer-mqsj4\" (UID: \"3f7f5ebc-482a-4d47-b381-626eaf721f89\") " pod="kube-system/global-pull-secret-syncer-mqsj4" Apr 17 17:07:53.308400 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:53.308261 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3f7f5ebc-482a-4d47-b381-626eaf721f89-kubelet-config\") pod \"global-pull-secret-syncer-mqsj4\" (UID: \"3f7f5ebc-482a-4d47-b381-626eaf721f89\") " pod="kube-system/global-pull-secret-syncer-mqsj4" Apr 17 17:07:53.308400 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:53.308327 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:07:53.308521 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:53.308404 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3f7f5ebc-482a-4d47-b381-626eaf721f89-kubelet-config\") pod \"global-pull-secret-syncer-mqsj4\" (UID: \"3f7f5ebc-482a-4d47-b381-626eaf721f89\") " pod="kube-system/global-pull-secret-syncer-mqsj4" Apr 17 17:07:53.308521 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:53.308332 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3f7f5ebc-482a-4d47-b381-626eaf721f89-dbus\") pod 
\"global-pull-secret-syncer-mqsj4\" (UID: \"3f7f5ebc-482a-4d47-b381-626eaf721f89\") " pod="kube-system/global-pull-secret-syncer-mqsj4" Apr 17 17:07:53.308521 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:53.308435 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f7f5ebc-482a-4d47-b381-626eaf721f89-original-pull-secret podName:3f7f5ebc-482a-4d47-b381-626eaf721f89 nodeName:}" failed. No retries permitted until 2026-04-17 17:07:53.808414248 +0000 UTC m=+11.881001424 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3f7f5ebc-482a-4d47-b381-626eaf721f89-original-pull-secret") pod "global-pull-secret-syncer-mqsj4" (UID: "3f7f5ebc-482a-4d47-b381-626eaf721f89") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:07:53.308521 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:53.308458 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3f7f5ebc-482a-4d47-b381-626eaf721f89-dbus\") pod \"global-pull-secret-syncer-mqsj4\" (UID: \"3f7f5ebc-482a-4d47-b381-626eaf721f89\") " pod="kube-system/global-pull-secret-syncer-mqsj4" Apr 17 17:07:53.812974 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:53.812936 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3f7f5ebc-482a-4d47-b381-626eaf721f89-original-pull-secret\") pod \"global-pull-secret-syncer-mqsj4\" (UID: \"3f7f5ebc-482a-4d47-b381-626eaf721f89\") " pod="kube-system/global-pull-secret-syncer-mqsj4" Apr 17 17:07:53.813135 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:53.813056 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:07:53.813135 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:53.813108 2574 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/3f7f5ebc-482a-4d47-b381-626eaf721f89-original-pull-secret podName:3f7f5ebc-482a-4d47-b381-626eaf721f89 nodeName:}" failed. No retries permitted until 2026-04-17 17:07:54.813094727 +0000 UTC m=+12.885681899 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3f7f5ebc-482a-4d47-b381-626eaf721f89-original-pull-secret") pod "global-pull-secret-syncer-mqsj4" (UID: "3f7f5ebc-482a-4d47-b381-626eaf721f89") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:07:54.431677 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:54.431642 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zxs7" Apr 17 17:07:54.431677 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:54.431680 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-55bgp" Apr 17 17:07:54.432174 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:54.431772 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8zxs7" podUID="1dd8584b-d217-441a-a0d1-e1b86328dfe2" Apr 17 17:07:54.432174 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:54.431883 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-55bgp" podUID="94d54b6e-1e0b-4ee4-b498-c5e45fb87940" Apr 17 17:07:54.819711 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:54.819631 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3f7f5ebc-482a-4d47-b381-626eaf721f89-original-pull-secret\") pod \"global-pull-secret-syncer-mqsj4\" (UID: \"3f7f5ebc-482a-4d47-b381-626eaf721f89\") " pod="kube-system/global-pull-secret-syncer-mqsj4" Apr 17 17:07:54.819862 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:54.819764 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 17 17:07:54.819862 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:54.819819 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f7f5ebc-482a-4d47-b381-626eaf721f89-original-pull-secret podName:3f7f5ebc-482a-4d47-b381-626eaf721f89 nodeName:}" failed. No retries permitted until 2026-04-17 17:07:56.819803599 +0000 UTC m=+14.892390772 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3f7f5ebc-482a-4d47-b381-626eaf721f89-original-pull-secret") pod "global-pull-secret-syncer-mqsj4" (UID: "3f7f5ebc-482a-4d47-b381-626eaf721f89") : object "kube-system"/"original-pull-secret" not registered Apr 17 17:07:55.431600 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:55.431566 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqsj4" Apr 17 17:07:55.431784 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:55.431696 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mqsj4" podUID="3f7f5ebc-482a-4d47-b381-626eaf721f89" Apr 17 17:07:56.432361 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:56.432334 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-55bgp" Apr 17 17:07:56.432797 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:56.432329 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zxs7" Apr 17 17:07:56.432797 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:56.432559 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8zxs7" podUID="1dd8584b-d217-441a-a0d1-e1b86328dfe2" Apr 17 17:07:56.432797 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:56.432430 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-55bgp" podUID="94d54b6e-1e0b-4ee4-b498-c5e45fb87940"
Apr 17 17:07:56.833942 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:56.833862 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3f7f5ebc-482a-4d47-b381-626eaf721f89-original-pull-secret\") pod \"global-pull-secret-syncer-mqsj4\" (UID: \"3f7f5ebc-482a-4d47-b381-626eaf721f89\") " pod="kube-system/global-pull-secret-syncer-mqsj4"
Apr 17 17:07:56.834075 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:56.834026 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 17:07:56.834075 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:56.834072 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f7f5ebc-482a-4d47-b381-626eaf721f89-original-pull-secret podName:3f7f5ebc-482a-4d47-b381-626eaf721f89 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:00.834058698 +0000 UTC m=+18.906645870 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3f7f5ebc-482a-4d47-b381-626eaf721f89-original-pull-secret") pod "global-pull-secret-syncer-mqsj4" (UID: "3f7f5ebc-482a-4d47-b381-626eaf721f89") : object "kube-system"/"original-pull-secret" not registered
Apr 17 17:07:57.432000 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:57.431969 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqsj4"
Apr 17 17:07:57.432161 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:57.432072 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mqsj4" podUID="3f7f5ebc-482a-4d47-b381-626eaf721f89"
Apr 17 17:07:58.432408 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:58.432357 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zxs7"
Apr 17 17:07:58.432867 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:58.432358 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-55bgp"
Apr 17 17:07:58.432867 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:58.432500 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8zxs7" podUID="1dd8584b-d217-441a-a0d1-e1b86328dfe2"
Apr 17 17:07:58.432867 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:58.432530 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-55bgp" podUID="94d54b6e-1e0b-4ee4-b498-c5e45fb87940"
Apr 17 17:07:59.432331 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:07:59.432292 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqsj4"
Apr 17 17:07:59.432519 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:07:59.432420 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mqsj4" podUID="3f7f5ebc-482a-4d47-b381-626eaf721f89"
Apr 17 17:08:00.156588 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:00.156550 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tkm7c\" (UniqueName: \"kubernetes.io/projected/94d54b6e-1e0b-4ee4-b498-c5e45fb87940-kube-api-access-tkm7c\") pod \"network-check-target-55bgp\" (UID: \"94d54b6e-1e0b-4ee4-b498-c5e45fb87940\") " pod="openshift-network-diagnostics/network-check-target-55bgp"
Apr 17 17:08:00.156745 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:00.156628 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1dd8584b-d217-441a-a0d1-e1b86328dfe2-metrics-certs\") pod \"network-metrics-daemon-8zxs7\" (UID: \"1dd8584b-d217-441a-a0d1-e1b86328dfe2\") " pod="openshift-multus/network-metrics-daemon-8zxs7"
Apr 17 17:08:00.156745 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:00.156722 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:08:00.156745 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:00.156732 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:08:00.156745 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:00.156743 2574 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:08:00.156938 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:00.156752 2574 projected.go:194] Error preparing data for projected volume kube-api-access-tkm7c for pod openshift-network-diagnostics/network-check-target-55bgp: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:08:00.156938 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:00.156796 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/94d54b6e-1e0b-4ee4-b498-c5e45fb87940-kube-api-access-tkm7c podName:94d54b6e-1e0b-4ee4-b498-c5e45fb87940 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:16.15678254 +0000 UTC m=+34.229369712 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-tkm7c" (UniqueName: "kubernetes.io/projected/94d54b6e-1e0b-4ee4-b498-c5e45fb87940-kube-api-access-tkm7c") pod "network-check-target-55bgp" (UID: "94d54b6e-1e0b-4ee4-b498-c5e45fb87940") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:08:00.156938 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:00.156810 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1dd8584b-d217-441a-a0d1-e1b86328dfe2-metrics-certs podName:1dd8584b-d217-441a-a0d1-e1b86328dfe2 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:16.156803414 +0000 UTC m=+34.229390586 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1dd8584b-d217-441a-a0d1-e1b86328dfe2-metrics-certs") pod "network-metrics-daemon-8zxs7" (UID: "1dd8584b-d217-441a-a0d1-e1b86328dfe2") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:08:00.434443 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:00.434413 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-55bgp"
Apr 17 17:08:00.434889 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:00.434427 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zxs7"
Apr 17 17:08:00.434889 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:00.434522 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-55bgp" podUID="94d54b6e-1e0b-4ee4-b498-c5e45fb87940"
Apr 17 17:08:00.434889 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:00.434618 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8zxs7" podUID="1dd8584b-d217-441a-a0d1-e1b86328dfe2"
Apr 17 17:08:00.861672 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:00.861572 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3f7f5ebc-482a-4d47-b381-626eaf721f89-original-pull-secret\") pod \"global-pull-secret-syncer-mqsj4\" (UID: \"3f7f5ebc-482a-4d47-b381-626eaf721f89\") " pod="kube-system/global-pull-secret-syncer-mqsj4"
Apr 17 17:08:00.861823 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:00.861718 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 17:08:00.861823 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:00.861780 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f7f5ebc-482a-4d47-b381-626eaf721f89-original-pull-secret podName:3f7f5ebc-482a-4d47-b381-626eaf721f89 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:08.861762412 +0000 UTC m=+26.934349585 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3f7f5ebc-482a-4d47-b381-626eaf721f89-original-pull-secret") pod "global-pull-secret-syncer-mqsj4" (UID: "3f7f5ebc-482a-4d47-b381-626eaf721f89") : object "kube-system"/"original-pull-secret" not registered
Apr 17 17:08:01.432494 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:01.432462 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqsj4"
Apr 17 17:08:01.432669 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:01.432564 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mqsj4" podUID="3f7f5ebc-482a-4d47-b381-626eaf721f89"
Apr 17 17:08:02.435633 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:02.433120 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zxs7"
Apr 17 17:08:02.435633 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:02.433514 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8zxs7" podUID="1dd8584b-d217-441a-a0d1-e1b86328dfe2"
Apr 17 17:08:02.435633 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:02.433598 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-55bgp"
Apr 17 17:08:02.435633 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:02.433735 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-55bgp" podUID="94d54b6e-1e0b-4ee4-b498-c5e45fb87940"
Apr 17 17:08:02.530326 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:02.530296 2574 generic.go:358] "Generic (PLEG): container finished" podID="aed051f5-a966-45e4-867e-3841c1814af1" containerID="3be24b6227f877c7cd825ca215a3cd49f10679e3843ff135bfb29bf43b472613" exitCode=0
Apr 17 17:08:02.530441 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:02.530357 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9c486" event={"ID":"aed051f5-a966-45e4-867e-3841c1814af1","Type":"ContainerDied","Data":"3be24b6227f877c7cd825ca215a3cd49f10679e3843ff135bfb29bf43b472613"}
Apr 17 17:08:02.532160 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:02.532134 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-52mt7" event={"ID":"b5a9dd27-c914-41cc-88fc-5a64c1169c04","Type":"ContainerStarted","Data":"df7084a63b00cf940cea9ff3bc7aa15e7ba7e4cdf23cd009f3bffd04c6e7122e"}
Apr 17 17:08:02.533405 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:02.533377 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bhrfj" event={"ID":"f40d551d-7b2b-4e50-afe2-fa8be6462803","Type":"ContainerStarted","Data":"6b0c90cc7db42c041873d14d5f5006ff7ce5e296fb727975236d0aadab9ad233"}
Apr 17 17:08:02.535274 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:02.535249 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-2f5xx" event={"ID":"36e5ce0d-a66e-4ad4-bf76-a0b0c1e1890b","Type":"ContainerStarted","Data":"2a1fd644e79fee61aa7a6cc2b406734e9c1e126339ee57297dd6859419d76cc7"}
Apr 17 17:08:02.537423 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:02.537358 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vbndw_fb032f94-bfb7-47c8-b2bb-9e3c7a412058/ovn-acl-logging/0.log"
Apr 17 17:08:02.537754 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:02.537734 2574 generic.go:358] "Generic (PLEG): container finished" podID="fb032f94-bfb7-47c8-b2bb-9e3c7a412058" containerID="4291dd41c0ed4eea85918b8a074f12ed06a8f24148261cd001ccb2625dd0a116" exitCode=1
Apr 17 17:08:02.537835 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:02.537794 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" event={"ID":"fb032f94-bfb7-47c8-b2bb-9e3c7a412058","Type":"ContainerStarted","Data":"9f450601421a3404c761383a8ad9a3e5d320e8a0eb8211889ef00cf9571a5bb3"}
Apr 17 17:08:02.537835 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:02.537815 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" event={"ID":"fb032f94-bfb7-47c8-b2bb-9e3c7a412058","Type":"ContainerStarted","Data":"544683d3f81e6d486a49de19b886d3f7d9449253572cd7f69a72cb1023c42932"}
Apr 17 17:08:02.537835 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:02.537829 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" event={"ID":"fb032f94-bfb7-47c8-b2bb-9e3c7a412058","Type":"ContainerDied","Data":"4291dd41c0ed4eea85918b8a074f12ed06a8f24148261cd001ccb2625dd0a116"}
Apr 17 17:08:02.537994 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:02.537844 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" event={"ID":"fb032f94-bfb7-47c8-b2bb-9e3c7a412058","Type":"ContainerStarted","Data":"f5ca8c30dc360046a933b8597f57abbe5cd0a32bba43e512b77aebac9ab585df"}
Apr 17 17:08:02.539175 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:02.539152 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl" event={"ID":"ac8a025c-d2a3-45bf-abc9-c3470d14e41a","Type":"ContainerStarted","Data":"ef30077bbbe49b47044563daf25bd5d2cba2665b5beb02d4671b0d24d89e1e5f"}
Apr 17 17:08:02.540353 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:02.540330 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-4flxx" event={"ID":"26531bc0-743e-4e55-9c40-fe448eea598c","Type":"ContainerStarted","Data":"12b851a80d53113d701cf115942ae16e6917ca8efc24c6738ed3d85bd5632d81"}
Apr 17 17:08:02.541658 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:02.541635 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-hrljp" event={"ID":"3be1e497-a7c7-4b5d-be1e-1fd8df6acc62","Type":"ContainerStarted","Data":"2128ab75b67d57df30f8c50be4b6528203b30b7f142684a285b0602e345ccf0a"}
Apr 17 17:08:02.569061 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:02.568997 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-hrljp" podStartSLOduration=3.854417099 podStartE2EDuration="20.568978173s" podCreationTimestamp="2026-04-17 17:07:42 +0000 UTC" firstStartedPulling="2026-04-17 17:07:45.030574808 +0000 UTC m=+3.103161992" lastFinishedPulling="2026-04-17 17:08:01.745135893 +0000 UTC m=+19.817723066" observedRunningTime="2026-04-17 17:08:02.568436092 +0000 UTC m=+20.641023297" watchObservedRunningTime="2026-04-17 17:08:02.568978173 +0000 UTC m=+20.641565365"
Apr 17 17:08:02.584685 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:02.584643 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-4flxx" podStartSLOduration=3.871353895 podStartE2EDuration="20.58462765s" podCreationTimestamp="2026-04-17 17:07:42 +0000 UTC" firstStartedPulling="2026-04-17 17:07:45.033140836 +0000 UTC m=+3.105728008" lastFinishedPulling="2026-04-17 17:08:01.746414588 +0000 UTC m=+19.819001763" observedRunningTime="2026-04-17 17:08:02.58421909 +0000 UTC m=+20.656806286" watchObservedRunningTime="2026-04-17 17:08:02.58462765 +0000 UTC m=+20.657214838"
Apr 17 17:08:02.603729 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:02.603683 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bhrfj" podStartSLOduration=3.866350871 podStartE2EDuration="20.60366851s" podCreationTimestamp="2026-04-17 17:07:42 +0000 UTC" firstStartedPulling="2026-04-17 17:07:45.044555589 +0000 UTC m=+3.117142761" lastFinishedPulling="2026-04-17 17:08:01.78187321 +0000 UTC m=+19.854460400" observedRunningTime="2026-04-17 17:08:02.602878069 +0000 UTC m=+20.675465264" watchObservedRunningTime="2026-04-17 17:08:02.60366851 +0000 UTC m=+20.676255704"
Apr 17 17:08:02.625236 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:02.625180 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-52mt7" podStartSLOduration=3.923546219 podStartE2EDuration="20.625164577s" podCreationTimestamp="2026-04-17 17:07:42 +0000 UTC" firstStartedPulling="2026-04-17 17:07:45.043060413 +0000 UTC m=+3.115647599" lastFinishedPulling="2026-04-17 17:08:01.744678762 +0000 UTC m=+19.817265957" observedRunningTime="2026-04-17 17:08:02.62445638 +0000 UTC m=+20.697043575" watchObservedRunningTime="2026-04-17 17:08:02.625164577 +0000 UTC m=+20.697751961"
Apr 17 17:08:02.639370 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:02.639328 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-2f5xx" podStartSLOduration=8.317407144 podStartE2EDuration="20.639313716s" podCreationTimestamp="2026-04-17 17:07:42 +0000 UTC" firstStartedPulling="2026-04-17 17:07:45.03976686 +0000 UTC m=+3.112354039" lastFinishedPulling="2026-04-17 17:07:57.361673438 +0000 UTC m=+15.434260611" observedRunningTime="2026-04-17 17:08:02.639135898 +0000 UTC m=+20.711723092" watchObservedRunningTime="2026-04-17 17:08:02.639313716 +0000 UTC m=+20.711900909"
Apr 17 17:08:03.262820 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:03.262675 2574 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 17:08:03.392467 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:03.392321 2574 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T17:08:03.262815694Z","UUID":"6814815e-61d4-454a-89f0-575cdabbdb0a","Handler":null,"Name":"","Endpoint":""}
Apr 17 17:08:03.396560 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:03.394747 2574 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 17:08:03.396560 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:03.394799 2574 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 17:08:03.431895 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:03.431863 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqsj4"
Apr 17 17:08:03.432072 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:03.432027 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mqsj4" podUID="3f7f5ebc-482a-4d47-b381-626eaf721f89"
Apr 17 17:08:03.546662 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:03.546631 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vbndw_fb032f94-bfb7-47c8-b2bb-9e3c7a412058/ovn-acl-logging/0.log"
Apr 17 17:08:03.547144 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:03.547118 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" event={"ID":"fb032f94-bfb7-47c8-b2bb-9e3c7a412058","Type":"ContainerStarted","Data":"cc3b527fc825d7eed5072dd5c94c3e4f678301cd2c01acdb68d2a9aefe43fc76"}
Apr 17 17:08:03.547213 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:03.547157 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" event={"ID":"fb032f94-bfb7-47c8-b2bb-9e3c7a412058","Type":"ContainerStarted","Data":"7fa5be5e4942cecffe1fa30122ef5335bc2eaca8fe288d16c57b4ae38a39b7ac"}
Apr 17 17:08:03.548835 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:03.548805 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl" event={"ID":"ac8a025c-d2a3-45bf-abc9-c3470d14e41a","Type":"ContainerStarted","Data":"c41b47a061040e45f0af5e33e0d156740ec17f1a9de895b10dd73ed207f180a1"}
Apr 17 17:08:03.550407 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:03.550359 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qkmhj" event={"ID":"10abc3c7-cda3-4034-9da8-0c4935e396af","Type":"ContainerStarted","Data":"dd71c0fd5225f60a53487b07536c5d6c418e934d3655b5cbb576488e74fa676c"}
Apr 17 17:08:03.567037 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:03.566987 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-qkmhj" podStartSLOduration=9.247713832 podStartE2EDuration="21.56696972s" podCreationTimestamp="2026-04-17 17:07:42 +0000 UTC" firstStartedPulling="2026-04-17 17:07:45.042569104 +0000 UTC m=+3.115156289" lastFinishedPulling="2026-04-17 17:07:57.361825004 +0000 UTC m=+15.434412177" observedRunningTime="2026-04-17 17:08:03.566398896 +0000 UTC m=+21.638986104" watchObservedRunningTime="2026-04-17 17:08:03.56696972 +0000 UTC m=+21.639556913"
Apr 17 17:08:04.431590 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:04.431556 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zxs7"
Apr 17 17:08:04.431590 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:04.431578 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-55bgp"
Apr 17 17:08:04.431867 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:04.431734 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8zxs7" podUID="1dd8584b-d217-441a-a0d1-e1b86328dfe2"
Apr 17 17:08:04.431930 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:04.431887 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-55bgp" podUID="94d54b6e-1e0b-4ee4-b498-c5e45fb87940"
Apr 17 17:08:04.554145 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:04.554108 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl" event={"ID":"ac8a025c-d2a3-45bf-abc9-c3470d14e41a","Type":"ContainerStarted","Data":"fd4dcc88d0fec814c510521240d19900ac5c86e366bbad4a86561056fa25e222"}
Apr 17 17:08:04.573912 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:04.573867 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-gh4nl" podStartSLOduration=3.248966243 podStartE2EDuration="22.573854215s" podCreationTimestamp="2026-04-17 17:07:42 +0000 UTC" firstStartedPulling="2026-04-17 17:07:45.036895323 +0000 UTC m=+3.109482503" lastFinishedPulling="2026-04-17 17:08:04.361783287 +0000 UTC m=+22.434370475" observedRunningTime="2026-04-17 17:08:04.573770565 +0000 UTC m=+22.646357759" watchObservedRunningTime="2026-04-17 17:08:04.573854215 +0000 UTC m=+22.646441408"
Apr 17 17:08:04.963982 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:04.963944 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-hrljp"
Apr 17 17:08:04.964692 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:04.964667 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-hrljp"
Apr 17 17:08:05.432083 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:05.432046 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqsj4"
Apr 17 17:08:05.432272 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:05.432196 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mqsj4" podUID="3f7f5ebc-482a-4d47-b381-626eaf721f89"
Apr 17 17:08:05.558843 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:05.558818 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vbndw_fb032f94-bfb7-47c8-b2bb-9e3c7a412058/ovn-acl-logging/0.log"
Apr 17 17:08:05.559271 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:05.559141 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" event={"ID":"fb032f94-bfb7-47c8-b2bb-9e3c7a412058","Type":"ContainerStarted","Data":"8739f40a92676b5f779aa3634b437c043d6a9db4b78a1d193ad591ba0bd6ba84"}
Apr 17 17:08:05.559484 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:05.559457 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-hrljp"
Apr 17 17:08:05.559846 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:05.559830 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-hrljp"
Apr 17 17:08:06.434109 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:06.434076 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zxs7"
Apr 17 17:08:06.434277 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:06.434191 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8zxs7" podUID="1dd8584b-d217-441a-a0d1-e1b86328dfe2"
Apr 17 17:08:06.434769 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:06.434751 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-55bgp"
Apr 17 17:08:06.434880 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:06.434819 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-55bgp" podUID="94d54b6e-1e0b-4ee4-b498-c5e45fb87940"
Apr 17 17:08:07.431655 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:07.431490 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqsj4"
Apr 17 17:08:07.431655 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:07.431599 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mqsj4" podUID="3f7f5ebc-482a-4d47-b381-626eaf721f89"
Apr 17 17:08:08.431744 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:08.431556 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-55bgp"
Apr 17 17:08:08.432245 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:08.431635 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zxs7"
Apr 17 17:08:08.432245 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:08.431834 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-55bgp" podUID="94d54b6e-1e0b-4ee4-b498-c5e45fb87940"
Apr 17 17:08:08.432245 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:08.431886 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8zxs7" podUID="1dd8584b-d217-441a-a0d1-e1b86328dfe2"
Apr 17 17:08:08.567401 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:08.567369 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vbndw_fb032f94-bfb7-47c8-b2bb-9e3c7a412058/ovn-acl-logging/0.log"
Apr 17 17:08:08.567788 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:08.567751 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" event={"ID":"fb032f94-bfb7-47c8-b2bb-9e3c7a412058","Type":"ContainerStarted","Data":"56dc1e7c0506c0a1b153de3f048406d822b555fefb7b4314b440ce718d15486e"}
Apr 17 17:08:08.568012 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:08.567974 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:08:08.568012 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:08.568009 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:08:08.568247 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:08.568192 2574 scope.go:117] "RemoveContainer" containerID="4291dd41c0ed4eea85918b8a074f12ed06a8f24148261cd001ccb2625dd0a116"
Apr 17 17:08:08.569535 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:08.569513 2574 generic.go:358] "Generic (PLEG): container finished" podID="aed051f5-a966-45e4-867e-3841c1814af1" containerID="d7e2e4f8d9e173826931f4d8bbd86fe681d6c5a0c0b2ac7a4ba731f837b3e3b1" exitCode=0
Apr 17 17:08:08.569655 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:08.569553 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9c486" event={"ID":"aed051f5-a966-45e4-867e-3841c1814af1","Type":"ContainerDied","Data":"d7e2e4f8d9e173826931f4d8bbd86fe681d6c5a0c0b2ac7a4ba731f837b3e3b1"}
Apr 17 17:08:08.584476 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:08.584451 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:08:08.922914 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:08.922875 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3f7f5ebc-482a-4d47-b381-626eaf721f89-original-pull-secret\") pod \"global-pull-secret-syncer-mqsj4\" (UID: \"3f7f5ebc-482a-4d47-b381-626eaf721f89\") " pod="kube-system/global-pull-secret-syncer-mqsj4"
Apr 17 17:08:08.923079 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:08.923038 2574 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 17 17:08:08.923161 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:08.923110 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f7f5ebc-482a-4d47-b381-626eaf721f89-original-pull-secret podName:3f7f5ebc-482a-4d47-b381-626eaf721f89 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:24.923090104 +0000 UTC m=+42.995677284 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/3f7f5ebc-482a-4d47-b381-626eaf721f89-original-pull-secret") pod "global-pull-secret-syncer-mqsj4" (UID: "3f7f5ebc-482a-4d47-b381-626eaf721f89") : object "kube-system"/"original-pull-secret" not registered
Apr 17 17:08:09.431886 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:09.431800 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqsj4"
Apr 17 17:08:09.431886 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:09.431894 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mqsj4" podUID="3f7f5ebc-482a-4d47-b381-626eaf721f89"
Apr 17 17:08:09.575483 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:09.575462 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vbndw_fb032f94-bfb7-47c8-b2bb-9e3c7a412058/ovn-acl-logging/0.log"
Apr 17 17:08:09.575866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:09.575836 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" event={"ID":"fb032f94-bfb7-47c8-b2bb-9e3c7a412058","Type":"ContainerStarted","Data":"9ed1d24ed6fc9559c53092852fc22588782e0323586b77a6bfb82d74d40b1ab9"}
Apr 17 17:08:09.576112 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:09.576082 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:08:09.591749 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:09.591726 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vbndw"
Apr 17 17:08:09.604949 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:09.604909 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" podStartSLOduration=10.559647265 podStartE2EDuration="27.604897467s" podCreationTimestamp="2026-04-17 17:07:42 +0000 UTC" firstStartedPulling="2026-04-17 17:07:45.038998575 +0000 UTC m=+3.111585761" lastFinishedPulling="2026-04-17 17:08:02.084248777 +0000 UTC m=+20.156835963" observedRunningTime="2026-04-17 17:08:09.603253938 +0000 UTC m=+27.675841203" watchObservedRunningTime="2026-04-17 17:08:09.604897467 +0000 UTC m=+27.677484659"
Apr 17 17:08:09.846077 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:09.845897 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-55bgp"]
Apr 17 17:08:09.846217 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:09.846185 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-55bgp"
Apr 17 17:08:09.846308 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:09.846289 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-55bgp" podUID="94d54b6e-1e0b-4ee4-b498-c5e45fb87940"
Apr 17 17:08:09.848802 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:09.848778 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mqsj4"]
Apr 17 17:08:09.848900 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:09.848863 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqsj4"
Apr 17 17:08:09.848961 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:09.848943 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-mqsj4" podUID="3f7f5ebc-482a-4d47-b381-626eaf721f89" Apr 17 17:08:09.849473 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:09.849452 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8zxs7"] Apr 17 17:08:09.849573 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:09.849560 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zxs7" Apr 17 17:08:09.849718 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:09.849689 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8zxs7" podUID="1dd8584b-d217-441a-a0d1-e1b86328dfe2" Apr 17 17:08:10.579752 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:10.579720 2574 generic.go:358] "Generic (PLEG): container finished" podID="aed051f5-a966-45e4-867e-3841c1814af1" containerID="f9e33a29d22699a76fb5a9caba8ac40d2b3fd99c04b206de1cd646b82a32a6a8" exitCode=0 Apr 17 17:08:10.580114 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:10.579808 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9c486" event={"ID":"aed051f5-a966-45e4-867e-3841c1814af1","Type":"ContainerDied","Data":"f9e33a29d22699a76fb5a9caba8ac40d2b3fd99c04b206de1cd646b82a32a6a8"} Apr 17 17:08:11.432230 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:11.432199 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-55bgp" Apr 17 17:08:11.432394 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:11.432259 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zxs7" Apr 17 17:08:11.432394 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:11.432352 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqsj4" Apr 17 17:08:11.432394 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:11.432347 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-55bgp" podUID="94d54b6e-1e0b-4ee4-b498-c5e45fb87940" Apr 17 17:08:11.432544 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:11.432428 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8zxs7" podUID="1dd8584b-d217-441a-a0d1-e1b86328dfe2" Apr 17 17:08:11.432544 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:11.432479 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-mqsj4" podUID="3f7f5ebc-482a-4d47-b381-626eaf721f89" Apr 17 17:08:12.585625 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:12.585562 2574 generic.go:358] "Generic (PLEG): container finished" podID="aed051f5-a966-45e4-867e-3841c1814af1" containerID="ddf3af9df5935ed34282a2e5d1ea99264a39aed089620aed069c79a52ab405e0" exitCode=0 Apr 17 17:08:12.586350 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:12.585649 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9c486" event={"ID":"aed051f5-a966-45e4-867e-3841c1814af1","Type":"ContainerDied","Data":"ddf3af9df5935ed34282a2e5d1ea99264a39aed089620aed069c79a52ab405e0"} Apr 17 17:08:13.431991 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:13.431950 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zxs7" Apr 17 17:08:13.432133 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:13.432001 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqsj4" Apr 17 17:08:13.432133 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:13.432055 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-55bgp" Apr 17 17:08:13.432225 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:13.432180 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8zxs7" podUID="1dd8584b-d217-441a-a0d1-e1b86328dfe2" Apr 17 17:08:13.432286 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:13.432257 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-mqsj4" podUID="3f7f5ebc-482a-4d47-b381-626eaf721f89" Apr 17 17:08:13.432368 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:13.432350 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-55bgp" podUID="94d54b6e-1e0b-4ee4-b498-c5e45fb87940" Apr 17 17:08:14.795213 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:14.795139 2574 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-98.ec2.internal" event="NodeReady" Apr 17 17:08:14.795726 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:14.795272 2574 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 17:08:14.838984 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:14.838949 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-m492r"] Apr 17 17:08:14.868269 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:14.868242 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-n4m6l"] Apr 17 17:08:14.868446 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:14.868425 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-m492r" Apr 17 17:08:14.871119 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:14.871097 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 17:08:14.871240 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:14.871142 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qdqk7\"" Apr 17 17:08:14.871240 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:14.871220 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 17:08:14.900937 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:14.900913 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-m492r"] Apr 17 17:08:14.900937 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:14.900938 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-n4m6l"] Apr 17 17:08:14.901107 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:14.901035 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-n4m6l" Apr 17 17:08:14.903624 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:14.903592 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-j45n5\"" Apr 17 17:08:14.903736 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:14.903642 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 17:08:14.903736 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:14.903592 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 17:08:14.903736 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:14.903651 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 17:08:14.964536 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:14.964495 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mn5kv\" (UniqueName: \"kubernetes.io/projected/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-kube-api-access-mn5kv\") pod \"dns-default-m492r\" (UID: \"5ec73ca8-f486-4ad3-b44e-4e1b5815d3de\") " pod="openshift-dns/dns-default-m492r" Apr 17 17:08:14.964706 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:14.964565 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-metrics-tls\") pod \"dns-default-m492r\" (UID: \"5ec73ca8-f486-4ad3-b44e-4e1b5815d3de\") " pod="openshift-dns/dns-default-m492r" Apr 17 17:08:14.964706 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:14.964627 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-config-volume\") pod \"dns-default-m492r\" (UID: \"5ec73ca8-f486-4ad3-b44e-4e1b5815d3de\") " pod="openshift-dns/dns-default-m492r" Apr 17 17:08:14.964706 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:14.964662 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-tmp-dir\") pod \"dns-default-m492r\" (UID: \"5ec73ca8-f486-4ad3-b44e-4e1b5815d3de\") " pod="openshift-dns/dns-default-m492r" Apr 17 17:08:15.065470 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:15.065398 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mn5kv\" (UniqueName: \"kubernetes.io/projected/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-kube-api-access-mn5kv\") pod \"dns-default-m492r\" (UID: \"5ec73ca8-f486-4ad3-b44e-4e1b5815d3de\") " pod="openshift-dns/dns-default-m492r" Apr 17 17:08:15.065470 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:15.065465 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-metrics-tls\") pod \"dns-default-m492r\" (UID: \"5ec73ca8-f486-4ad3-b44e-4e1b5815d3de\") " pod="openshift-dns/dns-default-m492r" Apr 17 17:08:15.065687 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:15.065517 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-config-volume\") pod \"dns-default-m492r\" (UID: \"5ec73ca8-f486-4ad3-b44e-4e1b5815d3de\") " pod="openshift-dns/dns-default-m492r" Apr 17 17:08:15.065687 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:15.065539 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-tmp-dir\") pod \"dns-default-m492r\" (UID: \"5ec73ca8-f486-4ad3-b44e-4e1b5815d3de\") " pod="openshift-dns/dns-default-m492r" Apr 17 17:08:15.065687 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:15.065567 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m98pn\" (UniqueName: \"kubernetes.io/projected/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-kube-api-access-m98pn\") pod \"ingress-canary-n4m6l\" (UID: \"05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b\") " pod="openshift-ingress-canary/ingress-canary-n4m6l" Apr 17 17:08:15.065687 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:15.065595 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-cert\") pod \"ingress-canary-n4m6l\" (UID: \"05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b\") " pod="openshift-ingress-canary/ingress-canary-n4m6l" Apr 17 17:08:15.065687 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:15.065656 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:08:15.065858 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:15.065742 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-metrics-tls podName:5ec73ca8-f486-4ad3-b44e-4e1b5815d3de nodeName:}" failed. No retries permitted until 2026-04-17 17:08:15.565721642 +0000 UTC m=+33.638308814 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-metrics-tls") pod "dns-default-m492r" (UID: "5ec73ca8-f486-4ad3-b44e-4e1b5815d3de") : secret "dns-default-metrics-tls" not found Apr 17 17:08:15.065916 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:15.065898 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-tmp-dir\") pod \"dns-default-m492r\" (UID: \"5ec73ca8-f486-4ad3-b44e-4e1b5815d3de\") " pod="openshift-dns/dns-default-m492r" Apr 17 17:08:15.066034 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:15.066017 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-config-volume\") pod \"dns-default-m492r\" (UID: \"5ec73ca8-f486-4ad3-b44e-4e1b5815d3de\") " pod="openshift-dns/dns-default-m492r" Apr 17 17:08:15.075766 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:15.075743 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mn5kv\" (UniqueName: \"kubernetes.io/projected/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-kube-api-access-mn5kv\") pod \"dns-default-m492r\" (UID: \"5ec73ca8-f486-4ad3-b44e-4e1b5815d3de\") " pod="openshift-dns/dns-default-m492r" Apr 17 17:08:15.166890 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:15.166859 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m98pn\" (UniqueName: \"kubernetes.io/projected/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-kube-api-access-m98pn\") pod \"ingress-canary-n4m6l\" (UID: \"05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b\") " pod="openshift-ingress-canary/ingress-canary-n4m6l" Apr 17 17:08:15.167030 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:15.166905 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-cert\") pod \"ingress-canary-n4m6l\" (UID: \"05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b\") " pod="openshift-ingress-canary/ingress-canary-n4m6l" Apr 17 17:08:15.167030 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:15.167023 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:08:15.167141 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:15.167111 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-cert podName:05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b nodeName:}" failed. No retries permitted until 2026-04-17 17:08:15.667082161 +0000 UTC m=+33.739669337 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-cert") pod "ingress-canary-n4m6l" (UID: "05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b") : secret "canary-serving-cert" not found Apr 17 17:08:15.179440 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:15.179413 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m98pn\" (UniqueName: \"kubernetes.io/projected/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-kube-api-access-m98pn\") pod \"ingress-canary-n4m6l\" (UID: \"05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b\") " pod="openshift-ingress-canary/ingress-canary-n4m6l" Apr 17 17:08:15.431990 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:15.431950 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-55bgp" Apr 17 17:08:15.432305 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:15.432061 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zxs7" Apr 17 17:08:15.432305 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:15.432078 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqsj4" Apr 17 17:08:15.434754 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:15.434728 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 17:08:15.434754 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:15.434744 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 17:08:15.434923 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:15.434739 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 17:08:15.435803 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:15.435785 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-s65fp\"" Apr 17 17:08:15.435914 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:15.435849 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mrqrv\"" Apr 17 17:08:15.435914 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:15.435872 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 17:08:15.570881 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:15.570843 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-metrics-tls\") pod \"dns-default-m492r\" (UID: \"5ec73ca8-f486-4ad3-b44e-4e1b5815d3de\") " pod="openshift-dns/dns-default-m492r" Apr 17 17:08:15.571061 ip-10-0-132-98 
kubenswrapper[2574]: E0417 17:08:15.570995 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:08:15.571152 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:15.571140 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-metrics-tls podName:5ec73ca8-f486-4ad3-b44e-4e1b5815d3de nodeName:}" failed. No retries permitted until 2026-04-17 17:08:16.571115089 +0000 UTC m=+34.643702281 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-metrics-tls") pod "dns-default-m492r" (UID: "5ec73ca8-f486-4ad3-b44e-4e1b5815d3de") : secret "dns-default-metrics-tls" not found Apr 17 17:08:15.672221 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:15.672191 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-cert\") pod \"ingress-canary-n4m6l\" (UID: \"05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b\") " pod="openshift-ingress-canary/ingress-canary-n4m6l" Apr 17 17:08:15.672394 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:15.672336 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:08:15.672456 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:15.672397 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-cert podName:05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b nodeName:}" failed. No retries permitted until 2026-04-17 17:08:16.672378983 +0000 UTC m=+34.744966156 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-cert") pod "ingress-canary-n4m6l" (UID: "05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b") : secret "canary-serving-cert" not found Apr 17 17:08:16.176746 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:16.176711 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1dd8584b-d217-441a-a0d1-e1b86328dfe2-metrics-certs\") pod \"network-metrics-daemon-8zxs7\" (UID: \"1dd8584b-d217-441a-a0d1-e1b86328dfe2\") " pod="openshift-multus/network-metrics-daemon-8zxs7" Apr 17 17:08:16.177487 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:16.176815 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tkm7c\" (UniqueName: \"kubernetes.io/projected/94d54b6e-1e0b-4ee4-b498-c5e45fb87940-kube-api-access-tkm7c\") pod \"network-check-target-55bgp\" (UID: \"94d54b6e-1e0b-4ee4-b498-c5e45fb87940\") " pod="openshift-network-diagnostics/network-check-target-55bgp" Apr 17 17:08:16.177487 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:16.176926 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 17:08:16.177487 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:16.176998 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1dd8584b-d217-441a-a0d1-e1b86328dfe2-metrics-certs podName:1dd8584b-d217-441a-a0d1-e1b86328dfe2 nodeName:}" failed. No retries permitted until 2026-04-17 17:08:48.176978386 +0000 UTC m=+66.249565562 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1dd8584b-d217-441a-a0d1-e1b86328dfe2-metrics-certs") pod "network-metrics-daemon-8zxs7" (UID: "1dd8584b-d217-441a-a0d1-e1b86328dfe2") : secret "metrics-daemon-secret" not found Apr 17 17:08:16.179777 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:16.179717 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkm7c\" (UniqueName: \"kubernetes.io/projected/94d54b6e-1e0b-4ee4-b498-c5e45fb87940-kube-api-access-tkm7c\") pod \"network-check-target-55bgp\" (UID: \"94d54b6e-1e0b-4ee4-b498-c5e45fb87940\") " pod="openshift-network-diagnostics/network-check-target-55bgp" Apr 17 17:08:16.343619 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:16.343583 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-55bgp" Apr 17 17:08:16.503021 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:16.502769 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-55bgp"] Apr 17 17:08:16.580245 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:16.580214 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-metrics-tls\") pod \"dns-default-m492r\" (UID: \"5ec73ca8-f486-4ad3-b44e-4e1b5815d3de\") " pod="openshift-dns/dns-default-m492r" Apr 17 17:08:16.580446 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:16.580362 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:08:16.580446 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:16.580437 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-metrics-tls podName:5ec73ca8-f486-4ad3-b44e-4e1b5815d3de nodeName:}" failed. 
No retries permitted until 2026-04-17 17:08:18.58041913 +0000 UTC m=+36.653006319 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-metrics-tls") pod "dns-default-m492r" (UID: "5ec73ca8-f486-4ad3-b44e-4e1b5815d3de") : secret "dns-default-metrics-tls" not found Apr 17 17:08:16.680973 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:16.680934 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-cert\") pod \"ingress-canary-n4m6l\" (UID: \"05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b\") " pod="openshift-ingress-canary/ingress-canary-n4m6l" Apr 17 17:08:16.681195 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:16.681108 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:08:16.681195 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:16.681190 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-cert podName:05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b nodeName:}" failed. No retries permitted until 2026-04-17 17:08:18.681169343 +0000 UTC m=+36.753756517 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-cert") pod "ingress-canary-n4m6l" (UID: "05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b") : secret "canary-serving-cert" not found Apr 17 17:08:18.535336 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:08:18.535295 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod94d54b6e_1e0b_4ee4_b498_c5e45fb87940.slice/crio-9b7e82c2826d361935e233b3927c6162c7ba3dd8e57d5b5d5da2f76b71a6fd13 WatchSource:0}: Error finding container 9b7e82c2826d361935e233b3927c6162c7ba3dd8e57d5b5d5da2f76b71a6fd13: Status 404 returned error can't find the container with id 9b7e82c2826d361935e233b3927c6162c7ba3dd8e57d5b5d5da2f76b71a6fd13 Apr 17 17:08:18.595320 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:18.595268 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-metrics-tls\") pod \"dns-default-m492r\" (UID: \"5ec73ca8-f486-4ad3-b44e-4e1b5815d3de\") " pod="openshift-dns/dns-default-m492r" Apr 17 17:08:18.595489 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:18.595433 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:08:18.595545 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:18.595497 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-metrics-tls podName:5ec73ca8-f486-4ad3-b44e-4e1b5815d3de nodeName:}" failed. No retries permitted until 2026-04-17 17:08:22.59547654 +0000 UTC m=+40.668063713 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-metrics-tls") pod "dns-default-m492r" (UID: "5ec73ca8-f486-4ad3-b44e-4e1b5815d3de") : secret "dns-default-metrics-tls" not found Apr 17 17:08:18.597561 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:18.597522 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-55bgp" event={"ID":"94d54b6e-1e0b-4ee4-b498-c5e45fb87940","Type":"ContainerStarted","Data":"9b7e82c2826d361935e233b3927c6162c7ba3dd8e57d5b5d5da2f76b71a6fd13"} Apr 17 17:08:18.695972 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:18.695944 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-cert\") pod \"ingress-canary-n4m6l\" (UID: \"05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b\") " pod="openshift-ingress-canary/ingress-canary-n4m6l" Apr 17 17:08:18.696098 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:18.696081 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:08:18.696142 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:18.696138 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-cert podName:05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b nodeName:}" failed. No retries permitted until 2026-04-17 17:08:22.696125192 +0000 UTC m=+40.768712364 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-cert") pod "ingress-canary-n4m6l" (UID: "05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b") : secret "canary-serving-cert" not found Apr 17 17:08:19.601516 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:19.601333 2574 generic.go:358] "Generic (PLEG): container finished" podID="aed051f5-a966-45e4-867e-3841c1814af1" containerID="1ae11f6e69b8aec4cc1e71e563b3f9c095a1424c360d2db8f532ccf949cf8b6e" exitCode=0 Apr 17 17:08:19.601876 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:19.601411 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9c486" event={"ID":"aed051f5-a966-45e4-867e-3841c1814af1","Type":"ContainerDied","Data":"1ae11f6e69b8aec4cc1e71e563b3f9c095a1424c360d2db8f532ccf949cf8b6e"} Apr 17 17:08:20.606038 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:20.606000 2574 generic.go:358] "Generic (PLEG): container finished" podID="aed051f5-a966-45e4-867e-3841c1814af1" containerID="f7e39379828088cf1f657808433ca4b818054cda53e28b76090ad0c432fae87a" exitCode=0 Apr 17 17:08:20.606482 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:20.606060 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9c486" event={"ID":"aed051f5-a966-45e4-867e-3841c1814af1","Type":"ContainerDied","Data":"f7e39379828088cf1f657808433ca4b818054cda53e28b76090ad0c432fae87a"} Apr 17 17:08:21.613408 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:21.613374 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-9c486" event={"ID":"aed051f5-a966-45e4-867e-3841c1814af1","Type":"ContainerStarted","Data":"2689f6df84f88ecc90bae59911a6dd1bccc6959c3d7d6df1c4f03440c2b83ea3"} Apr 17 17:08:21.636827 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:21.636776 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/multus-additional-cni-plugins-9c486" podStartSLOduration=6.082224412 podStartE2EDuration="39.636757556s" podCreationTimestamp="2026-04-17 17:07:42 +0000 UTC" firstStartedPulling="2026-04-17 17:07:45.045948134 +0000 UTC m=+3.118535307" lastFinishedPulling="2026-04-17 17:08:18.600481268 +0000 UTC m=+36.673068451" observedRunningTime="2026-04-17 17:08:21.635479529 +0000 UTC m=+39.708066737" watchObservedRunningTime="2026-04-17 17:08:21.636757556 +0000 UTC m=+39.709344750" Apr 17 17:08:22.616476 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:22.616444 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-55bgp" event={"ID":"94d54b6e-1e0b-4ee4-b498-c5e45fb87940","Type":"ContainerStarted","Data":"69667bca469c52e147fad51558147e0e8a13ca74720dfc86bd42bc62bd44d122"} Apr 17 17:08:22.616886 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:22.616834 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-55bgp" Apr 17 17:08:22.624937 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:22.624918 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-metrics-tls\") pod \"dns-default-m492r\" (UID: \"5ec73ca8-f486-4ad3-b44e-4e1b5815d3de\") " pod="openshift-dns/dns-default-m492r" Apr 17 17:08:22.625048 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:22.625035 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:08:22.625093 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:22.625083 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-metrics-tls podName:5ec73ca8-f486-4ad3-b44e-4e1b5815d3de nodeName:}" failed. 
No retries permitted until 2026-04-17 17:08:30.625069987 +0000 UTC m=+48.697657160 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-metrics-tls") pod "dns-default-m492r" (UID: "5ec73ca8-f486-4ad3-b44e-4e1b5815d3de") : secret "dns-default-metrics-tls" not found Apr 17 17:08:22.631624 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:22.631563 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-55bgp" podStartSLOduration=37.453472802 podStartE2EDuration="40.631552859s" podCreationTimestamp="2026-04-17 17:07:42 +0000 UTC" firstStartedPulling="2026-04-17 17:08:18.537792847 +0000 UTC m=+36.610380019" lastFinishedPulling="2026-04-17 17:08:21.715872888 +0000 UTC m=+39.788460076" observedRunningTime="2026-04-17 17:08:22.631321347 +0000 UTC m=+40.703908537" watchObservedRunningTime="2026-04-17 17:08:22.631552859 +0000 UTC m=+40.704140046" Apr 17 17:08:22.726060 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:22.726028 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-cert\") pod \"ingress-canary-n4m6l\" (UID: \"05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b\") " pod="openshift-ingress-canary/ingress-canary-n4m6l" Apr 17 17:08:22.726203 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:22.726159 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:08:22.726257 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:22.726218 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-cert podName:05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b nodeName:}" failed. No retries permitted until 2026-04-17 17:08:30.726203141 +0000 UTC m=+48.798790325 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-cert") pod "ingress-canary-n4m6l" (UID: "05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b") : secret "canary-serving-cert" not found Apr 17 17:08:24.938470 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:24.938416 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3f7f5ebc-482a-4d47-b381-626eaf721f89-original-pull-secret\") pod \"global-pull-secret-syncer-mqsj4\" (UID: \"3f7f5ebc-482a-4d47-b381-626eaf721f89\") " pod="kube-system/global-pull-secret-syncer-mqsj4" Apr 17 17:08:24.942513 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:24.942486 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3f7f5ebc-482a-4d47-b381-626eaf721f89-original-pull-secret\") pod \"global-pull-secret-syncer-mqsj4\" (UID: \"3f7f5ebc-482a-4d47-b381-626eaf721f89\") " pod="kube-system/global-pull-secret-syncer-mqsj4" Apr 17 17:08:25.057015 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:25.056962 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-mqsj4" Apr 17 17:08:25.177453 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:25.177424 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-mqsj4"] Apr 17 17:08:25.180782 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:08:25.180755 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f7f5ebc_482a_4d47_b381_626eaf721f89.slice/crio-a8b9bf814b884f4c5c3d4802c63a34feb06d896c05f030da8369fccf3e1e5cb5 WatchSource:0}: Error finding container a8b9bf814b884f4c5c3d4802c63a34feb06d896c05f030da8369fccf3e1e5cb5: Status 404 returned error can't find the container with id a8b9bf814b884f4c5c3d4802c63a34feb06d896c05f030da8369fccf3e1e5cb5 Apr 17 17:08:25.622181 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:25.622148 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mqsj4" event={"ID":"3f7f5ebc-482a-4d47-b381-626eaf721f89","Type":"ContainerStarted","Data":"a8b9bf814b884f4c5c3d4802c63a34feb06d896c05f030da8369fccf3e1e5cb5"} Apr 17 17:08:29.631570 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:29.631515 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-mqsj4" event={"ID":"3f7f5ebc-482a-4d47-b381-626eaf721f89","Type":"ContainerStarted","Data":"8c6e805a44e90ba81e9c5ac1ab9586cdf7dc7cbc1c66c7c3cc745874e62416dc"} Apr 17 17:08:29.646019 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:29.645968 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-mqsj4" podStartSLOduration=33.05974031 podStartE2EDuration="36.64595527s" podCreationTimestamp="2026-04-17 17:07:53 +0000 UTC" firstStartedPulling="2026-04-17 17:08:25.182421048 +0000 UTC m=+43.255008220" lastFinishedPulling="2026-04-17 17:08:28.768635998 +0000 UTC m=+46.841223180" 
observedRunningTime="2026-04-17 17:08:29.645084236 +0000 UTC m=+47.717671430" watchObservedRunningTime="2026-04-17 17:08:29.64595527 +0000 UTC m=+47.718542464" Apr 17 17:08:30.678747 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:30.678708 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-metrics-tls\") pod \"dns-default-m492r\" (UID: \"5ec73ca8-f486-4ad3-b44e-4e1b5815d3de\") " pod="openshift-dns/dns-default-m492r" Apr 17 17:08:30.679081 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:30.678847 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:08:30.679081 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:30.678905 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-metrics-tls podName:5ec73ca8-f486-4ad3-b44e-4e1b5815d3de nodeName:}" failed. No retries permitted until 2026-04-17 17:08:46.67889025 +0000 UTC m=+64.751477422 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-metrics-tls") pod "dns-default-m492r" (UID: "5ec73ca8-f486-4ad3-b44e-4e1b5815d3de") : secret "dns-default-metrics-tls" not found Apr 17 17:08:30.779941 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:30.779912 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-cert\") pod \"ingress-canary-n4m6l\" (UID: \"05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b\") " pod="openshift-ingress-canary/ingress-canary-n4m6l" Apr 17 17:08:30.780082 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:30.780046 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:08:30.780119 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:30.780105 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-cert podName:05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b nodeName:}" failed. No retries permitted until 2026-04-17 17:08:46.780088902 +0000 UTC m=+64.852676075 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-cert") pod "ingress-canary-n4m6l" (UID: "05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b") : secret "canary-serving-cert" not found Apr 17 17:08:41.596830 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:41.596803 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vbndw" Apr 17 17:08:46.680472 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:46.680441 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-metrics-tls\") pod \"dns-default-m492r\" (UID: \"5ec73ca8-f486-4ad3-b44e-4e1b5815d3de\") " pod="openshift-dns/dns-default-m492r" Apr 17 17:08:46.680866 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:46.680545 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:08:46.680866 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:46.680594 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-metrics-tls podName:5ec73ca8-f486-4ad3-b44e-4e1b5815d3de nodeName:}" failed. No retries permitted until 2026-04-17 17:09:18.680581473 +0000 UTC m=+96.753168645 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-metrics-tls") pod "dns-default-m492r" (UID: "5ec73ca8-f486-4ad3-b44e-4e1b5815d3de") : secret "dns-default-metrics-tls" not found Apr 17 17:08:46.781722 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:46.781694 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-cert\") pod \"ingress-canary-n4m6l\" (UID: \"05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b\") " pod="openshift-ingress-canary/ingress-canary-n4m6l" Apr 17 17:08:46.781847 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:46.781828 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:08:46.781900 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:46.781891 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-cert podName:05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b nodeName:}" failed. No retries permitted until 2026-04-17 17:09:18.781877006 +0000 UTC m=+96.854464178 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-cert") pod "ingress-canary-n4m6l" (UID: "05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b") : secret "canary-serving-cert" not found Apr 17 17:08:48.190325 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:48.190276 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1dd8584b-d217-441a-a0d1-e1b86328dfe2-metrics-certs\") pod \"network-metrics-daemon-8zxs7\" (UID: \"1dd8584b-d217-441a-a0d1-e1b86328dfe2\") " pod="openshift-multus/network-metrics-daemon-8zxs7" Apr 17 17:08:48.190722 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:48.190414 2574 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 17 17:08:48.190722 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:08:48.190476 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1dd8584b-d217-441a-a0d1-e1b86328dfe2-metrics-certs podName:1dd8584b-d217-441a-a0d1-e1b86328dfe2 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:52.190461813 +0000 UTC m=+130.263048986 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1dd8584b-d217-441a-a0d1-e1b86328dfe2-metrics-certs") pod "network-metrics-daemon-8zxs7" (UID: "1dd8584b-d217-441a-a0d1-e1b86328dfe2") : secret "metrics-daemon-secret" not found Apr 17 17:08:54.622740 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:08:54.622637 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-55bgp" Apr 17 17:09:18.701453 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:18.701413 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-metrics-tls\") pod \"dns-default-m492r\" (UID: \"5ec73ca8-f486-4ad3-b44e-4e1b5815d3de\") " pod="openshift-dns/dns-default-m492r" Apr 17 17:09:18.701976 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:09:18.701557 2574 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:09:18.701976 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:09:18.701641 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-metrics-tls podName:5ec73ca8-f486-4ad3-b44e-4e1b5815d3de nodeName:}" failed. No retries permitted until 2026-04-17 17:10:22.701624917 +0000 UTC m=+160.774212107 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-metrics-tls") pod "dns-default-m492r" (UID: "5ec73ca8-f486-4ad3-b44e-4e1b5815d3de") : secret "dns-default-metrics-tls" not found Apr 17 17:09:18.802535 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:18.802500 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-cert\") pod \"ingress-canary-n4m6l\" (UID: \"05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b\") " pod="openshift-ingress-canary/ingress-canary-n4m6l" Apr 17 17:09:18.802707 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:09:18.802621 2574 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:09:18.802707 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:09:18.802680 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-cert podName:05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b nodeName:}" failed. No retries permitted until 2026-04-17 17:10:22.80266687 +0000 UTC m=+160.875254042 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-cert") pod "ingress-canary-n4m6l" (UID: "05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b") : secret "canary-serving-cert" not found Apr 17 17:09:25.361987 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.361949 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-8hcsn"] Apr 17 17:09:25.364831 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.364811 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-8hcsn" Apr 17 17:09:25.367822 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.367804 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 17:09:25.367940 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.367802 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 17 17:09:25.367940 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.367856 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-5xpxt\"" Apr 17 17:09:25.367940 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.367875 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 17 17:09:25.369000 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.368983 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 17:09:25.373335 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.373318 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 17 17:09:25.375845 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.375827 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-8hcsn"] Apr 17 17:09:25.446081 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.446047 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/bfd868a7-0269-499a-9b8c-4c2c8d2aba93-snapshots\") pod \"insights-operator-585dfdc468-8hcsn\" (UID: \"bfd868a7-0269-499a-9b8c-4c2c8d2aba93\") " pod="openshift-insights/insights-operator-585dfdc468-8hcsn" Apr 17 
17:09:25.446220 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.446103 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfd868a7-0269-499a-9b8c-4c2c8d2aba93-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-8hcsn\" (UID: \"bfd868a7-0269-499a-9b8c-4c2c8d2aba93\") " pod="openshift-insights/insights-operator-585dfdc468-8hcsn" Apr 17 17:09:25.446220 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.446146 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfd868a7-0269-499a-9b8c-4c2c8d2aba93-service-ca-bundle\") pod \"insights-operator-585dfdc468-8hcsn\" (UID: \"bfd868a7-0269-499a-9b8c-4c2c8d2aba93\") " pod="openshift-insights/insights-operator-585dfdc468-8hcsn" Apr 17 17:09:25.446220 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.446203 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfd868a7-0269-499a-9b8c-4c2c8d2aba93-serving-cert\") pod \"insights-operator-585dfdc468-8hcsn\" (UID: \"bfd868a7-0269-499a-9b8c-4c2c8d2aba93\") " pod="openshift-insights/insights-operator-585dfdc468-8hcsn" Apr 17 17:09:25.446317 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.446226 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bfd868a7-0269-499a-9b8c-4c2c8d2aba93-tmp\") pod \"insights-operator-585dfdc468-8hcsn\" (UID: \"bfd868a7-0269-499a-9b8c-4c2c8d2aba93\") " pod="openshift-insights/insights-operator-585dfdc468-8hcsn" Apr 17 17:09:25.446317 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.446242 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9src7\" (UniqueName: 
\"kubernetes.io/projected/bfd868a7-0269-499a-9b8c-4c2c8d2aba93-kube-api-access-9src7\") pod \"insights-operator-585dfdc468-8hcsn\" (UID: \"bfd868a7-0269-499a-9b8c-4c2c8d2aba93\") " pod="openshift-insights/insights-operator-585dfdc468-8hcsn" Apr 17 17:09:25.547199 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.547168 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfd868a7-0269-499a-9b8c-4c2c8d2aba93-service-ca-bundle\") pod \"insights-operator-585dfdc468-8hcsn\" (UID: \"bfd868a7-0269-499a-9b8c-4c2c8d2aba93\") " pod="openshift-insights/insights-operator-585dfdc468-8hcsn" Apr 17 17:09:25.547340 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.547241 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfd868a7-0269-499a-9b8c-4c2c8d2aba93-serving-cert\") pod \"insights-operator-585dfdc468-8hcsn\" (UID: \"bfd868a7-0269-499a-9b8c-4c2c8d2aba93\") " pod="openshift-insights/insights-operator-585dfdc468-8hcsn" Apr 17 17:09:25.547340 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.547280 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bfd868a7-0269-499a-9b8c-4c2c8d2aba93-tmp\") pod \"insights-operator-585dfdc468-8hcsn\" (UID: \"bfd868a7-0269-499a-9b8c-4c2c8d2aba93\") " pod="openshift-insights/insights-operator-585dfdc468-8hcsn" Apr 17 17:09:25.547340 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.547306 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9src7\" (UniqueName: \"kubernetes.io/projected/bfd868a7-0269-499a-9b8c-4c2c8d2aba93-kube-api-access-9src7\") pod \"insights-operator-585dfdc468-8hcsn\" (UID: \"bfd868a7-0269-499a-9b8c-4c2c8d2aba93\") " pod="openshift-insights/insights-operator-585dfdc468-8hcsn" Apr 17 17:09:25.547471 ip-10-0-132-98 
kubenswrapper[2574]: I0417 17:09:25.547349 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/bfd868a7-0269-499a-9b8c-4c2c8d2aba93-snapshots\") pod \"insights-operator-585dfdc468-8hcsn\" (UID: \"bfd868a7-0269-499a-9b8c-4c2c8d2aba93\") " pod="openshift-insights/insights-operator-585dfdc468-8hcsn" Apr 17 17:09:25.547471 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.547385 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfd868a7-0269-499a-9b8c-4c2c8d2aba93-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-8hcsn\" (UID: \"bfd868a7-0269-499a-9b8c-4c2c8d2aba93\") " pod="openshift-insights/insights-operator-585dfdc468-8hcsn" Apr 17 17:09:25.547757 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.547723 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bfd868a7-0269-499a-9b8c-4c2c8d2aba93-tmp\") pod \"insights-operator-585dfdc468-8hcsn\" (UID: \"bfd868a7-0269-499a-9b8c-4c2c8d2aba93\") " pod="openshift-insights/insights-operator-585dfdc468-8hcsn" Apr 17 17:09:25.547916 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.547892 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfd868a7-0269-499a-9b8c-4c2c8d2aba93-service-ca-bundle\") pod \"insights-operator-585dfdc468-8hcsn\" (UID: \"bfd868a7-0269-499a-9b8c-4c2c8d2aba93\") " pod="openshift-insights/insights-operator-585dfdc468-8hcsn" Apr 17 17:09:25.548488 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.548469 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/bfd868a7-0269-499a-9b8c-4c2c8d2aba93-snapshots\") pod \"insights-operator-585dfdc468-8hcsn\" (UID: \"bfd868a7-0269-499a-9b8c-4c2c8d2aba93\") " 
pod="openshift-insights/insights-operator-585dfdc468-8hcsn" Apr 17 17:09:25.548633 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.548595 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bfd868a7-0269-499a-9b8c-4c2c8d2aba93-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-8hcsn\" (UID: \"bfd868a7-0269-499a-9b8c-4c2c8d2aba93\") " pod="openshift-insights/insights-operator-585dfdc468-8hcsn" Apr 17 17:09:25.549800 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.549779 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bfd868a7-0269-499a-9b8c-4c2c8d2aba93-serving-cert\") pod \"insights-operator-585dfdc468-8hcsn\" (UID: \"bfd868a7-0269-499a-9b8c-4c2c8d2aba93\") " pod="openshift-insights/insights-operator-585dfdc468-8hcsn" Apr 17 17:09:25.555451 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.555431 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9src7\" (UniqueName: \"kubernetes.io/projected/bfd868a7-0269-499a-9b8c-4c2c8d2aba93-kube-api-access-9src7\") pod \"insights-operator-585dfdc468-8hcsn\" (UID: \"bfd868a7-0269-499a-9b8c-4c2c8d2aba93\") " pod="openshift-insights/insights-operator-585dfdc468-8hcsn" Apr 17 17:09:25.673707 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.673682 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-8hcsn" Apr 17 17:09:25.782245 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:25.782216 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-8hcsn"] Apr 17 17:09:25.786323 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:09:25.786290 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfd868a7_0269_499a_9b8c_4c2c8d2aba93.slice/crio-39915c8a1ab27442089a96f8bed384d6cb571ab50a777244ec45a337f7051507 WatchSource:0}: Error finding container 39915c8a1ab27442089a96f8bed384d6cb571ab50a777244ec45a337f7051507: Status 404 returned error can't find the container with id 39915c8a1ab27442089a96f8bed384d6cb571ab50a777244ec45a337f7051507 Apr 17 17:09:26.736000 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:26.735957 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8hcsn" event={"ID":"bfd868a7-0269-499a-9b8c-4c2c8d2aba93","Type":"ContainerStarted","Data":"39915c8a1ab27442089a96f8bed384d6cb571ab50a777244ec45a337f7051507"} Apr 17 17:09:28.740862 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:28.740824 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8hcsn" event={"ID":"bfd868a7-0269-499a-9b8c-4c2c8d2aba93","Type":"ContainerStarted","Data":"6312cd6dcd9e22218113640e1e4db9bea8fa3d33e3a7d08e35d1289e20db88ef"} Apr 17 17:09:28.756840 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:28.756789 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-8hcsn" podStartSLOduration=1.837535103 podStartE2EDuration="3.756774264s" podCreationTimestamp="2026-04-17 17:09:25 +0000 UTC" firstStartedPulling="2026-04-17 17:09:25.788554805 +0000 UTC m=+103.861141980" lastFinishedPulling="2026-04-17 17:09:27.707793955 
+0000 UTC m=+105.780381141" observedRunningTime="2026-04-17 17:09:28.756286129 +0000 UTC m=+106.828873325" watchObservedRunningTime="2026-04-17 17:09:28.756774264 +0000 UTC m=+106.829361458" Apr 17 17:09:29.225779 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:29.225742 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-kvtg4"] Apr 17 17:09:29.228646 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:29.228631 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-kvtg4" Apr 17 17:09:29.231301 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:29.231281 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 17 17:09:29.231301 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:29.231296 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-4vhgf\"" Apr 17 17:09:29.231475 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:29.231316 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 17 17:09:29.232514 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:29.232495 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 17 17:09:29.232514 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:29.232505 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 17 17:09:29.236673 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:29.236655 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 17 17:09:29.239870 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:29.239847 
2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-kvtg4"] Apr 17 17:09:29.275384 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:29.275360 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81fbead9-b55d-41ce-9182-c35ba0bc0eb2-trusted-ca\") pod \"console-operator-9d4b6777b-kvtg4\" (UID: \"81fbead9-b55d-41ce-9182-c35ba0bc0eb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-kvtg4" Apr 17 17:09:29.275489 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:29.275391 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81fbead9-b55d-41ce-9182-c35ba0bc0eb2-serving-cert\") pod \"console-operator-9d4b6777b-kvtg4\" (UID: \"81fbead9-b55d-41ce-9182-c35ba0bc0eb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-kvtg4" Apr 17 17:09:29.275489 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:29.275422 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kk9sq\" (UniqueName: \"kubernetes.io/projected/81fbead9-b55d-41ce-9182-c35ba0bc0eb2-kube-api-access-kk9sq\") pod \"console-operator-9d4b6777b-kvtg4\" (UID: \"81fbead9-b55d-41ce-9182-c35ba0bc0eb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-kvtg4" Apr 17 17:09:29.275489 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:29.275484 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81fbead9-b55d-41ce-9182-c35ba0bc0eb2-config\") pod \"console-operator-9d4b6777b-kvtg4\" (UID: \"81fbead9-b55d-41ce-9182-c35ba0bc0eb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-kvtg4" Apr 17 17:09:29.376563 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:29.376531 2574 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81fbead9-b55d-41ce-9182-c35ba0bc0eb2-trusted-ca\") pod \"console-operator-9d4b6777b-kvtg4\" (UID: \"81fbead9-b55d-41ce-9182-c35ba0bc0eb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-kvtg4" Apr 17 17:09:29.376719 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:29.376566 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81fbead9-b55d-41ce-9182-c35ba0bc0eb2-serving-cert\") pod \"console-operator-9d4b6777b-kvtg4\" (UID: \"81fbead9-b55d-41ce-9182-c35ba0bc0eb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-kvtg4" Apr 17 17:09:29.376719 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:29.376595 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kk9sq\" (UniqueName: \"kubernetes.io/projected/81fbead9-b55d-41ce-9182-c35ba0bc0eb2-kube-api-access-kk9sq\") pod \"console-operator-9d4b6777b-kvtg4\" (UID: \"81fbead9-b55d-41ce-9182-c35ba0bc0eb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-kvtg4" Apr 17 17:09:29.376719 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:29.376631 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81fbead9-b55d-41ce-9182-c35ba0bc0eb2-config\") pod \"console-operator-9d4b6777b-kvtg4\" (UID: \"81fbead9-b55d-41ce-9182-c35ba0bc0eb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-kvtg4" Apr 17 17:09:29.377305 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:29.377287 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81fbead9-b55d-41ce-9182-c35ba0bc0eb2-config\") pod \"console-operator-9d4b6777b-kvtg4\" (UID: \"81fbead9-b55d-41ce-9182-c35ba0bc0eb2\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-kvtg4" Apr 17 17:09:29.377418 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:29.377400 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/81fbead9-b55d-41ce-9182-c35ba0bc0eb2-trusted-ca\") pod \"console-operator-9d4b6777b-kvtg4\" (UID: \"81fbead9-b55d-41ce-9182-c35ba0bc0eb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-kvtg4" Apr 17 17:09:29.378964 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:29.378948 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81fbead9-b55d-41ce-9182-c35ba0bc0eb2-serving-cert\") pod \"console-operator-9d4b6777b-kvtg4\" (UID: \"81fbead9-b55d-41ce-9182-c35ba0bc0eb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-kvtg4" Apr 17 17:09:29.385959 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:29.385932 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kk9sq\" (UniqueName: \"kubernetes.io/projected/81fbead9-b55d-41ce-9182-c35ba0bc0eb2-kube-api-access-kk9sq\") pod \"console-operator-9d4b6777b-kvtg4\" (UID: \"81fbead9-b55d-41ce-9182-c35ba0bc0eb2\") " pod="openshift-console-operator/console-operator-9d4b6777b-kvtg4" Apr 17 17:09:29.537448 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:29.537376 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-kvtg4" Apr 17 17:09:29.659830 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:29.659795 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-kvtg4"] Apr 17 17:09:29.663480 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:09:29.663443 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81fbead9_b55d_41ce_9182_c35ba0bc0eb2.slice/crio-2ed2efa95421fa6dca44fef1bfe0f98c73af45e496033de8648d103db496f428 WatchSource:0}: Error finding container 2ed2efa95421fa6dca44fef1bfe0f98c73af45e496033de8648d103db496f428: Status 404 returned error can't find the container with id 2ed2efa95421fa6dca44fef1bfe0f98c73af45e496033de8648d103db496f428 Apr 17 17:09:29.743786 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:29.743754 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-kvtg4" event={"ID":"81fbead9-b55d-41ce-9182-c35ba0bc0eb2","Type":"ContainerStarted","Data":"2ed2efa95421fa6dca44fef1bfe0f98c73af45e496033de8648d103db496f428"} Apr 17 17:09:31.125093 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.125057 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5cd9d6769f-6448l"] Apr 17 17:09:31.129383 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.129360 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:31.132968 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.132839 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 17:09:31.132968 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.132858 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-b4xpj\"" Apr 17 17:09:31.132968 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.132936 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 17:09:31.133219 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.133173 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 17:09:31.138363 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.138330 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 17:09:31.141646 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.141623 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5cd9d6769f-6448l"] Apr 17 17:09:31.192423 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.192383 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-certificates\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:31.192583 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.192435 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-image-registry-private-configuration\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:31.192583 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.192454 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-ca-trust-extracted\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:31.192583 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.192491 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-trusted-ca\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:31.192583 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.192562 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-bound-sa-token\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:31.192802 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.192625 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzc84\" (UniqueName: \"kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-kube-api-access-dzc84\") pod 
\"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:31.192802 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.192654 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-tls\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:31.192802 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.192673 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-installation-pull-secrets\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:31.293621 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.293574 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dzc84\" (UniqueName: \"kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-kube-api-access-dzc84\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:31.293795 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.293661 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-tls\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:31.293795 ip-10-0-132-98 kubenswrapper[2574]: I0417 
17:09:31.293695 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-installation-pull-secrets\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:31.293795 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.293765 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-certificates\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:31.293931 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.293792 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-image-registry-private-configuration\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:31.293931 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.293815 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-ca-trust-extracted\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:31.294029 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:09:31.293830 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:09:31.294029 ip-10-0-132-98 kubenswrapper[2574]: 
E0417 17:09:31.293961 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cd9d6769f-6448l: secret "image-registry-tls" not found Apr 17 17:09:31.294029 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.293841 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-trusted-ca\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:31.294169 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:09:31.294038 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-tls podName:4adf7a08-e39c-4ebd-9fb8-8989bbdd8960 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:31.794016659 +0000 UTC m=+109.866603850 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-tls") pod "image-registry-5cd9d6769f-6448l" (UID: "4adf7a08-e39c-4ebd-9fb8-8989bbdd8960") : secret "image-registry-tls" not found Apr 17 17:09:31.294169 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.294100 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-bound-sa-token\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:31.294410 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.294383 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-certificates\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:31.294589 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.294570 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-ca-trust-extracted\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:31.295324 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.295301 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-trusted-ca\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:31.296837 
ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.296817 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-installation-pull-secrets\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:31.296959 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.296939 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-image-registry-private-configuration\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:31.303253 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.303230 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzc84\" (UniqueName: \"kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-kube-api-access-dzc84\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:31.303532 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.303513 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-bound-sa-token\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:31.585060 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.585040 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-52mt7_b5a9dd27-c914-41cc-88fc-5a64c1169c04/dns-node-resolver/0.log" Apr 17 17:09:31.749138 
ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.749112 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kvtg4_81fbead9-b55d-41ce-9182-c35ba0bc0eb2/console-operator/0.log" Apr 17 17:09:31.749258 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.749153 2574 generic.go:358] "Generic (PLEG): container finished" podID="81fbead9-b55d-41ce-9182-c35ba0bc0eb2" containerID="ee9ba46ac05ce28b9469585529fe678043a223b3b35416fa570542314f976009" exitCode=255 Apr 17 17:09:31.749258 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.749186 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-kvtg4" event={"ID":"81fbead9-b55d-41ce-9182-c35ba0bc0eb2","Type":"ContainerDied","Data":"ee9ba46ac05ce28b9469585529fe678043a223b3b35416fa570542314f976009"} Apr 17 17:09:31.749412 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.749398 2574 scope.go:117] "RemoveContainer" containerID="ee9ba46ac05ce28b9469585529fe678043a223b3b35416fa570542314f976009" Apr 17 17:09:31.798886 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:31.798862 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-tls\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:31.799011 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:09:31.798993 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:09:31.799052 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:09:31.799013 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cd9d6769f-6448l: secret "image-registry-tls" not found Apr 17 17:09:31.799090 ip-10-0-132-98 
kubenswrapper[2574]: E0417 17:09:31.799068 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-tls podName:4adf7a08-e39c-4ebd-9fb8-8989bbdd8960 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:32.799054675 +0000 UTC m=+110.871641847 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-tls") pod "image-registry-5cd9d6769f-6448l" (UID: "4adf7a08-e39c-4ebd-9fb8-8989bbdd8960") : secret "image-registry-tls" not found Apr 17 17:09:32.184188 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:32.184161 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2f5xx_36e5ce0d-a66e-4ad4-bf76-a0b0c1e1890b/node-ca/0.log" Apr 17 17:09:32.753673 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:32.753644 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kvtg4_81fbead9-b55d-41ce-9182-c35ba0bc0eb2/console-operator/1.log" Apr 17 17:09:32.754027 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:32.754010 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kvtg4_81fbead9-b55d-41ce-9182-c35ba0bc0eb2/console-operator/0.log" Apr 17 17:09:32.754081 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:32.754046 2574 generic.go:358] "Generic (PLEG): container finished" podID="81fbead9-b55d-41ce-9182-c35ba0bc0eb2" containerID="d9fd24f360481b014d44b1c1ed8aa7f969c69a8028ae3a222b3d8cac8cf511bd" exitCode=255 Apr 17 17:09:32.754118 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:32.754090 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-kvtg4" 
event={"ID":"81fbead9-b55d-41ce-9182-c35ba0bc0eb2","Type":"ContainerDied","Data":"d9fd24f360481b014d44b1c1ed8aa7f969c69a8028ae3a222b3d8cac8cf511bd"} Apr 17 17:09:32.754165 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:32.754117 2574 scope.go:117] "RemoveContainer" containerID="ee9ba46ac05ce28b9469585529fe678043a223b3b35416fa570542314f976009" Apr 17 17:09:32.754395 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:32.754378 2574 scope.go:117] "RemoveContainer" containerID="d9fd24f360481b014d44b1c1ed8aa7f969c69a8028ae3a222b3d8cac8cf511bd" Apr 17 17:09:32.754572 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:09:32.754554 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-kvtg4_openshift-console-operator(81fbead9-b55d-41ce-9182-c35ba0bc0eb2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-kvtg4" podUID="81fbead9-b55d-41ce-9182-c35ba0bc0eb2" Apr 17 17:09:32.806180 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:32.806143 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-tls\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:32.806310 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:09:32.806291 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:09:32.806310 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:09:32.806308 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cd9d6769f-6448l: secret "image-registry-tls" not found Apr 17 17:09:32.806389 ip-10-0-132-98 kubenswrapper[2574]: E0417 
17:09:32.806369 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-tls podName:4adf7a08-e39c-4ebd-9fb8-8989bbdd8960 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:34.806355188 +0000 UTC m=+112.878942361 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-tls") pod "image-registry-5cd9d6769f-6448l" (UID: "4adf7a08-e39c-4ebd-9fb8-8989bbdd8960") : secret "image-registry-tls" not found Apr 17 17:09:33.757754 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:33.757726 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kvtg4_81fbead9-b55d-41ce-9182-c35ba0bc0eb2/console-operator/1.log" Apr 17 17:09:33.758116 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:33.758029 2574 scope.go:117] "RemoveContainer" containerID="d9fd24f360481b014d44b1c1ed8aa7f969c69a8028ae3a222b3d8cac8cf511bd" Apr 17 17:09:33.758195 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:09:33.758178 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-kvtg4_openshift-console-operator(81fbead9-b55d-41ce-9182-c35ba0bc0eb2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-kvtg4" podUID="81fbead9-b55d-41ce-9182-c35ba0bc0eb2" Apr 17 17:09:34.825647 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:34.825594 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-tls\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 
17:09:34.826005 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:09:34.825740 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:09:34.826005 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:09:34.825758 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cd9d6769f-6448l: secret "image-registry-tls" not found Apr 17 17:09:34.826005 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:09:34.825811 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-tls podName:4adf7a08-e39c-4ebd-9fb8-8989bbdd8960 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:38.825797458 +0000 UTC m=+116.898384630 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-tls") pod "image-registry-5cd9d6769f-6448l" (UID: "4adf7a08-e39c-4ebd-9fb8-8989bbdd8960") : secret "image-registry-tls" not found Apr 17 17:09:36.830228 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:36.830199 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-n49zn"] Apr 17 17:09:36.834207 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:36.834192 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-n49zn" Apr 17 17:09:36.837407 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:36.837385 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 17 17:09:36.837528 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:36.837416 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 17 17:09:36.838525 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:36.838510 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-wxhrw\"" Apr 17 17:09:36.846490 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:36.846467 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-n49zn"] Apr 17 17:09:36.941600 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:36.941566 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd8t8\" (UniqueName: \"kubernetes.io/projected/ea9e3b3c-f443-44c5-bbb3-fdac356e6b2e-kube-api-access-vd8t8\") pod \"migrator-74bb7799d9-n49zn\" (UID: \"ea9e3b3c-f443-44c5-bbb3-fdac356e6b2e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-n49zn" Apr 17 17:09:37.042349 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:37.042315 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vd8t8\" (UniqueName: \"kubernetes.io/projected/ea9e3b3c-f443-44c5-bbb3-fdac356e6b2e-kube-api-access-vd8t8\") pod \"migrator-74bb7799d9-n49zn\" (UID: \"ea9e3b3c-f443-44c5-bbb3-fdac356e6b2e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-n49zn" Apr 17 17:09:37.050561 ip-10-0-132-98 kubenswrapper[2574]: 
I0417 17:09:37.050532 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd8t8\" (UniqueName: \"kubernetes.io/projected/ea9e3b3c-f443-44c5-bbb3-fdac356e6b2e-kube-api-access-vd8t8\") pod \"migrator-74bb7799d9-n49zn\" (UID: \"ea9e3b3c-f443-44c5-bbb3-fdac356e6b2e\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-n49zn" Apr 17 17:09:37.142187 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:37.142117 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-n49zn" Apr 17 17:09:37.254809 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:37.254780 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-n49zn"] Apr 17 17:09:37.258791 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:09:37.258762 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea9e3b3c_f443_44c5_bbb3_fdac356e6b2e.slice/crio-9cd2189d15fca0a8f422929a763f51d4bf3dc911b169376eedcfdf80ae90c9f2 WatchSource:0}: Error finding container 9cd2189d15fca0a8f422929a763f51d4bf3dc911b169376eedcfdf80ae90c9f2: Status 404 returned error can't find the container with id 9cd2189d15fca0a8f422929a763f51d4bf3dc911b169376eedcfdf80ae90c9f2 Apr 17 17:09:37.766243 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:37.766210 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-n49zn" event={"ID":"ea9e3b3c-f443-44c5-bbb3-fdac356e6b2e","Type":"ContainerStarted","Data":"9cd2189d15fca0a8f422929a763f51d4bf3dc911b169376eedcfdf80ae90c9f2"} Apr 17 17:09:38.770282 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:38.770205 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-n49zn" 
event={"ID":"ea9e3b3c-f443-44c5-bbb3-fdac356e6b2e","Type":"ContainerStarted","Data":"8551294f27a374e878c784109fccf97d6f26fc90afbf5449c099e56ffc66373c"} Apr 17 17:09:38.770282 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:38.770239 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-n49zn" event={"ID":"ea9e3b3c-f443-44c5-bbb3-fdac356e6b2e","Type":"ContainerStarted","Data":"0d4b75481c8dfeca1dde09111b0c6e11f7371bb35948d4de73e0da836124432a"} Apr 17 17:09:38.789329 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:38.789286 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-n49zn" podStartSLOduration=1.6293309329999999 podStartE2EDuration="2.789249497s" podCreationTimestamp="2026-04-17 17:09:36 +0000 UTC" firstStartedPulling="2026-04-17 17:09:37.260489575 +0000 UTC m=+115.333076747" lastFinishedPulling="2026-04-17 17:09:38.420408125 +0000 UTC m=+116.492995311" observedRunningTime="2026-04-17 17:09:38.788175704 +0000 UTC m=+116.860762896" watchObservedRunningTime="2026-04-17 17:09:38.789249497 +0000 UTC m=+116.861836688" Apr 17 17:09:38.857251 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:38.857227 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-tls\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:38.857360 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:09:38.857326 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:09:38.857360 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:09:38.857336 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-5cd9d6769f-6448l: secret "image-registry-tls" not found Apr 17 17:09:38.857433 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:09:38.857382 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-tls podName:4adf7a08-e39c-4ebd-9fb8-8989bbdd8960 nodeName:}" failed. No retries permitted until 2026-04-17 17:09:46.857370328 +0000 UTC m=+124.929957500 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-tls") pod "image-registry-5cd9d6769f-6448l" (UID: "4adf7a08-e39c-4ebd-9fb8-8989bbdd8960") : secret "image-registry-tls" not found Apr 17 17:09:39.538454 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:39.538421 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-kvtg4" Apr 17 17:09:39.538454 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:39.538451 2574 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-kvtg4" Apr 17 17:09:39.538851 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:39.538835 2574 scope.go:117] "RemoveContainer" containerID="d9fd24f360481b014d44b1c1ed8aa7f969c69a8028ae3a222b3d8cac8cf511bd" Apr 17 17:09:39.539010 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:09:39.538994 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-kvtg4_openshift-console-operator(81fbead9-b55d-41ce-9182-c35ba0bc0eb2)\"" pod="openshift-console-operator/console-operator-9d4b6777b-kvtg4" podUID="81fbead9-b55d-41ce-9182-c35ba0bc0eb2" Apr 17 17:09:41.148198 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:41.148164 2574 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-j82jv"] Apr 17 17:09:41.150964 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:41.150949 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-j82jv" Apr 17 17:09:41.154703 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:41.154684 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 17 17:09:41.154893 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:41.154881 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 17 17:09:41.155786 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:41.155761 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-4n7tl\"" Apr 17 17:09:41.155956 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:41.155809 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 17 17:09:41.155956 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:41.155871 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 17 17:09:41.167980 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:41.167957 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-j82jv"] Apr 17 17:09:41.276290 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:41.276260 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1efcfd5b-f268-4d11-8b36-5123399f5ee3-signing-key\") pod \"service-ca-865cb79987-j82jv\" (UID: \"1efcfd5b-f268-4d11-8b36-5123399f5ee3\") " pod="openshift-service-ca/service-ca-865cb79987-j82jv" Apr 17 
17:09:41.276290 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:41.276292 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1efcfd5b-f268-4d11-8b36-5123399f5ee3-signing-cabundle\") pod \"service-ca-865cb79987-j82jv\" (UID: \"1efcfd5b-f268-4d11-8b36-5123399f5ee3\") " pod="openshift-service-ca/service-ca-865cb79987-j82jv" Apr 17 17:09:41.276482 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:41.276330 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-686z9\" (UniqueName: \"kubernetes.io/projected/1efcfd5b-f268-4d11-8b36-5123399f5ee3-kube-api-access-686z9\") pod \"service-ca-865cb79987-j82jv\" (UID: \"1efcfd5b-f268-4d11-8b36-5123399f5ee3\") " pod="openshift-service-ca/service-ca-865cb79987-j82jv" Apr 17 17:09:41.377155 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:41.377112 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-686z9\" (UniqueName: \"kubernetes.io/projected/1efcfd5b-f268-4d11-8b36-5123399f5ee3-kube-api-access-686z9\") pod \"service-ca-865cb79987-j82jv\" (UID: \"1efcfd5b-f268-4d11-8b36-5123399f5ee3\") " pod="openshift-service-ca/service-ca-865cb79987-j82jv" Apr 17 17:09:41.377310 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:41.377228 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1efcfd5b-f268-4d11-8b36-5123399f5ee3-signing-key\") pod \"service-ca-865cb79987-j82jv\" (UID: \"1efcfd5b-f268-4d11-8b36-5123399f5ee3\") " pod="openshift-service-ca/service-ca-865cb79987-j82jv" Apr 17 17:09:41.377310 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:41.377262 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1efcfd5b-f268-4d11-8b36-5123399f5ee3-signing-cabundle\") 
pod \"service-ca-865cb79987-j82jv\" (UID: \"1efcfd5b-f268-4d11-8b36-5123399f5ee3\") " pod="openshift-service-ca/service-ca-865cb79987-j82jv" Apr 17 17:09:41.378479 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:41.378455 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/1efcfd5b-f268-4d11-8b36-5123399f5ee3-signing-cabundle\") pod \"service-ca-865cb79987-j82jv\" (UID: \"1efcfd5b-f268-4d11-8b36-5123399f5ee3\") " pod="openshift-service-ca/service-ca-865cb79987-j82jv" Apr 17 17:09:41.379811 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:41.379792 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/1efcfd5b-f268-4d11-8b36-5123399f5ee3-signing-key\") pod \"service-ca-865cb79987-j82jv\" (UID: \"1efcfd5b-f268-4d11-8b36-5123399f5ee3\") " pod="openshift-service-ca/service-ca-865cb79987-j82jv" Apr 17 17:09:41.385646 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:41.385627 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-686z9\" (UniqueName: \"kubernetes.io/projected/1efcfd5b-f268-4d11-8b36-5123399f5ee3-kube-api-access-686z9\") pod \"service-ca-865cb79987-j82jv\" (UID: \"1efcfd5b-f268-4d11-8b36-5123399f5ee3\") " pod="openshift-service-ca/service-ca-865cb79987-j82jv" Apr 17 17:09:41.459053 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:41.459031 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-j82jv" Apr 17 17:09:41.573232 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:41.573203 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-j82jv"] Apr 17 17:09:41.576437 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:09:41.576409 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1efcfd5b_f268_4d11_8b36_5123399f5ee3.slice/crio-a5ec06f98d7dcfd2911a8f71d338f112445cc09eb2cff8623876141ad0f8413a WatchSource:0}: Error finding container a5ec06f98d7dcfd2911a8f71d338f112445cc09eb2cff8623876141ad0f8413a: Status 404 returned error can't find the container with id a5ec06f98d7dcfd2911a8f71d338f112445cc09eb2cff8623876141ad0f8413a Apr 17 17:09:41.780032 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:41.779949 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-j82jv" event={"ID":"1efcfd5b-f268-4d11-8b36-5123399f5ee3","Type":"ContainerStarted","Data":"a5ec06f98d7dcfd2911a8f71d338f112445cc09eb2cff8623876141ad0f8413a"} Apr 17 17:09:43.787051 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:43.787014 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-j82jv" event={"ID":"1efcfd5b-f268-4d11-8b36-5123399f5ee3","Type":"ContainerStarted","Data":"4f7766d4de6535349e85c5809a5e92792123217cb8379b5ab95ababe72db374a"} Apr 17 17:09:43.802304 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:43.802244 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-j82jv" podStartSLOduration=1.178035306 podStartE2EDuration="2.802228692s" podCreationTimestamp="2026-04-17 17:09:41 +0000 UTC" firstStartedPulling="2026-04-17 17:09:41.578392605 +0000 UTC m=+119.650979777" lastFinishedPulling="2026-04-17 17:09:43.202585989 +0000 UTC 
m=+121.275173163" observedRunningTime="2026-04-17 17:09:43.80215606 +0000 UTC m=+121.874743253" watchObservedRunningTime="2026-04-17 17:09:43.802228692 +0000 UTC m=+121.874815888" Apr 17 17:09:46.919458 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:46.919420 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-tls\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:09:46.919833 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:09:46.919536 2574 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 17 17:09:46.919833 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:09:46.919548 2574 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-5cd9d6769f-6448l: secret "image-registry-tls" not found Apr 17 17:09:46.919833 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:09:46.919601 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-tls podName:4adf7a08-e39c-4ebd-9fb8-8989bbdd8960 nodeName:}" failed. No retries permitted until 2026-04-17 17:10:02.91958569 +0000 UTC m=+140.992172879 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-tls") pod "image-registry-5cd9d6769f-6448l" (UID: "4adf7a08-e39c-4ebd-9fb8-8989bbdd8960") : secret "image-registry-tls" not found Apr 17 17:09:52.263848 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:52.263802 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1dd8584b-d217-441a-a0d1-e1b86328dfe2-metrics-certs\") pod \"network-metrics-daemon-8zxs7\" (UID: \"1dd8584b-d217-441a-a0d1-e1b86328dfe2\") " pod="openshift-multus/network-metrics-daemon-8zxs7" Apr 17 17:09:52.266190 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:52.266160 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1dd8584b-d217-441a-a0d1-e1b86328dfe2-metrics-certs\") pod \"network-metrics-daemon-8zxs7\" (UID: \"1dd8584b-d217-441a-a0d1-e1b86328dfe2\") " pod="openshift-multus/network-metrics-daemon-8zxs7" Apr 17 17:09:52.355634 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:52.355595 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mrqrv\"" Apr 17 17:09:52.361121 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:52.361092 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8zxs7" Apr 17 17:09:52.478704 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:52.478672 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8zxs7"] Apr 17 17:09:52.481658 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:09:52.481632 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dd8584b_d217_441a_a0d1_e1b86328dfe2.slice/crio-5b4edc7af209a13b7e43c1c2a24c74997afb914c58c303d4b099b4b95b8a28a5 WatchSource:0}: Error finding container 5b4edc7af209a13b7e43c1c2a24c74997afb914c58c303d4b099b4b95b8a28a5: Status 404 returned error can't find the container with id 5b4edc7af209a13b7e43c1c2a24c74997afb914c58c303d4b099b4b95b8a28a5 Apr 17 17:09:52.809108 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:52.809069 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8zxs7" event={"ID":"1dd8584b-d217-441a-a0d1-e1b86328dfe2","Type":"ContainerStarted","Data":"5b4edc7af209a13b7e43c1c2a24c74997afb914c58c303d4b099b4b95b8a28a5"} Apr 17 17:09:54.815701 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:54.815667 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8zxs7" event={"ID":"1dd8584b-d217-441a-a0d1-e1b86328dfe2","Type":"ContainerStarted","Data":"67b85c43106a897ac1a4f25772e6bf476ce5a81f920170d54668bd355f0a6b5c"} Apr 17 17:09:54.815701 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:54.815701 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8zxs7" event={"ID":"1dd8584b-d217-441a-a0d1-e1b86328dfe2","Type":"ContainerStarted","Data":"bbdca932f8af96c6ddf5650119ecd199c6fc9e642d755dd7e558f8d2c1a52f3c"} Apr 17 17:09:54.831078 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:54.831033 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-multus/network-metrics-daemon-8zxs7" podStartSLOduration=131.490266824 podStartE2EDuration="2m12.831021901s" podCreationTimestamp="2026-04-17 17:07:42 +0000 UTC" firstStartedPulling="2026-04-17 17:09:52.484154681 +0000 UTC m=+130.556741870" lastFinishedPulling="2026-04-17 17:09:53.824909775 +0000 UTC m=+131.897496947" observedRunningTime="2026-04-17 17:09:54.830218096 +0000 UTC m=+132.902805292" watchObservedRunningTime="2026-04-17 17:09:54.831021901 +0000 UTC m=+132.903609093" Apr 17 17:09:55.431993 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:55.431965 2574 scope.go:117] "RemoveContainer" containerID="d9fd24f360481b014d44b1c1ed8aa7f969c69a8028ae3a222b3d8cac8cf511bd" Apr 17 17:09:55.819690 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:55.819598 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kvtg4_81fbead9-b55d-41ce-9182-c35ba0bc0eb2/console-operator/1.log" Apr 17 17:09:55.820042 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:55.819741 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-kvtg4" event={"ID":"81fbead9-b55d-41ce-9182-c35ba0bc0eb2","Type":"ContainerStarted","Data":"cf18ddc3adefd860e10148c8c62ae9b7e7d78184af03b67f2a5f0e19dbf171a2"} Apr 17 17:09:55.820169 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:55.820146 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-kvtg4" Apr 17 17:09:55.836479 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:55.836433 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-kvtg4" podStartSLOduration=24.950028029 podStartE2EDuration="26.836419131s" podCreationTimestamp="2026-04-17 17:09:29 +0000 UTC" firstStartedPulling="2026-04-17 17:09:29.665882403 +0000 UTC m=+107.738469575" lastFinishedPulling="2026-04-17 
17:09:31.552273497 +0000 UTC m=+109.624860677" observedRunningTime="2026-04-17 17:09:55.835839079 +0000 UTC m=+133.908426275" watchObservedRunningTime="2026-04-17 17:09:55.836419131 +0000 UTC m=+133.909006324" Apr 17 17:09:56.820621 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:56.820557 2574 patch_prober.go:28] interesting pod/console-operator-9d4b6777b-kvtg4 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.133.0.9:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Apr 17 17:09:56.821069 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:56.820661 2574 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-9d4b6777b-kvtg4" podUID="81fbead9-b55d-41ce-9182-c35ba0bc0eb2" containerName="console-operator" probeResult="failure" output="Get \"https://10.133.0.9:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 17 17:09:57.045877 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:09:57.045829 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-kvtg4" Apr 17 17:10:02.943837 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:02.943791 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-tls\") pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:10:02.946346 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:02.946318 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-tls\") 
pod \"image-registry-5cd9d6769f-6448l\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:10:03.244458 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:03.244383 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-b4xpj\"" Apr 17 17:10:03.251656 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:03.251636 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:10:03.367734 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:03.367684 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5cd9d6769f-6448l"] Apr 17 17:10:03.370544 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:10:03.370513 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4adf7a08_e39c_4ebd_9fb8_8989bbdd8960.slice/crio-96b88ef1dd02594ddbf1a7d1a3990b31efa6d236ab32f9567056fb85e2820a01 WatchSource:0}: Error finding container 96b88ef1dd02594ddbf1a7d1a3990b31efa6d236ab32f9567056fb85e2820a01: Status 404 returned error can't find the container with id 96b88ef1dd02594ddbf1a7d1a3990b31efa6d236ab32f9567056fb85e2820a01 Apr 17 17:10:03.840044 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:03.840009 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" event={"ID":"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960","Type":"ContainerStarted","Data":"ab25dd08e32bf7425be03400726dea4fe85531087258e3faec865d8fbb9e57d5"} Apr 17 17:10:03.840044 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:03.840044 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" 
event={"ID":"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960","Type":"ContainerStarted","Data":"96b88ef1dd02594ddbf1a7d1a3990b31efa6d236ab32f9567056fb85e2820a01"} Apr 17 17:10:03.840242 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:03.840136 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:10:03.859751 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:03.859704 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" podStartSLOduration=32.859691124 podStartE2EDuration="32.859691124s" podCreationTimestamp="2026-04-17 17:09:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:10:03.858785236 +0000 UTC m=+141.931372440" watchObservedRunningTime="2026-04-17 17:10:03.859691124 +0000 UTC m=+141.932278316" Apr 17 17:10:06.407524 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.407487 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5cd9d6769f-6448l"] Apr 17 17:10:06.425332 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.425307 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-6mgfv"] Apr 17 17:10:06.430141 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.430117 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-6bcc868b7-6mgfv" Apr 17 17:10:06.433456 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.433434 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-66vfz\"" Apr 17 17:10:06.433690 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.433671 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 17:10:06.433946 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.433930 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 17:10:06.435069 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.435046 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-s6zkr"] Apr 17 17:10:06.438536 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.438515 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-s6zkr" Apr 17 17:10:06.439895 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.439873 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-6mgfv"] Apr 17 17:10:06.441002 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.440957 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-hg6xj\"" Apr 17 17:10:06.441148 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.441127 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 17:10:06.441216 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.441172 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 17:10:06.456165 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.456145 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-s6zkr"] Apr 17 17:10:06.469394 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.469355 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d78bb467-37d0-4129-b59e-e8587a2b8ca8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-s6zkr\" (UID: \"d78bb467-37d0-4129-b59e-e8587a2b8ca8\") " pod="openshift-insights/insights-runtime-extractor-s6zkr" Apr 17 17:10:06.469493 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.469407 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d78bb467-37d0-4129-b59e-e8587a2b8ca8-data-volume\") pod \"insights-runtime-extractor-s6zkr\" (UID: \"d78bb467-37d0-4129-b59e-e8587a2b8ca8\") " 
pod="openshift-insights/insights-runtime-extractor-s6zkr" Apr 17 17:10:06.469545 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.469531 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d78bb467-37d0-4129-b59e-e8587a2b8ca8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-s6zkr\" (UID: \"d78bb467-37d0-4129-b59e-e8587a2b8ca8\") " pod="openshift-insights/insights-runtime-extractor-s6zkr" Apr 17 17:10:06.469579 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.469554 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d78bb467-37d0-4129-b59e-e8587a2b8ca8-crio-socket\") pod \"insights-runtime-extractor-s6zkr\" (UID: \"d78bb467-37d0-4129-b59e-e8587a2b8ca8\") " pod="openshift-insights/insights-runtime-extractor-s6zkr" Apr 17 17:10:06.469643 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.469577 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blntx\" (UniqueName: \"kubernetes.io/projected/998f90ca-500c-4933-b022-c7fdd401c2e3-kube-api-access-blntx\") pod \"downloads-6bcc868b7-6mgfv\" (UID: \"998f90ca-500c-4933-b022-c7fdd401c2e3\") " pod="openshift-console/downloads-6bcc868b7-6mgfv" Apr 17 17:10:06.469643 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.469624 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5cf5\" (UniqueName: \"kubernetes.io/projected/d78bb467-37d0-4129-b59e-e8587a2b8ca8-kube-api-access-h5cf5\") pod \"insights-runtime-extractor-s6zkr\" (UID: \"d78bb467-37d0-4129-b59e-e8587a2b8ca8\") " pod="openshift-insights/insights-runtime-extractor-s6zkr" Apr 17 17:10:06.570923 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.570892 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d78bb467-37d0-4129-b59e-e8587a2b8ca8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-s6zkr\" (UID: \"d78bb467-37d0-4129-b59e-e8587a2b8ca8\") " pod="openshift-insights/insights-runtime-extractor-s6zkr"
Apr 17 17:10:06.570923 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.570923 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d78bb467-37d0-4129-b59e-e8587a2b8ca8-crio-socket\") pod \"insights-runtime-extractor-s6zkr\" (UID: \"d78bb467-37d0-4129-b59e-e8587a2b8ca8\") " pod="openshift-insights/insights-runtime-extractor-s6zkr"
Apr 17 17:10:06.571109 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.570944 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-blntx\" (UniqueName: \"kubernetes.io/projected/998f90ca-500c-4933-b022-c7fdd401c2e3-kube-api-access-blntx\") pod \"downloads-6bcc868b7-6mgfv\" (UID: \"998f90ca-500c-4933-b022-c7fdd401c2e3\") " pod="openshift-console/downloads-6bcc868b7-6mgfv"
Apr 17 17:10:06.571109 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.570959 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5cf5\" (UniqueName: \"kubernetes.io/projected/d78bb467-37d0-4129-b59e-e8587a2b8ca8-kube-api-access-h5cf5\") pod \"insights-runtime-extractor-s6zkr\" (UID: \"d78bb467-37d0-4129-b59e-e8587a2b8ca8\") " pod="openshift-insights/insights-runtime-extractor-s6zkr"
Apr 17 17:10:06.571109 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.570996 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d78bb467-37d0-4129-b59e-e8587a2b8ca8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-s6zkr\" (UID: \"d78bb467-37d0-4129-b59e-e8587a2b8ca8\") " pod="openshift-insights/insights-runtime-extractor-s6zkr"
Apr 17 17:10:06.571109 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.571024 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d78bb467-37d0-4129-b59e-e8587a2b8ca8-data-volume\") pod \"insights-runtime-extractor-s6zkr\" (UID: \"d78bb467-37d0-4129-b59e-e8587a2b8ca8\") " pod="openshift-insights/insights-runtime-extractor-s6zkr"
Apr 17 17:10:06.571109 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.571038 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/d78bb467-37d0-4129-b59e-e8587a2b8ca8-crio-socket\") pod \"insights-runtime-extractor-s6zkr\" (UID: \"d78bb467-37d0-4129-b59e-e8587a2b8ca8\") " pod="openshift-insights/insights-runtime-extractor-s6zkr"
Apr 17 17:10:06.571432 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.571410 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/d78bb467-37d0-4129-b59e-e8587a2b8ca8-data-volume\") pod \"insights-runtime-extractor-s6zkr\" (UID: \"d78bb467-37d0-4129-b59e-e8587a2b8ca8\") " pod="openshift-insights/insights-runtime-extractor-s6zkr"
Apr 17 17:10:06.571476 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.571460 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/d78bb467-37d0-4129-b59e-e8587a2b8ca8-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-s6zkr\" (UID: \"d78bb467-37d0-4129-b59e-e8587a2b8ca8\") " pod="openshift-insights/insights-runtime-extractor-s6zkr"
Apr 17 17:10:06.574235 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.573829 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/d78bb467-37d0-4129-b59e-e8587a2b8ca8-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-s6zkr\" (UID: \"d78bb467-37d0-4129-b59e-e8587a2b8ca8\") " pod="openshift-insights/insights-runtime-extractor-s6zkr"
Apr 17 17:10:06.584529 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.584501 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-blntx\" (UniqueName: \"kubernetes.io/projected/998f90ca-500c-4933-b022-c7fdd401c2e3-kube-api-access-blntx\") pod \"downloads-6bcc868b7-6mgfv\" (UID: \"998f90ca-500c-4933-b022-c7fdd401c2e3\") " pod="openshift-console/downloads-6bcc868b7-6mgfv"
Apr 17 17:10:06.585705 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.585687 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5cf5\" (UniqueName: \"kubernetes.io/projected/d78bb467-37d0-4129-b59e-e8587a2b8ca8-kube-api-access-h5cf5\") pod \"insights-runtime-extractor-s6zkr\" (UID: \"d78bb467-37d0-4129-b59e-e8587a2b8ca8\") " pod="openshift-insights/insights-runtime-extractor-s6zkr"
Apr 17 17:10:06.740275 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.740244 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-6mgfv"
Apr 17 17:10:06.748030 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.748007 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-s6zkr"
Apr 17 17:10:06.897531 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.897507 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-s6zkr"]
Apr 17 17:10:06.899877 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:10:06.899850 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd78bb467_37d0_4129_b59e_e8587a2b8ca8.slice/crio-3a8f2bfc1d1cae6889c1f2f3154ebaeb08098be8b7b53f6ba9864efc41dea9dd WatchSource:0}: Error finding container 3a8f2bfc1d1cae6889c1f2f3154ebaeb08098be8b7b53f6ba9864efc41dea9dd: Status 404 returned error can't find the container with id 3a8f2bfc1d1cae6889c1f2f3154ebaeb08098be8b7b53f6ba9864efc41dea9dd
Apr 17 17:10:06.905857 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:06.905791 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-6mgfv"]
Apr 17 17:10:06.908905 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:10:06.908877 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod998f90ca_500c_4933_b022_c7fdd401c2e3.slice/crio-ad4ef6a7d66479e98b6439417adc934a7a9c70d5e74bb3ee355fa9c65889ebfd WatchSource:0}: Error finding container ad4ef6a7d66479e98b6439417adc934a7a9c70d5e74bb3ee355fa9c65889ebfd: Status 404 returned error can't find the container with id ad4ef6a7d66479e98b6439417adc934a7a9c70d5e74bb3ee355fa9c65889ebfd
Apr 17 17:10:07.852672 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:07.852637 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s6zkr" event={"ID":"d78bb467-37d0-4129-b59e-e8587a2b8ca8","Type":"ContainerStarted","Data":"b05b0beaaa83b2270d6c6cf6b5e04e6a9a9842fb2fc4fd14a2e0fa8f81bfa7ee"}
Apr 17 17:10:07.852672 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:07.852679 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s6zkr" event={"ID":"d78bb467-37d0-4129-b59e-e8587a2b8ca8","Type":"ContainerStarted","Data":"7b0f55cdb5321210aaaff90dbd132a7dcff576b4e72c8a29021bc8f194c1c199"}
Apr 17 17:10:07.853142 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:07.852693 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s6zkr" event={"ID":"d78bb467-37d0-4129-b59e-e8587a2b8ca8","Type":"ContainerStarted","Data":"3a8f2bfc1d1cae6889c1f2f3154ebaeb08098be8b7b53f6ba9864efc41dea9dd"}
Apr 17 17:10:07.853779 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:07.853756 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-6mgfv" event={"ID":"998f90ca-500c-4933-b022-c7fdd401c2e3","Type":"ContainerStarted","Data":"ad4ef6a7d66479e98b6439417adc934a7a9c70d5e74bb3ee355fa9c65889ebfd"}
Apr 17 17:10:09.731895 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.731859 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-9585658b7-qhzc4"]
Apr 17 17:10:09.735277 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.735247 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-9585658b7-qhzc4"
Apr 17 17:10:09.738055 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.738017 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 17 17:10:09.738055 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.738037 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 17 17:10:09.739254 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.739232 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 17 17:10:09.739728 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.739507 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-wb5f4\""
Apr 17 17:10:09.739728 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.739544 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 17 17:10:09.739728 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.739554 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 17 17:10:09.745557 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.745536 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9585658b7-qhzc4"]
Apr 17 17:10:09.745715 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.745693 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 17 17:10:09.795989 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.795932 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd0c121c-18b4-4637-ab1a-edd22fb56a23-console-oauth-config\") pod \"console-9585658b7-qhzc4\" (UID: \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " pod="openshift-console/console-9585658b7-qhzc4"
Apr 17 17:10:09.796140 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.796004 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd0c121c-18b4-4637-ab1a-edd22fb56a23-trusted-ca-bundle\") pod \"console-9585658b7-qhzc4\" (UID: \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " pod="openshift-console/console-9585658b7-qhzc4"
Apr 17 17:10:09.796140 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.796083 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd0c121c-18b4-4637-ab1a-edd22fb56a23-console-serving-cert\") pod \"console-9585658b7-qhzc4\" (UID: \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " pod="openshift-console/console-9585658b7-qhzc4"
Apr 17 17:10:09.796140 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.796118 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd0c121c-18b4-4637-ab1a-edd22fb56a23-service-ca\") pod \"console-9585658b7-qhzc4\" (UID: \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " pod="openshift-console/console-9585658b7-qhzc4"
Apr 17 17:10:09.796270 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.796172 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd0c121c-18b4-4637-ab1a-edd22fb56a23-oauth-serving-cert\") pod \"console-9585658b7-qhzc4\" (UID: \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " pod="openshift-console/console-9585658b7-qhzc4"
Apr 17 17:10:09.796270 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.796251 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bxgcq\" (UniqueName: \"kubernetes.io/projected/dd0c121c-18b4-4637-ab1a-edd22fb56a23-kube-api-access-bxgcq\") pod \"console-9585658b7-qhzc4\" (UID: \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " pod="openshift-console/console-9585658b7-qhzc4"
Apr 17 17:10:09.796337 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.796308 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd0c121c-18b4-4637-ab1a-edd22fb56a23-console-config\") pod \"console-9585658b7-qhzc4\" (UID: \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " pod="openshift-console/console-9585658b7-qhzc4"
Apr 17 17:10:09.862653 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.862623 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-s6zkr" event={"ID":"d78bb467-37d0-4129-b59e-e8587a2b8ca8","Type":"ContainerStarted","Data":"2381afe3ee3ab0f3045843209358c6e64cdfe18a9ad3dae232220dfac4829b28"}
Apr 17 17:10:09.880687 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.880638 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-s6zkr" podStartSLOduration=1.631170719 podStartE2EDuration="3.880620329s" podCreationTimestamp="2026-04-17 17:10:06 +0000 UTC" firstStartedPulling="2026-04-17 17:10:06.953500902 +0000 UTC m=+145.026088074" lastFinishedPulling="2026-04-17 17:10:09.202950504 +0000 UTC m=+147.275537684" observedRunningTime="2026-04-17 17:10:09.878810632 +0000 UTC m=+147.951397828" watchObservedRunningTime="2026-04-17 17:10:09.880620329 +0000 UTC m=+147.953207703"
Apr 17 17:10:09.897645 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.897598 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName:
\"kubernetes.io/secret/dd0c121c-18b4-4637-ab1a-edd22fb56a23-console-serving-cert\") pod \"console-9585658b7-qhzc4\" (UID: \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " pod="openshift-console/console-9585658b7-qhzc4"
Apr 17 17:10:09.897769 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.897665 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd0c121c-18b4-4637-ab1a-edd22fb56a23-service-ca\") pod \"console-9585658b7-qhzc4\" (UID: \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " pod="openshift-console/console-9585658b7-qhzc4"
Apr 17 17:10:09.897769 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.897720 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd0c121c-18b4-4637-ab1a-edd22fb56a23-oauth-serving-cert\") pod \"console-9585658b7-qhzc4\" (UID: \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " pod="openshift-console/console-9585658b7-qhzc4"
Apr 17 17:10:09.897769 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.897755 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bxgcq\" (UniqueName: \"kubernetes.io/projected/dd0c121c-18b4-4637-ab1a-edd22fb56a23-kube-api-access-bxgcq\") pod \"console-9585658b7-qhzc4\" (UID: \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " pod="openshift-console/console-9585658b7-qhzc4"
Apr 17 17:10:09.897929 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.897901 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd0c121c-18b4-4637-ab1a-edd22fb56a23-console-config\") pod \"console-9585658b7-qhzc4\" (UID: \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " pod="openshift-console/console-9585658b7-qhzc4"
Apr 17 17:10:09.897986 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.897964 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd0c121c-18b4-4637-ab1a-edd22fb56a23-console-oauth-config\") pod \"console-9585658b7-qhzc4\" (UID: \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " pod="openshift-console/console-9585658b7-qhzc4"
Apr 17 17:10:09.898033 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.898015 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd0c121c-18b4-4637-ab1a-edd22fb56a23-trusted-ca-bundle\") pod \"console-9585658b7-qhzc4\" (UID: \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " pod="openshift-console/console-9585658b7-qhzc4"
Apr 17 17:10:09.898394 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.898372 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd0c121c-18b4-4637-ab1a-edd22fb56a23-service-ca\") pod \"console-9585658b7-qhzc4\" (UID: \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " pod="openshift-console/console-9585658b7-qhzc4"
Apr 17 17:10:09.898709 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.898687 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd0c121c-18b4-4637-ab1a-edd22fb56a23-console-config\") pod \"console-9585658b7-qhzc4\" (UID: \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " pod="openshift-console/console-9585658b7-qhzc4"
Apr 17 17:10:09.898865 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.898811 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd0c121c-18b4-4637-ab1a-edd22fb56a23-oauth-serving-cert\") pod \"console-9585658b7-qhzc4\" (UID: \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " pod="openshift-console/console-9585658b7-qhzc4"
Apr 17 17:10:09.899025 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.899005 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd0c121c-18b4-4637-ab1a-edd22fb56a23-trusted-ca-bundle\") pod \"console-9585658b7-qhzc4\" (UID: \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " pod="openshift-console/console-9585658b7-qhzc4"
Apr 17 17:10:09.900293 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.900272 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd0c121c-18b4-4637-ab1a-edd22fb56a23-console-serving-cert\") pod \"console-9585658b7-qhzc4\" (UID: \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " pod="openshift-console/console-9585658b7-qhzc4"
Apr 17 17:10:09.900372 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.900361 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd0c121c-18b4-4637-ab1a-edd22fb56a23-console-oauth-config\") pod \"console-9585658b7-qhzc4\" (UID: \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " pod="openshift-console/console-9585658b7-qhzc4"
Apr 17 17:10:09.905808 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:09.905789 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bxgcq\" (UniqueName: \"kubernetes.io/projected/dd0c121c-18b4-4637-ab1a-edd22fb56a23-kube-api-access-bxgcq\") pod \"console-9585658b7-qhzc4\" (UID: \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " pod="openshift-console/console-9585658b7-qhzc4"
Apr 17 17:10:10.046841 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:10.046766 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-9585658b7-qhzc4"
Apr 17 17:10:10.182931 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:10.182876 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-9585658b7-qhzc4"]
Apr 17 17:10:10.186658 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:10:10.186628 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd0c121c_18b4_4637_ab1a_edd22fb56a23.slice/crio-056c4f47e166ca25d3aadfb6b28e3fa59ff1b07ab16e5773c70fb4f48162263a WatchSource:0}: Error finding container 056c4f47e166ca25d3aadfb6b28e3fa59ff1b07ab16e5773c70fb4f48162263a: Status 404 returned error can't find the container with id 056c4f47e166ca25d3aadfb6b28e3fa59ff1b07ab16e5773c70fb4f48162263a
Apr 17 17:10:10.868045 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:10.867871 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9585658b7-qhzc4" event={"ID":"dd0c121c-18b4-4637-ab1a-edd22fb56a23","Type":"ContainerStarted","Data":"056c4f47e166ca25d3aadfb6b28e3fa59ff1b07ab16e5773c70fb4f48162263a"}
Apr 17 17:10:13.878154 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:13.878110 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9585658b7-qhzc4" event={"ID":"dd0c121c-18b4-4637-ab1a-edd22fb56a23","Type":"ContainerStarted","Data":"9adb52b7d49a35f89e111bfdc82d03a8f2b548e20fce9b74868e56aa6f785e5b"}
Apr 17 17:10:13.895727 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:13.895683 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-9585658b7-qhzc4" podStartSLOduration=2.007654266 podStartE2EDuration="4.895669953s" podCreationTimestamp="2026-04-17 17:10:09 +0000 UTC" firstStartedPulling="2026-04-17 17:10:10.188755842 +0000 UTC m=+148.261343021" lastFinishedPulling="2026-04-17 17:10:13.076771519 +0000 UTC m=+151.149358708" observedRunningTime="2026-04-17 17:10:13.893985765 +0000 UTC m=+151.966572960" watchObservedRunningTime="2026-04-17 17:10:13.895669953 +0000 UTC m=+151.968257146"
Apr 17 17:10:15.336378 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.335003 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-rcls2"]
Apr 17 17:10:15.342680 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.341839 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rcls2"
Apr 17 17:10:15.345454 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.345280 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 17 17:10:15.346144 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.346120 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 17 17:10:15.346621 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.346372 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 17:10:15.346621 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.346516 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 17:10:15.346621 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.346586 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 17:10:15.346822 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.346716 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-ldxtt\""
Apr 17 17:10:15.355239 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.353670 2574 kubelet.go:2544] "SyncLoop UPDATE"
source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-rcls2"]
Apr 17 17:10:15.369558 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.369506 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-rldj4"]
Apr 17 17:10:15.373520 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.373053 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-rldj4"
Apr 17 17:10:15.374394 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.374371 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-cgfvj"]
Apr 17 17:10:15.375766 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.375736 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 17 17:10:15.376022 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.376000 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 17 17:10:15.376414 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.376229 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 17 17:10:15.376414 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.376309 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-phjkv\""
Apr 17 17:10:15.380576 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.380541 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-cgfvj"
Apr 17 17:10:15.383230 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.383192 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-rldj4"]
Apr 17 17:10:15.384324 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.383883 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 17:10:15.384324 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.384023 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 17:10:15.384324 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.384072 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 17:10:15.385198 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.385037 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-wcb6d\""
Apr 17 17:10:15.448092 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.448064 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9384a44b-c862-49d8-8220-6aa55e030cd5-node-exporter-textfile\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj"
Apr 17 17:10:15.448092 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.448105 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9384a44b-c862-49d8-8220-6aa55e030cd5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj"
Apr 17 17:10:15.448296 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.448152 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6j6vp\" (UniqueName: \"kubernetes.io/projected/11ea47a9-7c97-4dce-8a6c-04d5bd7b930b-kube-api-access-6j6vp\") pod \"kube-state-metrics-69db897b98-rldj4\" (UID: \"11ea47a9-7c97-4dce-8a6c-04d5bd7b930b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rldj4"
Apr 17 17:10:15.448296 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.448196 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsp2p\" (UniqueName: \"kubernetes.io/projected/117d903d-3f00-4f06-9bf0-fac54cc8d1c4-kube-api-access-hsp2p\") pod \"openshift-state-metrics-9d44df66c-rcls2\" (UID: \"117d903d-3f00-4f06-9bf0-fac54cc8d1c4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rcls2"
Apr 17 17:10:15.448296 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.448221 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11ea47a9-7c97-4dce-8a6c-04d5bd7b930b-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-rldj4\" (UID: \"11ea47a9-7c97-4dce-8a6c-04d5bd7b930b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rldj4"
Apr 17 17:10:15.448296 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.448255 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/11ea47a9-7c97-4dce-8a6c-04d5bd7b930b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-rldj4\" (UID: \"11ea47a9-7c97-4dce-8a6c-04d5bd7b930b\") "
pod="openshift-monitoring/kube-state-metrics-69db897b98-rldj4"
Apr 17 17:10:15.448296 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.448281 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9384a44b-c862-49d8-8220-6aa55e030cd5-node-exporter-tls\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj"
Apr 17 17:10:15.448504 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.448307 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9384a44b-c862-49d8-8220-6aa55e030cd5-root\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj"
Apr 17 17:10:15.448504 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.448345 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfmfk\" (UniqueName: \"kubernetes.io/projected/9384a44b-c862-49d8-8220-6aa55e030cd5-kube-api-access-vfmfk\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj"
Apr 17 17:10:15.448504 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.448378 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9384a44b-c862-49d8-8220-6aa55e030cd5-metrics-client-ca\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj"
Apr 17 17:10:15.448504 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.448398 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9384a44b-c862-49d8-8220-6aa55e030cd5-sys\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj"
Apr 17 17:10:15.448504 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.448422 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/11ea47a9-7c97-4dce-8a6c-04d5bd7b930b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-rldj4\" (UID: \"11ea47a9-7c97-4dce-8a6c-04d5bd7b930b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rldj4"
Apr 17 17:10:15.448504 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.448446 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9384a44b-c862-49d8-8220-6aa55e030cd5-node-exporter-wtmp\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj"
Apr 17 17:10:15.448504 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.448479 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/11ea47a9-7c97-4dce-8a6c-04d5bd7b930b-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-rldj4\" (UID: \"11ea47a9-7c97-4dce-8a6c-04d5bd7b930b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rldj4"
Apr 17 17:10:15.449437 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.448530 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/117d903d-3f00-4f06-9bf0-fac54cc8d1c4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-rcls2\" (UID: \"117d903d-3f00-4f06-9bf0-fac54cc8d1c4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rcls2"
Apr 17 17:10:15.449437 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.448555 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/117d903d-3f00-4f06-9bf0-fac54cc8d1c4-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-rcls2\" (UID: \"117d903d-3f00-4f06-9bf0-fac54cc8d1c4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rcls2"
Apr 17 17:10:15.449437 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.448601 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/117d903d-3f00-4f06-9bf0-fac54cc8d1c4-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-rcls2\" (UID: \"117d903d-3f00-4f06-9bf0-fac54cc8d1c4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rcls2"
Apr 17 17:10:15.449437 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.448658 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9384a44b-c862-49d8-8220-6aa55e030cd5-node-exporter-accelerators-collector-config\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj"
Apr 17 17:10:15.449437 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.448694 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/11ea47a9-7c97-4dce-8a6c-04d5bd7b930b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-rldj4\" (UID: \"11ea47a9-7c97-4dce-8a6c-04d5bd7b930b\") "
pod="openshift-monitoring/kube-state-metrics-69db897b98-rldj4"
Apr 17 17:10:15.549115 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.549083 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9384a44b-c862-49d8-8220-6aa55e030cd5-root\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj"
Apr 17 17:10:15.549300 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.549135 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vfmfk\" (UniqueName: \"kubernetes.io/projected/9384a44b-c862-49d8-8220-6aa55e030cd5-kube-api-access-vfmfk\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj"
Apr 17 17:10:15.549300 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.549170 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9384a44b-c862-49d8-8220-6aa55e030cd5-metrics-client-ca\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj"
Apr 17 17:10:15.549300 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.549192 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9384a44b-c862-49d8-8220-6aa55e030cd5-sys\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj"
Apr 17 17:10:15.549300 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.549217 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/11ea47a9-7c97-4dce-8a6c-04d5bd7b930b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-rldj4\" (UID: \"11ea47a9-7c97-4dce-8a6c-04d5bd7b930b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rldj4"
Apr 17 17:10:15.549300 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.549242 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9384a44b-c862-49d8-8220-6aa55e030cd5-node-exporter-wtmp\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj"
Apr 17 17:10:15.549300 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.549276 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/11ea47a9-7c97-4dce-8a6c-04d5bd7b930b-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-rldj4\" (UID: \"11ea47a9-7c97-4dce-8a6c-04d5bd7b930b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rldj4"
Apr 17 17:10:15.549554 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.549190 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/9384a44b-c862-49d8-8220-6aa55e030cd5-root\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj"
Apr 17 17:10:15.549634 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.549591 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/117d903d-3f00-4f06-9bf0-fac54cc8d1c4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-rcls2\" (UID: \"117d903d-3f00-4f06-9bf0-fac54cc8d1c4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rcls2"
Apr 17 17:10:15.549694 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.549644 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/117d903d-3f00-4f06-9bf0-fac54cc8d1c4-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-rcls2\" (UID: \"117d903d-3f00-4f06-9bf0-fac54cc8d1c4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rcls2"
Apr 17 17:10:15.549694 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.549671 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/117d903d-3f00-4f06-9bf0-fac54cc8d1c4-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-rcls2\" (UID: \"117d903d-3f00-4f06-9bf0-fac54cc8d1c4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rcls2"
Apr 17 17:10:15.549781 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.549700 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9384a44b-c862-49d8-8220-6aa55e030cd5-node-exporter-accelerators-collector-config\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj"
Apr 17 17:10:15.549781 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.549712 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/9384a44b-c862-49d8-8220-6aa55e030cd5-node-exporter-wtmp\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj"
Apr 17 17:10:15.549781 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.549738 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/11ea47a9-7c97-4dce-8a6c-04d5bd7b930b-kube-state-metrics-kube-rbac-proxy-config\") pod
\"kube-state-metrics-69db897b98-rldj4\" (UID: \"11ea47a9-7c97-4dce-8a6c-04d5bd7b930b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rldj4" Apr 17 17:10:15.549781 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.549772 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9384a44b-c862-49d8-8220-6aa55e030cd5-node-exporter-textfile\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj" Apr 17 17:10:15.549963 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.549800 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9384a44b-c862-49d8-8220-6aa55e030cd5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj" Apr 17 17:10:15.549963 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:10:15.549806 2574 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 17 17:10:15.549963 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.549842 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6j6vp\" (UniqueName: \"kubernetes.io/projected/11ea47a9-7c97-4dce-8a6c-04d5bd7b930b-kube-api-access-6j6vp\") pod \"kube-state-metrics-69db897b98-rldj4\" (UID: \"11ea47a9-7c97-4dce-8a6c-04d5bd7b930b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rldj4" Apr 17 17:10:15.549963 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:10:15.549879 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/11ea47a9-7c97-4dce-8a6c-04d5bd7b930b-kube-state-metrics-tls podName:11ea47a9-7c97-4dce-8a6c-04d5bd7b930b nodeName:}" failed. 
No retries permitted until 2026-04-17 17:10:16.049858745 +0000 UTC m=+154.122445921 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/11ea47a9-7c97-4dce-8a6c-04d5bd7b930b-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-rldj4" (UID: "11ea47a9-7c97-4dce-8a6c-04d5bd7b930b") : secret "kube-state-metrics-tls" not found Apr 17 17:10:15.549963 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.549905 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hsp2p\" (UniqueName: \"kubernetes.io/projected/117d903d-3f00-4f06-9bf0-fac54cc8d1c4-kube-api-access-hsp2p\") pod \"openshift-state-metrics-9d44df66c-rcls2\" (UID: \"117d903d-3f00-4f06-9bf0-fac54cc8d1c4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rcls2" Apr 17 17:10:15.549963 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.549938 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11ea47a9-7c97-4dce-8a6c-04d5bd7b930b-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-rldj4\" (UID: \"11ea47a9-7c97-4dce-8a6c-04d5bd7b930b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rldj4" Apr 17 17:10:15.550229 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.549972 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/11ea47a9-7c97-4dce-8a6c-04d5bd7b930b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-rldj4\" (UID: \"11ea47a9-7c97-4dce-8a6c-04d5bd7b930b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rldj4" Apr 17 17:10:15.550229 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.550004 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" 
(UniqueName: \"kubernetes.io/secret/9384a44b-c862-49d8-8220-6aa55e030cd5-node-exporter-tls\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj" Apr 17 17:10:15.550229 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.550097 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9384a44b-c862-49d8-8220-6aa55e030cd5-sys\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj" Apr 17 17:10:15.553196 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.550623 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9384a44b-c862-49d8-8220-6aa55e030cd5-metrics-client-ca\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj" Apr 17 17:10:15.553196 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.550981 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/9384a44b-c862-49d8-8220-6aa55e030cd5-node-exporter-accelerators-collector-config\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj" Apr 17 17:10:15.553196 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.551157 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/11ea47a9-7c97-4dce-8a6c-04d5bd7b930b-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-rldj4\" (UID: \"11ea47a9-7c97-4dce-8a6c-04d5bd7b930b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rldj4" Apr 17 17:10:15.553196 ip-10-0-132-98 kubenswrapper[2574]: I0417 
17:10:15.551235 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/11ea47a9-7c97-4dce-8a6c-04d5bd7b930b-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-rldj4\" (UID: \"11ea47a9-7c97-4dce-8a6c-04d5bd7b930b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rldj4" Apr 17 17:10:15.553196 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.551428 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/9384a44b-c862-49d8-8220-6aa55e030cd5-node-exporter-textfile\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj" Apr 17 17:10:15.553196 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:10:15.551590 2574 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 17:10:15.553196 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:10:15.551670 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9384a44b-c862-49d8-8220-6aa55e030cd5-node-exporter-tls podName:9384a44b-c862-49d8-8220-6aa55e030cd5 nodeName:}" failed. No retries permitted until 2026-04-17 17:10:16.051651347 +0000 UTC m=+154.124238522 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/9384a44b-c862-49d8-8220-6aa55e030cd5-node-exporter-tls") pod "node-exporter-cgfvj" (UID: "9384a44b-c862-49d8-8220-6aa55e030cd5") : secret "node-exporter-tls" not found Apr 17 17:10:15.553196 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.552628 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/117d903d-3f00-4f06-9bf0-fac54cc8d1c4-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-rcls2\" (UID: \"117d903d-3f00-4f06-9bf0-fac54cc8d1c4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rcls2" Apr 17 17:10:15.553196 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.553155 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11ea47a9-7c97-4dce-8a6c-04d5bd7b930b-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-rldj4\" (UID: \"11ea47a9-7c97-4dce-8a6c-04d5bd7b930b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rldj4" Apr 17 17:10:15.554302 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.554001 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9384a44b-c862-49d8-8220-6aa55e030cd5-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj" Apr 17 17:10:15.554450 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.554427 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/117d903d-3f00-4f06-9bf0-fac54cc8d1c4-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-rcls2\" (UID: \"117d903d-3f00-4f06-9bf0-fac54cc8d1c4\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rcls2" Apr 17 17:10:15.554505 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.554455 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/11ea47a9-7c97-4dce-8a6c-04d5bd7b930b-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-rldj4\" (UID: \"11ea47a9-7c97-4dce-8a6c-04d5bd7b930b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rldj4" Apr 17 17:10:15.556079 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.556059 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/117d903d-3f00-4f06-9bf0-fac54cc8d1c4-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-rcls2\" (UID: \"117d903d-3f00-4f06-9bf0-fac54cc8d1c4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rcls2" Apr 17 17:10:15.562016 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.561974 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6j6vp\" (UniqueName: \"kubernetes.io/projected/11ea47a9-7c97-4dce-8a6c-04d5bd7b930b-kube-api-access-6j6vp\") pod \"kube-state-metrics-69db897b98-rldj4\" (UID: \"11ea47a9-7c97-4dce-8a6c-04d5bd7b930b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rldj4" Apr 17 17:10:15.562301 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.562278 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfmfk\" (UniqueName: \"kubernetes.io/projected/9384a44b-c862-49d8-8220-6aa55e030cd5-kube-api-access-vfmfk\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj" Apr 17 17:10:15.564951 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.564931 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-hsp2p\" (UniqueName: \"kubernetes.io/projected/117d903d-3f00-4f06-9bf0-fac54cc8d1c4-kube-api-access-hsp2p\") pod \"openshift-state-metrics-9d44df66c-rcls2\" (UID: \"117d903d-3f00-4f06-9bf0-fac54cc8d1c4\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rcls2" Apr 17 17:10:15.658231 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.658156 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rcls2" Apr 17 17:10:15.809818 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.809781 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-rcls2"] Apr 17 17:10:15.813171 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:10:15.813143 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod117d903d_3f00_4f06_9bf0_fac54cc8d1c4.slice/crio-9266f00fe1eb5d417921efaa7931ec9fbdf3c6c55d2581d4c9891fccec6e79bc WatchSource:0}: Error finding container 9266f00fe1eb5d417921efaa7931ec9fbdf3c6c55d2581d4c9891fccec6e79bc: Status 404 returned error can't find the container with id 9266f00fe1eb5d417921efaa7931ec9fbdf3c6c55d2581d4c9891fccec6e79bc Apr 17 17:10:15.894507 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.894421 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rcls2" event={"ID":"117d903d-3f00-4f06-9bf0-fac54cc8d1c4","Type":"ContainerStarted","Data":"06b732f220c9727ef89ce9cc3516d5ae48018b7c8f1a71371244af55f2019e37"} Apr 17 17:10:15.894507 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:15.894459 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rcls2" event={"ID":"117d903d-3f00-4f06-9bf0-fac54cc8d1c4","Type":"ContainerStarted","Data":"9266f00fe1eb5d417921efaa7931ec9fbdf3c6c55d2581d4c9891fccec6e79bc"} Apr 17 
17:10:16.056460 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.056428 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9384a44b-c862-49d8-8220-6aa55e030cd5-node-exporter-tls\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj" Apr 17 17:10:16.056663 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.056482 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/11ea47a9-7c97-4dce-8a6c-04d5bd7b930b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-rldj4\" (UID: \"11ea47a9-7c97-4dce-8a6c-04d5bd7b930b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rldj4" Apr 17 17:10:16.059203 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.059173 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/9384a44b-c862-49d8-8220-6aa55e030cd5-node-exporter-tls\") pod \"node-exporter-cgfvj\" (UID: \"9384a44b-c862-49d8-8220-6aa55e030cd5\") " pod="openshift-monitoring/node-exporter-cgfvj" Apr 17 17:10:16.059309 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.059206 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/11ea47a9-7c97-4dce-8a6c-04d5bd7b930b-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-rldj4\" (UID: \"11ea47a9-7c97-4dce-8a6c-04d5bd7b930b\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-rldj4" Apr 17 17:10:16.290131 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.289984 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-rldj4" Apr 17 17:10:16.295944 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.295908 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-cgfvj" Apr 17 17:10:16.423590 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.423522 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:10:16.428981 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.428953 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:10:16.429750 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.429193 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-rldj4"] Apr 17 17:10:16.431582 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.431560 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-2r8lw\"" Apr 17 17:10:16.431689 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.431641 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 17:10:16.432080 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.431567 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 17:10:16.432080 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.431921 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 17:10:16.432887 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.432359 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 17:10:16.432887 
ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.432372 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 17:10:16.432887 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.432400 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 17:10:16.432887 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.432639 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 17:10:16.432887 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.432721 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 17:10:16.432887 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.432784 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 17:10:16.433397 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:10:16.433372 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11ea47a9_7c97_4dce_8a6c_04d5bd7b930b.slice/crio-c7d004fd43cbf7149f5c2a67fd85193011cd8ab83fe05870d4fa235490aa32bb WatchSource:0}: Error finding container c7d004fd43cbf7149f5c2a67fd85193011cd8ab83fe05870d4fa235490aa32bb: Status 404 returned error can't find the container with id c7d004fd43cbf7149f5c2a67fd85193011cd8ab83fe05870d4fa235490aa32bb Apr 17 17:10:16.447068 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.447042 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:10:16.561384 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.561282 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:10:16.561384 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.561339 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:10:16.561384 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.561370 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-web-config\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:10:16.561642 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.561420 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-config-volume\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:10:16.561642 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.561438 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5sgbp\" (UniqueName: \"kubernetes.io/projected/2197f45c-d49b-4455-816f-9fb39452dd65-kube-api-access-5sgbp\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0" 
Apr 17 17:10:16.561642 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.561458 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:10:16.561642 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.561530 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2197f45c-d49b-4455-816f-9fb39452dd65-config-out\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:10:16.561642 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.561569 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2197f45c-d49b-4455-816f-9fb39452dd65-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:10:16.561642 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.561589 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2197f45c-d49b-4455-816f-9fb39452dd65-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:10:16.561946 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.561657 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:10:16.561946 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.561713 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2197f45c-d49b-4455-816f-9fb39452dd65-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:10:16.561946 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.561747 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:10:16.561946 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.561774 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2197f45c-d49b-4455-816f-9fb39452dd65-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:10:16.662544 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.662510 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-config-volume\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:10:16.662786 ip-10-0-132-98 kubenswrapper[2574]: I0417 
17:10:16.662557 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5sgbp\" (UniqueName: \"kubernetes.io/projected/2197f45c-d49b-4455-816f-9fb39452dd65-kube-api-access-5sgbp\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:10:16.662786 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.662593 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:10:16.662786 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.662672 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2197f45c-d49b-4455-816f-9fb39452dd65-config-out\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:10:16.662786 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.662718 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2197f45c-d49b-4455-816f-9fb39452dd65-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:10:16.662786 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.662743 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2197f45c-d49b-4455-816f-9fb39452dd65-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:10:16.662786 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.662764 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:10:16.663089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.662790 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2197f45c-d49b-4455-816f-9fb39452dd65-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:10:16.663089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.662815 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:10:16.663089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.662841 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2197f45c-d49b-4455-816f-9fb39452dd65-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:10:16.663089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.662883 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:10:16.663089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.662919 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:10:16.663089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.662941 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-web-config\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:10:16.664066 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.664015 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2197f45c-d49b-4455-816f-9fb39452dd65-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:10:16.664637 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.664582 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2197f45c-d49b-4455-816f-9fb39452dd65-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:10:16.664834 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.664783 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2197f45c-d49b-4455-816f-9fb39452dd65-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:10:16.668105 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.668063 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2197f45c-d49b-4455-816f-9fb39452dd65-tls-assets\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:10:16.668690 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.668463 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:10:16.670127 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.669141 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:10:16.670127 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.669534 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-web-config\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:10:16.670127 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.669928 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:10:16.670127 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.670054 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:10:16.670354 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.670224 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:10:16.670423 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.670400 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2197f45c-d49b-4455-816f-9fb39452dd65-config-out\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:10:16.670650 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.670624 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-config-volume\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:10:16.672966 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.672924 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5sgbp\" (UniqueName: \"kubernetes.io/projected/2197f45c-d49b-4455-816f-9fb39452dd65-kube-api-access-5sgbp\") pod \"alertmanager-main-0\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:10:16.745734 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.745508 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 17 17:10:16.902204 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.902153 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 17 17:10:16.904191 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.904141 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rcls2" event={"ID":"117d903d-3f00-4f06-9bf0-fac54cc8d1c4","Type":"ContainerStarted","Data":"e03cd5d1a687a302d83f87a56858f5950d251762619334a266bbe303b4bdd8be"}
Apr 17 17:10:16.906321 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.906300 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-rldj4" event={"ID":"11ea47a9-7c97-4dce-8a6c-04d5bd7b930b","Type":"ContainerStarted","Data":"c7d004fd43cbf7149f5c2a67fd85193011cd8ab83fe05870d4fa235490aa32bb"}
Apr 17 17:10:16.907841 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:16.907817 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cgfvj" event={"ID":"9384a44b-c862-49d8-8220-6aa55e030cd5","Type":"ContainerStarted","Data":"cfd7a044c820697fe1ce4e275b8f5d175c49285d77b1d9e2a86455248612e2e5"}
Apr 17 17:10:17.082513 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:10:17.082481 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2197f45c_d49b_4455_816f_9fb39452dd65.slice/crio-63ac469d9718c5cae85bd3c31c5e2ba89dddc1ee942a3dd62ff6dfa51fee4c94 WatchSource:0}: Error finding container 63ac469d9718c5cae85bd3c31c5e2ba89dddc1ee942a3dd62ff6dfa51fee4c94: Status 404 returned error can't find the container with id 63ac469d9718c5cae85bd3c31c5e2ba89dddc1ee942a3dd62ff6dfa51fee4c94
Apr 17 17:10:17.423231 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.423195 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-85d87b4479-nqbnz"]
Apr 17 17:10:17.427639 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.427595 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz"
Apr 17 17:10:17.432386 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.432192 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 17 17:10:17.432386 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.432280 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 17 17:10:17.432804 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.432781 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 17 17:10:17.432898 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.432847 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-bq48cloij614j\""
Apr 17 17:10:17.433074 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.433056 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 17 17:10:17.433322 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.433299 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-9kd7m\""
Apr 17 17:10:17.433579 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.433258 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 17 17:10:17.440833 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.440808 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-85d87b4479-nqbnz"]
Apr 17 17:10:17.571318 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.571243 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-85d87b4479-nqbnz\" (UID: \"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd\") " pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz"
Apr 17 17:10:17.571318 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.571292 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-85d87b4479-nqbnz\" (UID: \"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd\") " pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz"
Apr 17 17:10:17.571496 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.571345 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-85d87b4479-nqbnz\" (UID: \"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd\") " pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz"
Apr 17 17:10:17.571496 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.571391 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd-secret-thanos-querier-tls\") pod \"thanos-querier-85d87b4479-nqbnz\" (UID: \"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd\") " pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz"
Apr 17 17:10:17.571496 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.571417 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd-secret-grpc-tls\") pod \"thanos-querier-85d87b4479-nqbnz\" (UID: \"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd\") " pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz"
Apr 17 17:10:17.571496 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.571452 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-85d87b4479-nqbnz\" (UID: \"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd\") " pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz"
Apr 17 17:10:17.571496 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.571495 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd-metrics-client-ca\") pod \"thanos-querier-85d87b4479-nqbnz\" (UID: \"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd\") " pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz"
Apr 17 17:10:17.571726 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.571516 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nnk9t\" (UniqueName: \"kubernetes.io/projected/d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd-kube-api-access-nnk9t\") pod \"thanos-querier-85d87b4479-nqbnz\" (UID: \"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd\") " pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz"
Apr 17 17:10:17.672965 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.672915 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd-metrics-client-ca\") pod \"thanos-querier-85d87b4479-nqbnz\" (UID: \"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd\") " pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz"
Apr 17 17:10:17.673122 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.672977 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nnk9t\" (UniqueName: \"kubernetes.io/projected/d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd-kube-api-access-nnk9t\") pod \"thanos-querier-85d87b4479-nqbnz\" (UID: \"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd\") " pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz"
Apr 17 17:10:17.673122 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.673076 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-85d87b4479-nqbnz\" (UID: \"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd\") " pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz"
Apr 17 17:10:17.673122 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.673106 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-85d87b4479-nqbnz\" (UID: \"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd\") " pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz"
Apr 17 17:10:17.673290 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.673141 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-85d87b4479-nqbnz\" (UID: \"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd\") " pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz"
Apr 17 17:10:17.673290 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.673180 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd-secret-thanos-querier-tls\") pod \"thanos-querier-85d87b4479-nqbnz\" (UID: \"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd\") " pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz"
Apr 17 17:10:17.673290 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.673212 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd-secret-grpc-tls\") pod \"thanos-querier-85d87b4479-nqbnz\" (UID: \"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd\") " pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz"
Apr 17 17:10:17.673290 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.673254 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-85d87b4479-nqbnz\" (UID: \"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd\") " pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz"
Apr 17 17:10:17.673777 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.673725 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd-metrics-client-ca\") pod \"thanos-querier-85d87b4479-nqbnz\" (UID: \"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd\") " pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz"
Apr 17 17:10:17.677941 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.677180 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd-secret-grpc-tls\") pod \"thanos-querier-85d87b4479-nqbnz\" (UID: \"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd\") " pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz"
Apr 17 17:10:17.677941 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.677205 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-85d87b4479-nqbnz\" (UID: \"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd\") " pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz"
Apr 17 17:10:17.677941 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.677525 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-85d87b4479-nqbnz\" (UID: \"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd\") " pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz"
Apr 17 17:10:17.677941 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.677588 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-85d87b4479-nqbnz\" (UID: \"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd\") " pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz"
Apr 17 17:10:17.678208 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.677964 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-85d87b4479-nqbnz\" (UID: \"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd\") " pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz"
Apr 17 17:10:17.678862 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.678839 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd-secret-thanos-querier-tls\") pod \"thanos-querier-85d87b4479-nqbnz\" (UID: \"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd\") " pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz"
Apr 17 17:10:17.680927 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.680896 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nnk9t\" (UniqueName: \"kubernetes.io/projected/d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd-kube-api-access-nnk9t\") pod \"thanos-querier-85d87b4479-nqbnz\" (UID: \"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd\") " pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz"
Apr 17 17:10:17.740930 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.740889 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz"
Apr 17 17:10:17.879235 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:10:17.879146 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-m492r" podUID="5ec73ca8-f486-4ad3-b44e-4e1b5815d3de"
Apr 17 17:10:17.910617 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:10:17.910557 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-n4m6l" podUID="05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b"
Apr 17 17:10:17.912062 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.912022 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cgfvj" event={"ID":"9384a44b-c862-49d8-8220-6aa55e030cd5","Type":"ContainerStarted","Data":"7194ab1bb2cae4510b4e4eebd1319af40cb3489ea50cfb49056c3ae9a53b724a"}
Apr 17 17:10:17.913428 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.913396 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2197f45c-d49b-4455-816f-9fb39452dd65","Type":"ContainerStarted","Data":"63ac469d9718c5cae85bd3c31c5e2ba89dddc1ee942a3dd62ff6dfa51fee4c94"}
Apr 17 17:10:17.915462 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.915437 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rcls2" event={"ID":"117d903d-3f00-4f06-9bf0-fac54cc8d1c4","Type":"ContainerStarted","Data":"7e4fd0dd1525c27e3b275e983240b0dc54736443d0857d34ecf2668ad3826fb5"}
Apr 17 17:10:17.915462 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.915455 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-m492r"
Apr 17 17:10:17.947313 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:17.947263 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-rcls2" podStartSLOduration=1.605463168 podStartE2EDuration="2.947248802s" podCreationTimestamp="2026-04-17 17:10:15 +0000 UTC" firstStartedPulling="2026-04-17 17:10:15.960955668 +0000 UTC m=+154.033542856" lastFinishedPulling="2026-04-17 17:10:17.302741302 +0000 UTC m=+155.375328490" observedRunningTime="2026-04-17 17:10:17.94557644 +0000 UTC m=+156.018163634" watchObservedRunningTime="2026-04-17 17:10:17.947248802 +0000 UTC m=+156.019835995"
Apr 17 17:10:19.745457 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:19.745410 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6c68877994-q8bps"]
Apr 17 17:10:19.748988 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:19.748965 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6c68877994-q8bps"
Apr 17 17:10:19.752683 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:19.752650 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-btnqg\""
Apr 17 17:10:19.752784 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:19.752686 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-a0ucr35fjrghs\""
Apr 17 17:10:19.752784 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:19.752699 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 17 17:10:19.752784 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:19.752650 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 17 17:10:19.752784 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:19.752659 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 17 17:10:19.753006 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:19.752701 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 17 17:10:19.756544 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:19.756522 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6c68877994-q8bps"]
Apr 17 17:10:19.898680 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:19.898636 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/4465ec50-e774-47fa-a778-c5e02c74b6ec-secret-metrics-server-client-certs\") pod \"metrics-server-6c68877994-q8bps\" (UID: \"4465ec50-e774-47fa-a778-c5e02c74b6ec\") " pod="openshift-monitoring/metrics-server-6c68877994-q8bps"
Apr 17 17:10:19.898872 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:19.898750 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4465ec50-e774-47fa-a778-c5e02c74b6ec-client-ca-bundle\") pod \"metrics-server-6c68877994-q8bps\" (UID: \"4465ec50-e774-47fa-a778-c5e02c74b6ec\") " pod="openshift-monitoring/metrics-server-6c68877994-q8bps"
Apr 17 17:10:19.898872 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:19.898788 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4465ec50-e774-47fa-a778-c5e02c74b6ec-audit-log\") pod \"metrics-server-6c68877994-q8bps\" (UID: \"4465ec50-e774-47fa-a778-c5e02c74b6ec\") " pod="openshift-monitoring/metrics-server-6c68877994-q8bps"
Apr 17 17:10:19.898872 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:19.898821 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4465ec50-e774-47fa-a778-c5e02c74b6ec-metrics-server-audit-profiles\") pod \"metrics-server-6c68877994-q8bps\" (UID: \"4465ec50-e774-47fa-a778-c5e02c74b6ec\") " pod="openshift-monitoring/metrics-server-6c68877994-q8bps"
Apr 17 17:10:19.899038 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:19.898916 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgzlf\" (UniqueName: \"kubernetes.io/projected/4465ec50-e774-47fa-a778-c5e02c74b6ec-kube-api-access-kgzlf\") pod \"metrics-server-6c68877994-q8bps\" (UID: \"4465ec50-e774-47fa-a778-c5e02c74b6ec\") " pod="openshift-monitoring/metrics-server-6c68877994-q8bps"
Apr 17 17:10:19.899038 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:19.898966 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4465ec50-e774-47fa-a778-c5e02c74b6ec-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6c68877994-q8bps\" (UID: \"4465ec50-e774-47fa-a778-c5e02c74b6ec\") " pod="openshift-monitoring/metrics-server-6c68877994-q8bps"
Apr 17 17:10:19.899038 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:19.899017 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4465ec50-e774-47fa-a778-c5e02c74b6ec-secret-metrics-server-tls\") pod \"metrics-server-6c68877994-q8bps\" (UID: \"4465ec50-e774-47fa-a778-c5e02c74b6ec\") " pod="openshift-monitoring/metrics-server-6c68877994-q8bps"
Apr 17 17:10:20.000582 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.000491 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4465ec50-e774-47fa-a778-c5e02c74b6ec-audit-log\") pod \"metrics-server-6c68877994-q8bps\" (UID: \"4465ec50-e774-47fa-a778-c5e02c74b6ec\") " pod="openshift-monitoring/metrics-server-6c68877994-q8bps"
Apr 17 17:10:20.000810 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.000538 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4465ec50-e774-47fa-a778-c5e02c74b6ec-metrics-server-audit-profiles\") pod \"metrics-server-6c68877994-q8bps\" (UID: \"4465ec50-e774-47fa-a778-c5e02c74b6ec\") " pod="openshift-monitoring/metrics-server-6c68877994-q8bps"
Apr 17 17:10:20.000964 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.000889 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgzlf\" (UniqueName: \"kubernetes.io/projected/4465ec50-e774-47fa-a778-c5e02c74b6ec-kube-api-access-kgzlf\") pod \"metrics-server-6c68877994-q8bps\" (UID: \"4465ec50-e774-47fa-a778-c5e02c74b6ec\") " pod="openshift-monitoring/metrics-server-6c68877994-q8bps"
Apr 17 17:10:20.000964 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.000940 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4465ec50-e774-47fa-a778-c5e02c74b6ec-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6c68877994-q8bps\" (UID: \"4465ec50-e774-47fa-a778-c5e02c74b6ec\") " pod="openshift-monitoring/metrics-server-6c68877994-q8bps"
Apr 17 17:10:20.001069 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.000988 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4465ec50-e774-47fa-a778-c5e02c74b6ec-secret-metrics-server-tls\") pod \"metrics-server-6c68877994-q8bps\" (UID: \"4465ec50-e774-47fa-a778-c5e02c74b6ec\") " pod="openshift-monitoring/metrics-server-6c68877994-q8bps"
Apr 17 17:10:20.001069 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.001032 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/4465ec50-e774-47fa-a778-c5e02c74b6ec-secret-metrics-server-client-certs\") pod \"metrics-server-6c68877994-q8bps\" (UID: \"4465ec50-e774-47fa-a778-c5e02c74b6ec\") " pod="openshift-monitoring/metrics-server-6c68877994-q8bps"
Apr 17 17:10:20.001069 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.001039 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/4465ec50-e774-47fa-a778-c5e02c74b6ec-audit-log\") pod \"metrics-server-6c68877994-q8bps\" (UID: \"4465ec50-e774-47fa-a778-c5e02c74b6ec\") " pod="openshift-monitoring/metrics-server-6c68877994-q8bps"
Apr 17 17:10:20.001199 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.001133 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4465ec50-e774-47fa-a778-c5e02c74b6ec-client-ca-bundle\") pod \"metrics-server-6c68877994-q8bps\" (UID: \"4465ec50-e774-47fa-a778-c5e02c74b6ec\") " pod="openshift-monitoring/metrics-server-6c68877994-q8bps"
Apr 17 17:10:20.001558 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.001536 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/4465ec50-e774-47fa-a778-c5e02c74b6ec-metrics-server-audit-profiles\") pod \"metrics-server-6c68877994-q8bps\" (UID: \"4465ec50-e774-47fa-a778-c5e02c74b6ec\") " pod="openshift-monitoring/metrics-server-6c68877994-q8bps"
Apr 17 17:10:20.001903 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.001857 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4465ec50-e774-47fa-a778-c5e02c74b6ec-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6c68877994-q8bps\" (UID: \"4465ec50-e774-47fa-a778-c5e02c74b6ec\") " pod="openshift-monitoring/metrics-server-6c68877994-q8bps"
Apr 17 17:10:20.004152 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.004133 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4465ec50-e774-47fa-a778-c5e02c74b6ec-client-ca-bundle\") pod \"metrics-server-6c68877994-q8bps\" (UID: \"4465ec50-e774-47fa-a778-c5e02c74b6ec\") " pod="openshift-monitoring/metrics-server-6c68877994-q8bps"
Apr 17 17:10:20.004285 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.004268 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/4465ec50-e774-47fa-a778-c5e02c74b6ec-secret-metrics-server-client-certs\") pod \"metrics-server-6c68877994-q8bps\" (UID: \"4465ec50-e774-47fa-a778-c5e02c74b6ec\") " pod="openshift-monitoring/metrics-server-6c68877994-q8bps"
Apr 17 17:10:20.004409 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.004384 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/4465ec50-e774-47fa-a778-c5e02c74b6ec-secret-metrics-server-tls\") pod \"metrics-server-6c68877994-q8bps\" (UID: \"4465ec50-e774-47fa-a778-c5e02c74b6ec\") " pod="openshift-monitoring/metrics-server-6c68877994-q8bps"
Apr 17 17:10:20.009397 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.009366 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgzlf\" (UniqueName: \"kubernetes.io/projected/4465ec50-e774-47fa-a778-c5e02c74b6ec-kube-api-access-kgzlf\") pod \"metrics-server-6c68877994-q8bps\" (UID: \"4465ec50-e774-47fa-a778-c5e02c74b6ec\") " pod="openshift-monitoring/metrics-server-6c68877994-q8bps"
Apr 17 17:10:20.047128 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.047100 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-9585658b7-qhzc4"
Apr 17 17:10:20.047227 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.047146 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-9585658b7-qhzc4"
Apr 17 17:10:20.048710 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.048687 2574 patch_prober.go:28] interesting pod/console-9585658b7-qhzc4 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.133.0.15:8443/health\": dial tcp 10.133.0.15:8443: connect: connection refused" start-of-body=
Apr 17 17:10:20.048811 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.048744 2574 prober.go:120] "Probe failed" 
probeType="Startup" pod="openshift-console/console-9585658b7-qhzc4" podUID="dd0c121c-18b4-4637-ab1a-edd22fb56a23" containerName="console" probeResult="failure" output="Get \"https://10.133.0.15:8443/health\": dial tcp 10.133.0.15:8443: connect: connection refused" Apr 17 17:10:20.060740 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.060686 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6c68877994-q8bps" Apr 17 17:10:20.127121 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.127093 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-sccc2"] Apr 17 17:10:20.131620 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.131578 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sccc2" Apr 17 17:10:20.134279 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.134255 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-rkmgw\"" Apr 17 17:10:20.134400 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.134258 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 17 17:10:20.136953 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.136921 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-sccc2"] Apr 17 17:10:20.203636 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.203582 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5dad46c4-fb79-4503-9293-2ff17aa1410d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-sccc2\" (UID: \"5dad46c4-fb79-4503-9293-2ff17aa1410d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sccc2" Apr 17 
17:10:20.304648 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.304548 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5dad46c4-fb79-4503-9293-2ff17aa1410d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-sccc2\" (UID: \"5dad46c4-fb79-4503-9293-2ff17aa1410d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sccc2" Apr 17 17:10:20.304825 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:10:20.304730 2574 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: secret "monitoring-plugin-cert" not found Apr 17 17:10:20.304825 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:10:20.304807 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5dad46c4-fb79-4503-9293-2ff17aa1410d-monitoring-plugin-cert podName:5dad46c4-fb79-4503-9293-2ff17aa1410d nodeName:}" failed. No retries permitted until 2026-04-17 17:10:20.80478919 +0000 UTC m=+158.877376364 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/5dad46c4-fb79-4503-9293-2ff17aa1410d-monitoring-plugin-cert") pod "monitoring-plugin-7dccd58f55-sccc2" (UID: "5dad46c4-fb79-4503-9293-2ff17aa1410d") : secret "monitoring-plugin-cert" not found Apr 17 17:10:20.810422 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.810382 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5dad46c4-fb79-4503-9293-2ff17aa1410d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-sccc2\" (UID: \"5dad46c4-fb79-4503-9293-2ff17aa1410d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sccc2" Apr 17 17:10:20.813237 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:20.813212 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5dad46c4-fb79-4503-9293-2ff17aa1410d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-sccc2\" (UID: \"5dad46c4-fb79-4503-9293-2ff17aa1410d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sccc2" Apr 17 17:10:21.044244 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.044205 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sccc2" Apr 17 17:10:21.551226 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.551199 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:10:21.555698 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.555670 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.558202 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.558177 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 17:10:21.558549 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.558531 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 17:10:21.559432 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.559410 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 17:10:21.559536 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.559468 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 17:10:21.559769 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.559753 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 17:10:21.559900 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.559878 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 17:10:21.559997 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.559970 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 17:10:21.559997 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.559813 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 17:10:21.560106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.559821 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 17:10:21.560106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.559821 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-6tftur1in7ftc\"" Apr 17 17:10:21.560106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.559861 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 17:10:21.560106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.559848 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 17:10:21.560265 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.559803 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-rs8sk\"" Apr 17 17:10:21.561659 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.561639 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 17:10:21.567784 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.567765 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:10:21.719997 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.719958 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.720170 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.720003 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-xznl7\" (UniqueName: \"kubernetes.io/projected/36d3d403-fa9a-4679-afd9-a6aee6c4b292-kube-api-access-xznl7\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.720170 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.720037 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.720170 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.720127 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.720170 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.720166 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.720347 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.720189 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 
17 17:10:21.720347 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.720226 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.720347 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.720298 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.720347 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.720340 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.720477 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.720389 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36d3d403-fa9a-4679-afd9-a6aee6c4b292-config-out\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.720477 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.720423 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.720477 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.720446 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/36d3d403-fa9a-4679-afd9-a6aee6c4b292-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.720583 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.720504 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.720583 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.720536 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.720583 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.720563 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-config\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.720721 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.720674 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36d3d403-fa9a-4679-afd9-a6aee6c4b292-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.720721 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.720714 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-web-config\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.720799 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.720742 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.821778 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.821691 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.821778 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.821744 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/36d3d403-fa9a-4679-afd9-a6aee6c4b292-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.821778 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.821778 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.822289 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.821797 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.822289 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.821816 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-config\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.822289 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.821873 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36d3d403-fa9a-4679-afd9-a6aee6c4b292-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.822289 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.821900 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-web-config\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.822289 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.821924 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.822289 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.821984 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.822289 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.822005 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xznl7\" (UniqueName: \"kubernetes.io/projected/36d3d403-fa9a-4679-afd9-a6aee6c4b292-kube-api-access-xznl7\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.822289 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.822035 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.822289 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.822071 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.822289 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.822100 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.822289 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.822127 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.822289 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.822157 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.822289 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.822186 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.822289 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.822209 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.822289 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.822245 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36d3d403-fa9a-4679-afd9-a6aee6c4b292-config-out\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.823197 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.823035 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/36d3d403-fa9a-4679-afd9-a6aee6c4b292-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.823743 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.823716 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.824978 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.824862 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.824978 ip-10-0-132-98 kubenswrapper[2574]: 
I0417 17:10:21.824884 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-web-config\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.825127 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.825092 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.825275 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.825250 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.826137 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.825853 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36d3d403-fa9a-4679-afd9-a6aee6c4b292-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.826137 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.826096 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.827022 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.826986 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.827328 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.827251 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.827461 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.827437 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.827845 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.827805 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.828165 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.828144 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36d3d403-fa9a-4679-afd9-a6aee6c4b292-config-out\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 17 17:10:21.828238 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.828154 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.828421 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.828401 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.828876 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.828859 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.829493 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.829474 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-config\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:21.832590 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.832567 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xznl7\" (UniqueName: \"kubernetes.io/projected/36d3d403-fa9a-4679-afd9-a6aee6c4b292-kube-api-access-xznl7\") pod \"prometheus-k8s-0\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 
17 17:10:21.868588 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:21.868559 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:22.731299 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:22.731258 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-metrics-tls\") pod \"dns-default-m492r\" (UID: \"5ec73ca8-f486-4ad3-b44e-4e1b5815d3de\") " pod="openshift-dns/dns-default-m492r" Apr 17 17:10:22.734471 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:22.734448 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5ec73ca8-f486-4ad3-b44e-4e1b5815d3de-metrics-tls\") pod \"dns-default-m492r\" (UID: \"5ec73ca8-f486-4ad3-b44e-4e1b5815d3de\") " pod="openshift-dns/dns-default-m492r" Apr 17 17:10:22.831977 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:22.831898 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-cert\") pod \"ingress-canary-n4m6l\" (UID: \"05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b\") " pod="openshift-ingress-canary/ingress-canary-n4m6l" Apr 17 17:10:22.834616 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:22.834578 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b-cert\") pod \"ingress-canary-n4m6l\" (UID: \"05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b\") " pod="openshift-ingress-canary/ingress-canary-n4m6l" Apr 17 17:10:23.019100 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:23.019030 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qdqk7\"" Apr 17 17:10:23.027365 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:23.027341 2574 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-m492r" Apr 17 17:10:24.790587 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:24.785813 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-sccc2"] Apr 17 17:10:24.791205 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:10:24.791110 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dad46c4_fb79_4503_9293_2ff17aa1410d.slice/crio-555afd2a02eeb8e459f158697d6dd16aec4031993b0708b2089b103c14967bbc WatchSource:0}: Error finding container 555afd2a02eeb8e459f158697d6dd16aec4031993b0708b2089b103c14967bbc: Status 404 returned error can't find the container with id 555afd2a02eeb8e459f158697d6dd16aec4031993b0708b2089b103c14967bbc Apr 17 17:10:24.845818 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:24.845781 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-m492r"] Apr 17 17:10:24.857963 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:10:24.857918 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ec73ca8_f486_4ad3_b44e_4e1b5815d3de.slice/crio-16821c7cbc9ef5902850617c722b24762cce04804e608d936d708e759adda90e WatchSource:0}: Error finding container 16821c7cbc9ef5902850617c722b24762cce04804e608d936d708e759adda90e: Status 404 returned error can't find the container with id 16821c7cbc9ef5902850617c722b24762cce04804e608d936d708e759adda90e Apr 17 17:10:24.947919 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:24.947844 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m492r" event={"ID":"5ec73ca8-f486-4ad3-b44e-4e1b5815d3de","Type":"ContainerStarted","Data":"16821c7cbc9ef5902850617c722b24762cce04804e608d936d708e759adda90e"} Apr 17 17:10:24.954194 ip-10-0-132-98 kubenswrapper[2574]: I0417 
17:10:24.954019 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:10:24.959537 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:24.959441 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sccc2" event={"ID":"5dad46c4-fb79-4503-9293-2ff17aa1410d","Type":"ContainerStarted","Data":"555afd2a02eeb8e459f158697d6dd16aec4031993b0708b2089b103c14967bbc"} Apr 17 17:10:24.963657 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:24.963596 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-6mgfv" event={"ID":"998f90ca-500c-4933-b022-c7fdd401c2e3","Type":"ContainerStarted","Data":"e0a46815d915f906e6e56b4845972ed1e29b312ed1eb409611e2d819a31a859c"} Apr 17 17:10:24.964758 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:24.964698 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-6mgfv" Apr 17 17:10:24.971867 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:24.971464 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-rldj4" event={"ID":"11ea47a9-7c97-4dce-8a6c-04d5bd7b930b","Type":"ContainerStarted","Data":"78018ad128b2bec9bcbc9cb78b4773368eae91a09d5d1fe21a1bb81f22fc1059"} Apr 17 17:10:24.971867 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:24.971506 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-rldj4" event={"ID":"11ea47a9-7c97-4dce-8a6c-04d5bd7b930b","Type":"ContainerStarted","Data":"0bbbbc580b413a53d1120f38a6fcddcf54fde320c49ec989f5656f16b622ae63"} Apr 17 17:10:24.971867 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:24.971519 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-rldj4" 
event={"ID":"11ea47a9-7c97-4dce-8a6c-04d5bd7b930b","Type":"ContainerStarted","Data":"8f9f9cd3e585ffc5cddc3a96db738fa0e6fc844fab2bf0c30da3b384690570a6"} Apr 17 17:10:24.974363 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:24.974150 2574 generic.go:358] "Generic (PLEG): container finished" podID="9384a44b-c862-49d8-8220-6aa55e030cd5" containerID="7194ab1bb2cae4510b4e4eebd1319af40cb3489ea50cfb49056c3ae9a53b724a" exitCode=0 Apr 17 17:10:24.974363 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:24.974187 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cgfvj" event={"ID":"9384a44b-c862-49d8-8220-6aa55e030cd5","Type":"ContainerDied","Data":"7194ab1bb2cae4510b4e4eebd1319af40cb3489ea50cfb49056c3ae9a53b724a"} Apr 17 17:10:24.982123 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:24.982069 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-6mgfv" podStartSLOduration=1.3211724249999999 podStartE2EDuration="18.982051533s" podCreationTimestamp="2026-04-17 17:10:06 +0000 UTC" firstStartedPulling="2026-04-17 17:10:06.910769486 +0000 UTC m=+144.983356658" lastFinishedPulling="2026-04-17 17:10:24.571648577 +0000 UTC m=+162.644235766" observedRunningTime="2026-04-17 17:10:24.980692688 +0000 UTC m=+163.053279883" watchObservedRunningTime="2026-04-17 17:10:24.982051533 +0000 UTC m=+163.054638720" Apr 17 17:10:24.983713 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:24.983653 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-6mgfv" Apr 17 17:10:24.986445 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:24.986366 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6c68877994-q8bps"] Apr 17 17:10:24.989940 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:24.989894 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/thanos-querier-85d87b4479-nqbnz"] Apr 17 17:10:24.992845 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:10:24.992802 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4465ec50_e774_47fa_a778_c5e02c74b6ec.slice/crio-d0ef81285d253edc32a6ac40651198f03502eab6893d9fe5d53f0ff3ca1306f4 WatchSource:0}: Error finding container d0ef81285d253edc32a6ac40651198f03502eab6893d9fe5d53f0ff3ca1306f4: Status 404 returned error can't find the container with id d0ef81285d253edc32a6ac40651198f03502eab6893d9fe5d53f0ff3ca1306f4 Apr 17 17:10:24.997207 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:10:24.997186 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8d765b1_f40e_4d3e_8cc1_a854f9ad75cd.slice/crio-68d317f41eccd217dfaaac781be3866e6bb73bd7bd1f69a58ea3b8f735d864ff WatchSource:0}: Error finding container 68d317f41eccd217dfaaac781be3866e6bb73bd7bd1f69a58ea3b8f735d864ff: Status 404 returned error can't find the container with id 68d317f41eccd217dfaaac781be3866e6bb73bd7bd1f69a58ea3b8f735d864ff Apr 17 17:10:25.028794 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:25.028741 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-rldj4" podStartSLOduration=1.939379786 podStartE2EDuration="10.028719738s" podCreationTimestamp="2026-04-17 17:10:15 +0000 UTC" firstStartedPulling="2026-04-17 17:10:16.43514456 +0000 UTC m=+154.507731739" lastFinishedPulling="2026-04-17 17:10:24.524484502 +0000 UTC m=+162.597071691" observedRunningTime="2026-04-17 17:10:25.007521941 +0000 UTC m=+163.080109136" watchObservedRunningTime="2026-04-17 17:10:25.028719738 +0000 UTC m=+163.101306933" Apr 17 17:10:25.980878 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:25.980844 2574 generic.go:358] "Generic (PLEG): container finished" 
podID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerID="0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca" exitCode=0 Apr 17 17:10:25.981468 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:25.980936 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"36d3d403-fa9a-4679-afd9-a6aee6c4b292","Type":"ContainerDied","Data":"0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca"} Apr 17 17:10:25.981468 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:25.980975 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"36d3d403-fa9a-4679-afd9-a6aee6c4b292","Type":"ContainerStarted","Data":"55e84bfb53fb04dbd0970800db403c6a87806fdbcd77b2782472386e811ee3f3"} Apr 17 17:10:25.986909 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:25.983329 2574 generic.go:358] "Generic (PLEG): container finished" podID="2197f45c-d49b-4455-816f-9fb39452dd65" containerID="fff3396d097886f7243cae00f1d492c652740e64918f08ee66b98c3bd5b5af72" exitCode=0 Apr 17 17:10:25.986909 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:25.983435 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2197f45c-d49b-4455-816f-9fb39452dd65","Type":"ContainerDied","Data":"fff3396d097886f7243cae00f1d492c652740e64918f08ee66b98c3bd5b5af72"} Apr 17 17:10:25.988823 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:25.988777 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz" event={"ID":"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd","Type":"ContainerStarted","Data":"68d317f41eccd217dfaaac781be3866e6bb73bd7bd1f69a58ea3b8f735d864ff"} Apr 17 17:10:25.994141 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:25.993253 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cgfvj" 
event={"ID":"9384a44b-c862-49d8-8220-6aa55e030cd5","Type":"ContainerStarted","Data":"59b31e9480b0cdc8b7b3430e387cb6065ac155824eca432129a2b9a9d3823942"} Apr 17 17:10:25.994141 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:25.993285 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-cgfvj" event={"ID":"9384a44b-c862-49d8-8220-6aa55e030cd5","Type":"ContainerStarted","Data":"e9ae1046a9c76a1cc9f75aeb0cc3e28147f6a1d564cb1c73843b3874c86abbc9"} Apr 17 17:10:25.996493 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:25.996088 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6c68877994-q8bps" event={"ID":"4465ec50-e774-47fa-a778-c5e02c74b6ec","Type":"ContainerStarted","Data":"d0ef81285d253edc32a6ac40651198f03502eab6893d9fe5d53f0ff3ca1306f4"} Apr 17 17:10:26.063130 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:26.062189 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-cgfvj" podStartSLOduration=10.066421767 podStartE2EDuration="11.062159535s" podCreationTimestamp="2026-04-17 17:10:15 +0000 UTC" firstStartedPulling="2026-04-17 17:10:16.308684172 +0000 UTC m=+154.381271360" lastFinishedPulling="2026-04-17 17:10:17.304421942 +0000 UTC m=+155.377009128" observedRunningTime="2026-04-17 17:10:26.03347755 +0000 UTC m=+164.106064745" watchObservedRunningTime="2026-04-17 17:10:26.062159535 +0000 UTC m=+164.134746730" Apr 17 17:10:26.420639 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:26.420582 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:10:30.052592 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:30.052554 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-9585658b7-qhzc4" Apr 17 17:10:30.057072 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:30.057038 2574 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-9585658b7-qhzc4" Apr 17 17:10:31.427579 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:31.427371 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" podUID="4adf7a08-e39c-4ebd-9fb8-8989bbdd8960" containerName="registry" containerID="cri-o://ab25dd08e32bf7425be03400726dea4fe85531087258e3faec865d8fbb9e57d5" gracePeriod=30 Apr 17 17:10:31.564904 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:31.564870 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-9585658b7-qhzc4"] Apr 17 17:10:31.985464 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:31.985438 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:10:32.019066 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.016896 2574 generic.go:358] "Generic (PLEG): container finished" podID="4adf7a08-e39c-4ebd-9fb8-8989bbdd8960" containerID="ab25dd08e32bf7425be03400726dea4fe85531087258e3faec865d8fbb9e57d5" exitCode=0 Apr 17 17:10:32.019066 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.017084 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" Apr 17 17:10:32.019066 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.017121 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" event={"ID":"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960","Type":"ContainerDied","Data":"ab25dd08e32bf7425be03400726dea4fe85531087258e3faec865d8fbb9e57d5"} Apr 17 17:10:32.019066 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.017150 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5cd9d6769f-6448l" event={"ID":"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960","Type":"ContainerDied","Data":"96b88ef1dd02594ddbf1a7d1a3990b31efa6d236ab32f9567056fb85e2820a01"} Apr 17 17:10:32.019066 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.017170 2574 scope.go:117] "RemoveContainer" containerID="ab25dd08e32bf7425be03400726dea4fe85531087258e3faec865d8fbb9e57d5" Apr 17 17:10:32.042002 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.041647 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-installation-pull-secrets\") pod \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " Apr 17 17:10:32.042002 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.041711 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-ca-trust-extracted\") pod \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " Apr 17 17:10:32.042002 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.041745 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dzc84\" (UniqueName: 
\"kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-kube-api-access-dzc84\") pod \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " Apr 17 17:10:32.042002 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.041824 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-image-registry-private-configuration\") pod \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " Apr 17 17:10:32.042002 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.041854 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-trusted-ca\") pod \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " Apr 17 17:10:32.042002 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.041899 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-bound-sa-token\") pod \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " Apr 17 17:10:32.042002 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.041947 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-certificates\") pod \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " Apr 17 17:10:32.042384 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.042160 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-tls\") pod 
\"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\" (UID: \"4adf7a08-e39c-4ebd-9fb8-8989bbdd8960\") " Apr 17 17:10:32.042954 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.042838 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4adf7a08-e39c-4ebd-9fb8-8989bbdd8960" (UID: "4adf7a08-e39c-4ebd-9fb8-8989bbdd8960"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:10:32.044337 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.044231 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4adf7a08-e39c-4ebd-9fb8-8989bbdd8960" (UID: "4adf7a08-e39c-4ebd-9fb8-8989bbdd8960"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:10:32.045174 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.044977 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4adf7a08-e39c-4ebd-9fb8-8989bbdd8960" (UID: "4adf7a08-e39c-4ebd-9fb8-8989bbdd8960"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:10:32.046330 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.046206 2574 scope.go:117] "RemoveContainer" containerID="ab25dd08e32bf7425be03400726dea4fe85531087258e3faec865d8fbb9e57d5" Apr 17 17:10:32.047243 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:10:32.047143 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ab25dd08e32bf7425be03400726dea4fe85531087258e3faec865d8fbb9e57d5\": container with ID starting with ab25dd08e32bf7425be03400726dea4fe85531087258e3faec865d8fbb9e57d5 not found: ID does not exist" containerID="ab25dd08e32bf7425be03400726dea4fe85531087258e3faec865d8fbb9e57d5" Apr 17 17:10:32.047243 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.047181 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab25dd08e32bf7425be03400726dea4fe85531087258e3faec865d8fbb9e57d5"} err="failed to get container status \"ab25dd08e32bf7425be03400726dea4fe85531087258e3faec865d8fbb9e57d5\": rpc error: code = NotFound desc = could not find container \"ab25dd08e32bf7425be03400726dea4fe85531087258e3faec865d8fbb9e57d5\": container with ID starting with ab25dd08e32bf7425be03400726dea4fe85531087258e3faec865d8fbb9e57d5 not found: ID does not exist" Apr 17 17:10:32.047243 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.047226 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4adf7a08-e39c-4ebd-9fb8-8989bbdd8960" (UID: "4adf7a08-e39c-4ebd-9fb8-8989bbdd8960"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:10:32.047538 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.047506 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "4adf7a08-e39c-4ebd-9fb8-8989bbdd8960" (UID: "4adf7a08-e39c-4ebd-9fb8-8989bbdd8960"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:10:32.048267 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.048242 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4adf7a08-e39c-4ebd-9fb8-8989bbdd8960" (UID: "4adf7a08-e39c-4ebd-9fb8-8989bbdd8960"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:10:32.048425 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.048394 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-kube-api-access-dzc84" (OuterVolumeSpecName: "kube-api-access-dzc84") pod "4adf7a08-e39c-4ebd-9fb8-8989bbdd8960" (UID: "4adf7a08-e39c-4ebd-9fb8-8989bbdd8960"). InnerVolumeSpecName "kube-api-access-dzc84". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:10:32.053087 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.053063 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4adf7a08-e39c-4ebd-9fb8-8989bbdd8960" (UID: "4adf7a08-e39c-4ebd-9fb8-8989bbdd8960"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:10:32.143719 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.143691 2574 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-image-registry-private-configuration\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 17:10:32.143828 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.143720 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-trusted-ca\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 17:10:32.143828 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.143735 2574 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-bound-sa-token\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 17:10:32.143828 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.143748 2574 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-certificates\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 17:10:32.143828 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.143762 2574 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-registry-tls\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 17:10:32.143828 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.143774 2574 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-installation-pull-secrets\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 
17:10:32.143828 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.143788 2574 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-ca-trust-extracted\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 17:10:32.143828 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.143801 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dzc84\" (UniqueName: \"kubernetes.io/projected/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960-kube-api-access-dzc84\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 17:10:32.343059 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.343010 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-5cd9d6769f-6448l"] Apr 17 17:10:32.352260 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.352177 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-5cd9d6769f-6448l"] Apr 17 17:10:32.435053 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.431843 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-n4m6l" Apr 17 17:10:32.437345 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.437138 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-j45n5\"" Apr 17 17:10:32.443336 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.443305 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4adf7a08-e39c-4ebd-9fb8-8989bbdd8960" path="/var/lib/kubelet/pods/4adf7a08-e39c-4ebd-9fb8-8989bbdd8960/volumes" Apr 17 17:10:32.443829 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.443808 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-n4m6l" Apr 17 17:10:32.698758 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:32.698003 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-n4m6l"] Apr 17 17:10:33.022727 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:33.022666 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m492r" event={"ID":"5ec73ca8-f486-4ad3-b44e-4e1b5815d3de","Type":"ContainerStarted","Data":"f4bc42a10c4ccf0f7e7550efc2f70070543d554e250e82504992c70d158d5b07"} Apr 17 17:10:33.022727 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:33.022707 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-m492r" event={"ID":"5ec73ca8-f486-4ad3-b44e-4e1b5815d3de","Type":"ContainerStarted","Data":"c750acdac96af75be5d72ed90172a1d5565464c0881ac56d8f332e7c25d97602"} Apr 17 17:10:33.024141 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:33.024073 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-m492r" Apr 17 17:10:33.025770 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:33.025745 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"36d3d403-fa9a-4679-afd9-a6aee6c4b292","Type":"ContainerStarted","Data":"5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62"} Apr 17 17:10:33.025770 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:33.025773 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"36d3d403-fa9a-4679-afd9-a6aee6c4b292","Type":"ContainerStarted","Data":"20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e"} Apr 17 17:10:33.025909 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:33.025782 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"36d3d403-fa9a-4679-afd9-a6aee6c4b292","Type":"ContainerStarted","Data":"0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5"} Apr 17 17:10:33.025909 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:33.025790 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"36d3d403-fa9a-4679-afd9-a6aee6c4b292","Type":"ContainerStarted","Data":"c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1"} Apr 17 17:10:33.025909 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:33.025797 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"36d3d403-fa9a-4679-afd9-a6aee6c4b292","Type":"ContainerStarted","Data":"0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110"} Apr 17 17:10:33.032739 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:33.032714 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-n4m6l" event={"ID":"05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b","Type":"ContainerStarted","Data":"df7082a90d77479fd8effe17ec13cb828ce4ca38c7beeed8fc190d7e7c28e8b3"} Apr 17 17:10:33.035788 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:33.035759 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2197f45c-d49b-4455-816f-9fb39452dd65","Type":"ContainerStarted","Data":"4741fda2bfdad0a02488914aa820475b6494a1720a6fac6d9f53543148e6787d"} Apr 17 17:10:33.035896 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:33.035792 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2197f45c-d49b-4455-816f-9fb39452dd65","Type":"ContainerStarted","Data":"eafaf0381f4b7f26027cdfc842f4bad767319599bac44e7093b2ebe12a62c7af"} Apr 17 17:10:33.035896 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:33.035807 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2197f45c-d49b-4455-816f-9fb39452dd65","Type":"ContainerStarted","Data":"c61a8f9730dc163756caff9ae7af1a98aa0616c269ec252da06b677cdc3104f8"} Apr 17 17:10:33.035896 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:33.035819 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2197f45c-d49b-4455-816f-9fb39452dd65","Type":"ContainerStarted","Data":"551251db672395fbb22708bca0ca7e8d9b83023320ee9df6bdf2442085a4a7e0"} Apr 17 17:10:33.035896 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:33.035832 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2197f45c-d49b-4455-816f-9fb39452dd65","Type":"ContainerStarted","Data":"65513fb7b4188088101a4ca10683b51305cd8c51c85fe93b1c527b9c349b8b36"} Apr 17 17:10:33.037192 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:33.037169 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sccc2" event={"ID":"5dad46c4-fb79-4503-9293-2ff17aa1410d","Type":"ContainerStarted","Data":"2d56c4ef7387eb71076e3071f834a3c78f68eb3730741653bfcc96d332934f88"} Apr 17 17:10:33.038898 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:33.038877 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sccc2" Apr 17 17:10:33.041205 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:33.041183 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz" event={"ID":"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd","Type":"ContainerStarted","Data":"e0226850011dbde16608453b2b207c4ae73d21e8c1c60df3eaa63da6d477fb95"} Apr 17 17:10:33.041302 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:33.041213 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz" 
event={"ID":"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd","Type":"ContainerStarted","Data":"1b16ee93f94e22eb2d1e31d60c8fb033e1e7e3a942129af007e54755ea23aaf1"} Apr 17 17:10:33.041302 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:33.041227 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz" event={"ID":"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd","Type":"ContainerStarted","Data":"bcb97629de0c24feabf112f38759909beb8ffb99705bc18894a539c3691ccfd1"} Apr 17 17:10:33.043492 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:33.043236 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6c68877994-q8bps" event={"ID":"4465ec50-e774-47fa-a778-c5e02c74b6ec","Type":"ContainerStarted","Data":"a1f0686709ca51c9081f5cfd4176375bf46cab94b25626f20f2e83be824772b1"} Apr 17 17:10:33.043996 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:33.043964 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sccc2" Apr 17 17:10:33.073074 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:33.073028 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6c68877994-q8bps" podStartSLOduration=7.051922328 podStartE2EDuration="14.073002125s" podCreationTimestamp="2026-04-17 17:10:19 +0000 UTC" firstStartedPulling="2026-04-17 17:10:24.995000092 +0000 UTC m=+163.067587279" lastFinishedPulling="2026-04-17 17:10:32.016079902 +0000 UTC m=+170.088667076" observedRunningTime="2026-04-17 17:10:33.072188892 +0000 UTC m=+171.144776085" watchObservedRunningTime="2026-04-17 17:10:33.073002125 +0000 UTC m=+171.145589319" Apr 17 17:10:33.074018 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:33.073908 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-m492r" podStartSLOduration=131.926955466 podStartE2EDuration="2m19.073896993s" 
podCreationTimestamp="2026-04-17 17:08:14 +0000 UTC" firstStartedPulling="2026-04-17 17:10:24.86982484 +0000 UTC m=+162.942412016" lastFinishedPulling="2026-04-17 17:10:32.016766358 +0000 UTC m=+170.089353543" observedRunningTime="2026-04-17 17:10:33.046291831 +0000 UTC m=+171.118879025" watchObservedRunningTime="2026-04-17 17:10:33.073896993 +0000 UTC m=+171.146484187" Apr 17 17:10:33.087581 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:33.087538 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-sccc2" podStartSLOduration=5.890606877 podStartE2EDuration="13.08752879s" podCreationTimestamp="2026-04-17 17:10:20 +0000 UTC" firstStartedPulling="2026-04-17 17:10:24.79629193 +0000 UTC m=+162.868879110" lastFinishedPulling="2026-04-17 17:10:31.993213841 +0000 UTC m=+170.065801023" observedRunningTime="2026-04-17 17:10:33.086485243 +0000 UTC m=+171.159072445" watchObservedRunningTime="2026-04-17 17:10:33.08752879 +0000 UTC m=+171.160116032" Apr 17 17:10:34.054475 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:34.054433 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"36d3d403-fa9a-4679-afd9-a6aee6c4b292","Type":"ContainerStarted","Data":"dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3"} Apr 17 17:10:34.086535 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:34.086481 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=6.0191265 podStartE2EDuration="13.0864655s" podCreationTimestamp="2026-04-17 17:10:21 +0000 UTC" firstStartedPulling="2026-04-17 17:10:24.960232814 +0000 UTC m=+163.032819990" lastFinishedPulling="2026-04-17 17:10:32.027571804 +0000 UTC m=+170.100158990" observedRunningTime="2026-04-17 17:10:34.082596963 +0000 UTC m=+172.155184195" watchObservedRunningTime="2026-04-17 17:10:34.0864655 +0000 UTC m=+172.159052696" Apr 
17 17:10:35.062558 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:35.062482 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2197f45c-d49b-4455-816f-9fb39452dd65","Type":"ContainerStarted","Data":"f0ba25f4f6ade2542733242654f9a0097bdce8d09b5ba311a28271a97bc7d99a"} Apr 17 17:10:35.064982 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:35.064952 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz" event={"ID":"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd","Type":"ContainerStarted","Data":"1c863578a3cbf017cb38034f087c4d741f9631c8df7d2fdcde4f1e11528fa3b2"} Apr 17 17:10:35.133378 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:35.133328 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=1.455021501 podStartE2EDuration="19.133309228s" podCreationTimestamp="2026-04-17 17:10:16 +0000 UTC" firstStartedPulling="2026-04-17 17:10:17.084728168 +0000 UTC m=+155.157315345" lastFinishedPulling="2026-04-17 17:10:34.76301589 +0000 UTC m=+172.835603072" observedRunningTime="2026-04-17 17:10:35.130802098 +0000 UTC m=+173.203389291" watchObservedRunningTime="2026-04-17 17:10:35.133309228 +0000 UTC m=+173.205896434" Apr 17 17:10:36.069415 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:36.069380 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-n4m6l" event={"ID":"05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b","Type":"ContainerStarted","Data":"44c55b89621cb53c9d5b6e4b390258d5b2308565b722d78c9a29d8297b03f8a9"} Apr 17 17:10:36.071899 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:36.071874 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz" 
event={"ID":"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd","Type":"ContainerStarted","Data":"d3f06db010a98ecfb5977eee710f0106182982907361bd1bd0cffab5c68663e6"} Apr 17 17:10:36.072013 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:36.071904 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz" event={"ID":"d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd","Type":"ContainerStarted","Data":"cc4af84a00efd0d7f6c87fe22db03b9ac3ea59703bb9703c419a613fb72bd7b7"} Apr 17 17:10:36.072072 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:36.072052 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz" Apr 17 17:10:36.086878 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:36.086829 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-n4m6l" podStartSLOduration=139.730839751 podStartE2EDuration="2m22.086814437s" podCreationTimestamp="2026-04-17 17:08:14 +0000 UTC" firstStartedPulling="2026-04-17 17:10:32.709583474 +0000 UTC m=+170.782170646" lastFinishedPulling="2026-04-17 17:10:35.065558154 +0000 UTC m=+173.138145332" observedRunningTime="2026-04-17 17:10:36.086224356 +0000 UTC m=+174.158811564" watchObservedRunningTime="2026-04-17 17:10:36.086814437 +0000 UTC m=+174.159401631" Apr 17 17:10:36.112165 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:36.112103 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz" podStartSLOduration=9.347214097 podStartE2EDuration="19.112086188s" podCreationTimestamp="2026-04-17 17:10:17 +0000 UTC" firstStartedPulling="2026-04-17 17:10:25.000409747 +0000 UTC m=+163.072996944" lastFinishedPulling="2026-04-17 17:10:34.765281843 +0000 UTC m=+172.837869035" observedRunningTime="2026-04-17 17:10:36.108955227 +0000 UTC m=+174.181542457" watchObservedRunningTime="2026-04-17 17:10:36.112086188 
+0000 UTC m=+174.184673384" Apr 17 17:10:36.869570 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:36.869536 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:10:37.079765 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:37.079740 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-85d87b4479-nqbnz" Apr 17 17:10:39.083178 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:39.083144 2574 generic.go:358] "Generic (PLEG): container finished" podID="bfd868a7-0269-499a-9b8c-4c2c8d2aba93" containerID="6312cd6dcd9e22218113640e1e4db9bea8fa3d33e3a7d08e35d1289e20db88ef" exitCode=0 Apr 17 17:10:39.083471 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:39.083196 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8hcsn" event={"ID":"bfd868a7-0269-499a-9b8c-4c2c8d2aba93","Type":"ContainerDied","Data":"6312cd6dcd9e22218113640e1e4db9bea8fa3d33e3a7d08e35d1289e20db88ef"} Apr 17 17:10:39.083526 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:39.083513 2574 scope.go:117] "RemoveContainer" containerID="6312cd6dcd9e22218113640e1e4db9bea8fa3d33e3a7d08e35d1289e20db88ef" Apr 17 17:10:40.061500 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:40.061472 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6c68877994-q8bps" Apr 17 17:10:40.061699 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:40.061547 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-6c68877994-q8bps" Apr 17 17:10:40.088288 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:40.088254 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-8hcsn" 
event={"ID":"bfd868a7-0269-499a-9b8c-4c2c8d2aba93","Type":"ContainerStarted","Data":"ab19c7d4274b4f302347b904abe35b8e33fd58372eb7c9e62b9e6e0b27293e28"} Apr 17 17:10:40.795729 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:40.795699 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-m492r_5ec73ca8-f486-4ad3-b44e-4e1b5815d3de/dns/0.log" Apr 17 17:10:40.996088 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:40.996059 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-m492r_5ec73ca8-f486-4ad3-b44e-4e1b5815d3de/kube-rbac-proxy/0.log" Apr 17 17:10:41.396130 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:41.396101 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-52mt7_b5a9dd27-c914-41cc-88fc-5a64c1169c04/dns-node-resolver/0.log" Apr 17 17:10:41.996359 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:41.996333 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-n4m6l_05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b/serve-healthcheck-canary/0.log" Apr 17 17:10:43.056407 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:43.056378 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-m492r" Apr 17 17:10:57.047728 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:57.047673 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-9585658b7-qhzc4" podUID="dd0c121c-18b4-4637-ab1a-edd22fb56a23" containerName="console" containerID="cri-o://9adb52b7d49a35f89e111bfdc82d03a8f2b548e20fce9b74868e56aa6f785e5b" gracePeriod=15 Apr 17 17:10:57.317690 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:57.317670 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9585658b7-qhzc4_dd0c121c-18b4-4637-ab1a-edd22fb56a23/console/0.log" Apr 17 17:10:57.317796 ip-10-0-132-98 kubenswrapper[2574]: I0417 
17:10:57.317727 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-9585658b7-qhzc4" Apr 17 17:10:57.484714 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:57.484683 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bxgcq\" (UniqueName: \"kubernetes.io/projected/dd0c121c-18b4-4637-ab1a-edd22fb56a23-kube-api-access-bxgcq\") pod \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\" (UID: \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " Apr 17 17:10:57.484714 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:57.484715 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd0c121c-18b4-4637-ab1a-edd22fb56a23-console-config\") pod \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\" (UID: \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " Apr 17 17:10:57.484942 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:57.484737 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd0c121c-18b4-4637-ab1a-edd22fb56a23-service-ca\") pod \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\" (UID: \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " Apr 17 17:10:57.484942 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:57.484759 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dd0c121c-18b4-4637-ab1a-edd22fb56a23-trusted-ca-bundle\") pod \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\" (UID: \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " Apr 17 17:10:57.484942 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:57.484781 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd0c121c-18b4-4637-ab1a-edd22fb56a23-oauth-serving-cert\") pod \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\" (UID: 
\"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " Apr 17 17:10:57.484942 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:57.484877 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd0c121c-18b4-4637-ab1a-edd22fb56a23-console-serving-cert\") pod \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\" (UID: \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " Apr 17 17:10:57.484942 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:57.484919 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd0c121c-18b4-4637-ab1a-edd22fb56a23-console-oauth-config\") pod \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\" (UID: \"dd0c121c-18b4-4637-ab1a-edd22fb56a23\") " Apr 17 17:10:57.485311 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:57.485087 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd0c121c-18b4-4637-ab1a-edd22fb56a23-console-config" (OuterVolumeSpecName: "console-config") pod "dd0c121c-18b4-4637-ab1a-edd22fb56a23" (UID: "dd0c121c-18b4-4637-ab1a-edd22fb56a23"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:10:57.485311 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:57.485253 2574 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/dd0c121c-18b4-4637-ab1a-edd22fb56a23-console-config\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 17:10:57.485311 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:57.485288 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd0c121c-18b4-4637-ab1a-edd22fb56a23-service-ca" (OuterVolumeSpecName: "service-ca") pod "dd0c121c-18b4-4637-ab1a-edd22fb56a23" (UID: "dd0c121c-18b4-4637-ab1a-edd22fb56a23"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:10:57.485415 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:57.485310 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd0c121c-18b4-4637-ab1a-edd22fb56a23-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "dd0c121c-18b4-4637-ab1a-edd22fb56a23" (UID: "dd0c121c-18b4-4637-ab1a-edd22fb56a23"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:10:57.485448 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:57.485437 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd0c121c-18b4-4637-ab1a-edd22fb56a23-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "dd0c121c-18b4-4637-ab1a-edd22fb56a23" (UID: "dd0c121c-18b4-4637-ab1a-edd22fb56a23"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:10:57.487179 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:57.487152 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd0c121c-18b4-4637-ab1a-edd22fb56a23-kube-api-access-bxgcq" (OuterVolumeSpecName: "kube-api-access-bxgcq") pod "dd0c121c-18b4-4637-ab1a-edd22fb56a23" (UID: "dd0c121c-18b4-4637-ab1a-edd22fb56a23"). InnerVolumeSpecName "kube-api-access-bxgcq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:10:57.487527 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:57.487502 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd0c121c-18b4-4637-ab1a-edd22fb56a23-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "dd0c121c-18b4-4637-ab1a-edd22fb56a23" (UID: "dd0c121c-18b4-4637-ab1a-edd22fb56a23"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:10:57.487592 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:57.487523 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd0c121c-18b4-4637-ab1a-edd22fb56a23-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "dd0c121c-18b4-4637-ab1a-edd22fb56a23" (UID: "dd0c121c-18b4-4637-ab1a-edd22fb56a23"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:10:57.586713 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:57.586639 2574 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/dd0c121c-18b4-4637-ab1a-edd22fb56a23-console-serving-cert\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 17:10:57.586713 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:57.586666 2574 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/dd0c121c-18b4-4637-ab1a-edd22fb56a23-console-oauth-config\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 17:10:57.586713 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:57.586677 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bxgcq\" (UniqueName: \"kubernetes.io/projected/dd0c121c-18b4-4637-ab1a-edd22fb56a23-kube-api-access-bxgcq\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 17:10:57.586713 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:57.586686 2574 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/dd0c121c-18b4-4637-ab1a-edd22fb56a23-service-ca\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 17:10:57.586713 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:57.586694 2574 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/dd0c121c-18b4-4637-ab1a-edd22fb56a23-trusted-ca-bundle\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 17:10:57.586713 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:57.586703 2574 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/dd0c121c-18b4-4637-ab1a-edd22fb56a23-oauth-serving-cert\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 17:10:58.148174 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:58.148148 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-9585658b7-qhzc4_dd0c121c-18b4-4637-ab1a-edd22fb56a23/console/0.log" Apr 17 17:10:58.148523 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:58.148186 2574 generic.go:358] "Generic (PLEG): container finished" podID="dd0c121c-18b4-4637-ab1a-edd22fb56a23" containerID="9adb52b7d49a35f89e111bfdc82d03a8f2b548e20fce9b74868e56aa6f785e5b" exitCode=2 Apr 17 17:10:58.148523 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:58.148217 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9585658b7-qhzc4" event={"ID":"dd0c121c-18b4-4637-ab1a-edd22fb56a23","Type":"ContainerDied","Data":"9adb52b7d49a35f89e111bfdc82d03a8f2b548e20fce9b74868e56aa6f785e5b"} Apr 17 17:10:58.148523 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:58.148249 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-9585658b7-qhzc4" Apr 17 17:10:58.148523 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:58.148260 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-9585658b7-qhzc4" event={"ID":"dd0c121c-18b4-4637-ab1a-edd22fb56a23","Type":"ContainerDied","Data":"056c4f47e166ca25d3aadfb6b28e3fa59ff1b07ab16e5773c70fb4f48162263a"} Apr 17 17:10:58.148523 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:58.148276 2574 scope.go:117] "RemoveContainer" containerID="9adb52b7d49a35f89e111bfdc82d03a8f2b548e20fce9b74868e56aa6f785e5b" Apr 17 17:10:58.156529 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:58.156514 2574 scope.go:117] "RemoveContainer" containerID="9adb52b7d49a35f89e111bfdc82d03a8f2b548e20fce9b74868e56aa6f785e5b" Apr 17 17:10:58.156798 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:10:58.156778 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9adb52b7d49a35f89e111bfdc82d03a8f2b548e20fce9b74868e56aa6f785e5b\": container with ID starting with 9adb52b7d49a35f89e111bfdc82d03a8f2b548e20fce9b74868e56aa6f785e5b not found: ID does not exist" containerID="9adb52b7d49a35f89e111bfdc82d03a8f2b548e20fce9b74868e56aa6f785e5b" Apr 17 17:10:58.156866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:58.156809 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9adb52b7d49a35f89e111bfdc82d03a8f2b548e20fce9b74868e56aa6f785e5b"} err="failed to get container status \"9adb52b7d49a35f89e111bfdc82d03a8f2b548e20fce9b74868e56aa6f785e5b\": rpc error: code = NotFound desc = could not find container \"9adb52b7d49a35f89e111bfdc82d03a8f2b548e20fce9b74868e56aa6f785e5b\": container with ID starting with 9adb52b7d49a35f89e111bfdc82d03a8f2b548e20fce9b74868e56aa6f785e5b not found: ID does not exist" Apr 17 17:10:58.169230 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:58.169211 2574 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["openshift-console/console-9585658b7-qhzc4"] Apr 17 17:10:58.177740 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:58.177720 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-9585658b7-qhzc4"] Apr 17 17:10:58.435973 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:10:58.435940 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd0c121c-18b4-4637-ab1a-edd22fb56a23" path="/var/lib/kubelet/pods/dd0c121c-18b4-4637-ab1a-edd22fb56a23/volumes" Apr 17 17:11:00.066249 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:00.066215 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6c68877994-q8bps" Apr 17 17:11:00.069910 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:00.069888 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6c68877994-q8bps" Apr 17 17:11:21.869146 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:21.869110 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:21.888310 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:21.888284 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:22.233641 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:22.233597 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:35.608498 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:35.608465 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:11:35.609000 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:35.608905 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2197f45c-d49b-4455-816f-9fb39452dd65" 
containerName="alertmanager" containerID="cri-o://65513fb7b4188088101a4ca10683b51305cd8c51c85fe93b1c527b9c349b8b36" gracePeriod=120 Apr 17 17:11:35.609000 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:35.608956 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2197f45c-d49b-4455-816f-9fb39452dd65" containerName="prom-label-proxy" containerID="cri-o://f0ba25f4f6ade2542733242654f9a0097bdce8d09b5ba311a28271a97bc7d99a" gracePeriod=120 Apr 17 17:11:35.609129 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:35.608979 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2197f45c-d49b-4455-816f-9fb39452dd65" containerName="kube-rbac-proxy-metric" containerID="cri-o://4741fda2bfdad0a02488914aa820475b6494a1720a6fac6d9f53543148e6787d" gracePeriod=120 Apr 17 17:11:35.609129 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:35.608989 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2197f45c-d49b-4455-816f-9fb39452dd65" containerName="kube-rbac-proxy-web" containerID="cri-o://c61a8f9730dc163756caff9ae7af1a98aa0616c269ec252da06b677cdc3104f8" gracePeriod=120 Apr 17 17:11:35.609129 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:35.608931 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2197f45c-d49b-4455-816f-9fb39452dd65" containerName="kube-rbac-proxy" containerID="cri-o://eafaf0381f4b7f26027cdfc842f4bad767319599bac44e7093b2ebe12a62c7af" gracePeriod=120 Apr 17 17:11:35.609129 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:35.608998 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="2197f45c-d49b-4455-816f-9fb39452dd65" containerName="config-reloader" 
containerID="cri-o://551251db672395fbb22708bca0ca7e8d9b83023320ee9df6bdf2442085a4a7e0" gracePeriod=120 Apr 17 17:11:36.265421 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.265386 2574 generic.go:358] "Generic (PLEG): container finished" podID="2197f45c-d49b-4455-816f-9fb39452dd65" containerID="f0ba25f4f6ade2542733242654f9a0097bdce8d09b5ba311a28271a97bc7d99a" exitCode=0 Apr 17 17:11:36.265421 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.265413 2574 generic.go:358] "Generic (PLEG): container finished" podID="2197f45c-d49b-4455-816f-9fb39452dd65" containerID="eafaf0381f4b7f26027cdfc842f4bad767319599bac44e7093b2ebe12a62c7af" exitCode=0 Apr 17 17:11:36.265421 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.265419 2574 generic.go:358] "Generic (PLEG): container finished" podID="2197f45c-d49b-4455-816f-9fb39452dd65" containerID="551251db672395fbb22708bca0ca7e8d9b83023320ee9df6bdf2442085a4a7e0" exitCode=0 Apr 17 17:11:36.265421 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.265425 2574 generic.go:358] "Generic (PLEG): container finished" podID="2197f45c-d49b-4455-816f-9fb39452dd65" containerID="65513fb7b4188088101a4ca10683b51305cd8c51c85fe93b1c527b9c349b8b36" exitCode=0 Apr 17 17:11:36.265696 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.265444 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2197f45c-d49b-4455-816f-9fb39452dd65","Type":"ContainerDied","Data":"f0ba25f4f6ade2542733242654f9a0097bdce8d09b5ba311a28271a97bc7d99a"} Apr 17 17:11:36.265696 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.265465 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2197f45c-d49b-4455-816f-9fb39452dd65","Type":"ContainerDied","Data":"eafaf0381f4b7f26027cdfc842f4bad767319599bac44e7093b2ebe12a62c7af"} Apr 17 17:11:36.265696 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.265474 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2197f45c-d49b-4455-816f-9fb39452dd65","Type":"ContainerDied","Data":"551251db672395fbb22708bca0ca7e8d9b83023320ee9df6bdf2442085a4a7e0"} Apr 17 17:11:36.265696 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.265485 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2197f45c-d49b-4455-816f-9fb39452dd65","Type":"ContainerDied","Data":"65513fb7b4188088101a4ca10683b51305cd8c51c85fe93b1c527b9c349b8b36"} Apr 17 17:11:36.856436 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.856410 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:36.895858 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.895829 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-secret-alertmanager-kube-rbac-proxy-web\") pod \"2197f45c-d49b-4455-816f-9fb39452dd65\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " Apr 17 17:11:36.896033 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.895879 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2197f45c-d49b-4455-816f-9fb39452dd65-alertmanager-main-db\") pod \"2197f45c-d49b-4455-816f-9fb39452dd65\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " Apr 17 17:11:36.896033 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.895916 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-secret-alertmanager-main-tls\") pod \"2197f45c-d49b-4455-816f-9fb39452dd65\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " Apr 17 17:11:36.896033 ip-10-0-132-98 kubenswrapper[2574]: 
I0417 17:11:36.895939 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2197f45c-d49b-4455-816f-9fb39452dd65-config-out\") pod \"2197f45c-d49b-4455-816f-9fb39452dd65\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " Apr 17 17:11:36.896033 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.895961 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2197f45c-d49b-4455-816f-9fb39452dd65-tls-assets\") pod \"2197f45c-d49b-4455-816f-9fb39452dd65\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " Apr 17 17:11:36.896033 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.895988 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-web-config\") pod \"2197f45c-d49b-4455-816f-9fb39452dd65\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " Apr 17 17:11:36.896283 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.896036 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5sgbp\" (UniqueName: \"kubernetes.io/projected/2197f45c-d49b-4455-816f-9fb39452dd65-kube-api-access-5sgbp\") pod \"2197f45c-d49b-4455-816f-9fb39452dd65\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " Apr 17 17:11:36.896283 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.896076 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-config-volume\") pod \"2197f45c-d49b-4455-816f-9fb39452dd65\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " Apr 17 17:11:36.896283 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.896107 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-cluster-tls-config\") pod \"2197f45c-d49b-4455-816f-9fb39452dd65\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " Apr 17 17:11:36.896283 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.896136 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2197f45c-d49b-4455-816f-9fb39452dd65-alertmanager-trusted-ca-bundle\") pod \"2197f45c-d49b-4455-816f-9fb39452dd65\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " Apr 17 17:11:36.896283 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.896161 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2197f45c-d49b-4455-816f-9fb39452dd65-metrics-client-ca\") pod \"2197f45c-d49b-4455-816f-9fb39452dd65\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " Apr 17 17:11:36.896283 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.896202 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-secret-alertmanager-kube-rbac-proxy-metric\") pod \"2197f45c-d49b-4455-816f-9fb39452dd65\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " Apr 17 17:11:36.897373 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.897186 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2197f45c-d49b-4455-816f-9fb39452dd65-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "2197f45c-d49b-4455-816f-9fb39452dd65" (UID: "2197f45c-d49b-4455-816f-9fb39452dd65"). InnerVolumeSpecName "metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:11:36.901356 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.899385 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-secret-alertmanager-kube-rbac-proxy\") pod \"2197f45c-d49b-4455-816f-9fb39452dd65\" (UID: \"2197f45c-d49b-4455-816f-9fb39452dd65\") " Apr 17 17:11:36.901356 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.900005 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2197f45c-d49b-4455-816f-9fb39452dd65-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "2197f45c-d49b-4455-816f-9fb39452dd65" (UID: "2197f45c-d49b-4455-816f-9fb39452dd65"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:11:36.901356 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.900046 2574 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/2197f45c-d49b-4455-816f-9fb39452dd65-metrics-client-ca\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 17:11:36.901356 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.896222 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2197f45c-d49b-4455-816f-9fb39452dd65-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "2197f45c-d49b-4455-816f-9fb39452dd65" (UID: "2197f45c-d49b-4455-816f-9fb39452dd65"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:11:36.902147 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.902121 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "2197f45c-d49b-4455-816f-9fb39452dd65" (UID: "2197f45c-d49b-4455-816f-9fb39452dd65"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:11:36.902385 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.902357 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2197f45c-d49b-4455-816f-9fb39452dd65-kube-api-access-5sgbp" (OuterVolumeSpecName: "kube-api-access-5sgbp") pod "2197f45c-d49b-4455-816f-9fb39452dd65" (UID: "2197f45c-d49b-4455-816f-9fb39452dd65"). InnerVolumeSpecName "kube-api-access-5sgbp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:11:36.902457 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.902422 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2197f45c-d49b-4455-816f-9fb39452dd65-config-out" (OuterVolumeSpecName: "config-out") pod "2197f45c-d49b-4455-816f-9fb39452dd65" (UID: "2197f45c-d49b-4455-816f-9fb39452dd65"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:11:36.902511 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.902496 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "2197f45c-d49b-4455-816f-9fb39452dd65" (UID: "2197f45c-d49b-4455-816f-9fb39452dd65"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:11:36.903307 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.902658 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2197f45c-d49b-4455-816f-9fb39452dd65-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "2197f45c-d49b-4455-816f-9fb39452dd65" (UID: "2197f45c-d49b-4455-816f-9fb39452dd65"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:11:36.912083 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.912019 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "2197f45c-d49b-4455-816f-9fb39452dd65" (UID: "2197f45c-d49b-4455-816f-9fb39452dd65"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:11:36.912291 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.911976 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "2197f45c-d49b-4455-816f-9fb39452dd65" (UID: "2197f45c-d49b-4455-816f-9fb39452dd65"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:11:36.912393 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.912367 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-config-volume" (OuterVolumeSpecName: "config-volume") pod "2197f45c-d49b-4455-816f-9fb39452dd65" (UID: "2197f45c-d49b-4455-816f-9fb39452dd65"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:11:36.916069 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.916042 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "2197f45c-d49b-4455-816f-9fb39452dd65" (UID: "2197f45c-d49b-4455-816f-9fb39452dd65"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:11:36.919631 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:36.919579 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-web-config" (OuterVolumeSpecName: "web-config") pod "2197f45c-d49b-4455-816f-9fb39452dd65" (UID: "2197f45c-d49b-4455-816f-9fb39452dd65"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:11:37.001158 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.001121 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5sgbp\" (UniqueName: \"kubernetes.io/projected/2197f45c-d49b-4455-816f-9fb39452dd65-kube-api-access-5sgbp\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 17:11:37.001158 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.001152 2574 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-config-volume\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 17:11:37.001158 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.001161 2574 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-cluster-tls-config\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 17:11:37.001376 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.001172 2574 
reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2197f45c-d49b-4455-816f-9fb39452dd65-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 17:11:37.001376 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.001183 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 17:11:37.001376 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.001193 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 17:11:37.001376 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.001202 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 17:11:37.001376 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.001210 2574 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/2197f45c-d49b-4455-816f-9fb39452dd65-alertmanager-main-db\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 17:11:37.001376 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.001218 2574 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-secret-alertmanager-main-tls\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 
17:11:37.001376 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.001227 2574 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/2197f45c-d49b-4455-816f-9fb39452dd65-config-out\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 17:11:37.001376 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.001234 2574 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/2197f45c-d49b-4455-816f-9fb39452dd65-tls-assets\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 17:11:37.001376 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.001243 2574 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/2197f45c-d49b-4455-816f-9fb39452dd65-web-config\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 17:11:37.270866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.270829 2574 generic.go:358] "Generic (PLEG): container finished" podID="2197f45c-d49b-4455-816f-9fb39452dd65" containerID="4741fda2bfdad0a02488914aa820475b6494a1720a6fac6d9f53543148e6787d" exitCode=0 Apr 17 17:11:37.270866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.270862 2574 generic.go:358] "Generic (PLEG): container finished" podID="2197f45c-d49b-4455-816f-9fb39452dd65" containerID="c61a8f9730dc163756caff9ae7af1a98aa0616c269ec252da06b677cdc3104f8" exitCode=0 Apr 17 17:11:37.270866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.270860 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2197f45c-d49b-4455-816f-9fb39452dd65","Type":"ContainerDied","Data":"4741fda2bfdad0a02488914aa820475b6494a1720a6fac6d9f53543148e6787d"} Apr 17 17:11:37.271154 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.270898 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"2197f45c-d49b-4455-816f-9fb39452dd65","Type":"ContainerDied","Data":"c61a8f9730dc163756caff9ae7af1a98aa0616c269ec252da06b677cdc3104f8"} Apr 17 17:11:37.271154 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.270908 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"2197f45c-d49b-4455-816f-9fb39452dd65","Type":"ContainerDied","Data":"63ac469d9718c5cae85bd3c31c5e2ba89dddc1ee942a3dd62ff6dfa51fee4c94"} Apr 17 17:11:37.271154 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.270922 2574 scope.go:117] "RemoveContainer" containerID="f0ba25f4f6ade2542733242654f9a0097bdce8d09b5ba311a28271a97bc7d99a" Apr 17 17:11:37.271154 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.270966 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.278955 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.278939 2574 scope.go:117] "RemoveContainer" containerID="4741fda2bfdad0a02488914aa820475b6494a1720a6fac6d9f53543148e6787d" Apr 17 17:11:37.285831 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.285808 2574 scope.go:117] "RemoveContainer" containerID="eafaf0381f4b7f26027cdfc842f4bad767319599bac44e7093b2ebe12a62c7af" Apr 17 17:11:37.293002 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.292981 2574 scope.go:117] "RemoveContainer" containerID="c61a8f9730dc163756caff9ae7af1a98aa0616c269ec252da06b677cdc3104f8" Apr 17 17:11:37.294273 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.294256 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:11:37.299182 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.299155 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:11:37.301002 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.300980 2574 scope.go:117] "RemoveContainer" 
containerID="551251db672395fbb22708bca0ca7e8d9b83023320ee9df6bdf2442085a4a7e0" Apr 17 17:11:37.307386 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.307367 2574 scope.go:117] "RemoveContainer" containerID="65513fb7b4188088101a4ca10683b51305cd8c51c85fe93b1c527b9c349b8b36" Apr 17 17:11:37.313891 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.313874 2574 scope.go:117] "RemoveContainer" containerID="fff3396d097886f7243cae00f1d492c652740e64918f08ee66b98c3bd5b5af72" Apr 17 17:11:37.320650 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.320631 2574 scope.go:117] "RemoveContainer" containerID="f0ba25f4f6ade2542733242654f9a0097bdce8d09b5ba311a28271a97bc7d99a" Apr 17 17:11:37.320892 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:11:37.320873 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0ba25f4f6ade2542733242654f9a0097bdce8d09b5ba311a28271a97bc7d99a\": container with ID starting with f0ba25f4f6ade2542733242654f9a0097bdce8d09b5ba311a28271a97bc7d99a not found: ID does not exist" containerID="f0ba25f4f6ade2542733242654f9a0097bdce8d09b5ba311a28271a97bc7d99a" Apr 17 17:11:37.320936 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.320906 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0ba25f4f6ade2542733242654f9a0097bdce8d09b5ba311a28271a97bc7d99a"} err="failed to get container status \"f0ba25f4f6ade2542733242654f9a0097bdce8d09b5ba311a28271a97bc7d99a\": rpc error: code = NotFound desc = could not find container \"f0ba25f4f6ade2542733242654f9a0097bdce8d09b5ba311a28271a97bc7d99a\": container with ID starting with f0ba25f4f6ade2542733242654f9a0097bdce8d09b5ba311a28271a97bc7d99a not found: ID does not exist" Apr 17 17:11:37.320936 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.320927 2574 scope.go:117] "RemoveContainer" containerID="4741fda2bfdad0a02488914aa820475b6494a1720a6fac6d9f53543148e6787d" Apr 17 17:11:37.321178 ip-10-0-132-98 
kubenswrapper[2574]: E0417 17:11:37.321161 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4741fda2bfdad0a02488914aa820475b6494a1720a6fac6d9f53543148e6787d\": container with ID starting with 4741fda2bfdad0a02488914aa820475b6494a1720a6fac6d9f53543148e6787d not found: ID does not exist" containerID="4741fda2bfdad0a02488914aa820475b6494a1720a6fac6d9f53543148e6787d" Apr 17 17:11:37.321240 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.321182 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4741fda2bfdad0a02488914aa820475b6494a1720a6fac6d9f53543148e6787d"} err="failed to get container status \"4741fda2bfdad0a02488914aa820475b6494a1720a6fac6d9f53543148e6787d\": rpc error: code = NotFound desc = could not find container \"4741fda2bfdad0a02488914aa820475b6494a1720a6fac6d9f53543148e6787d\": container with ID starting with 4741fda2bfdad0a02488914aa820475b6494a1720a6fac6d9f53543148e6787d not found: ID does not exist" Apr 17 17:11:37.321240 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.321197 2574 scope.go:117] "RemoveContainer" containerID="eafaf0381f4b7f26027cdfc842f4bad767319599bac44e7093b2ebe12a62c7af" Apr 17 17:11:37.321397 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:11:37.321372 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eafaf0381f4b7f26027cdfc842f4bad767319599bac44e7093b2ebe12a62c7af\": container with ID starting with eafaf0381f4b7f26027cdfc842f4bad767319599bac44e7093b2ebe12a62c7af not found: ID does not exist" containerID="eafaf0381f4b7f26027cdfc842f4bad767319599bac44e7093b2ebe12a62c7af" Apr 17 17:11:37.321434 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.321400 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eafaf0381f4b7f26027cdfc842f4bad767319599bac44e7093b2ebe12a62c7af"} err="failed to 
get container status \"eafaf0381f4b7f26027cdfc842f4bad767319599bac44e7093b2ebe12a62c7af\": rpc error: code = NotFound desc = could not find container \"eafaf0381f4b7f26027cdfc842f4bad767319599bac44e7093b2ebe12a62c7af\": container with ID starting with eafaf0381f4b7f26027cdfc842f4bad767319599bac44e7093b2ebe12a62c7af not found: ID does not exist" Apr 17 17:11:37.321434 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.321414 2574 scope.go:117] "RemoveContainer" containerID="c61a8f9730dc163756caff9ae7af1a98aa0616c269ec252da06b677cdc3104f8" Apr 17 17:11:37.321584 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:11:37.321568 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c61a8f9730dc163756caff9ae7af1a98aa0616c269ec252da06b677cdc3104f8\": container with ID starting with c61a8f9730dc163756caff9ae7af1a98aa0616c269ec252da06b677cdc3104f8 not found: ID does not exist" containerID="c61a8f9730dc163756caff9ae7af1a98aa0616c269ec252da06b677cdc3104f8" Apr 17 17:11:37.321655 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.321587 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c61a8f9730dc163756caff9ae7af1a98aa0616c269ec252da06b677cdc3104f8"} err="failed to get container status \"c61a8f9730dc163756caff9ae7af1a98aa0616c269ec252da06b677cdc3104f8\": rpc error: code = NotFound desc = could not find container \"c61a8f9730dc163756caff9ae7af1a98aa0616c269ec252da06b677cdc3104f8\": container with ID starting with c61a8f9730dc163756caff9ae7af1a98aa0616c269ec252da06b677cdc3104f8 not found: ID does not exist" Apr 17 17:11:37.321655 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.321598 2574 scope.go:117] "RemoveContainer" containerID="551251db672395fbb22708bca0ca7e8d9b83023320ee9df6bdf2442085a4a7e0" Apr 17 17:11:37.321825 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:11:37.321803 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"551251db672395fbb22708bca0ca7e8d9b83023320ee9df6bdf2442085a4a7e0\": container with ID starting with 551251db672395fbb22708bca0ca7e8d9b83023320ee9df6bdf2442085a4a7e0 not found: ID does not exist" containerID="551251db672395fbb22708bca0ca7e8d9b83023320ee9df6bdf2442085a4a7e0" Apr 17 17:11:37.321871 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.321834 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"551251db672395fbb22708bca0ca7e8d9b83023320ee9df6bdf2442085a4a7e0"} err="failed to get container status \"551251db672395fbb22708bca0ca7e8d9b83023320ee9df6bdf2442085a4a7e0\": rpc error: code = NotFound desc = could not find container \"551251db672395fbb22708bca0ca7e8d9b83023320ee9df6bdf2442085a4a7e0\": container with ID starting with 551251db672395fbb22708bca0ca7e8d9b83023320ee9df6bdf2442085a4a7e0 not found: ID does not exist" Apr 17 17:11:37.321871 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.321857 2574 scope.go:117] "RemoveContainer" containerID="65513fb7b4188088101a4ca10683b51305cd8c51c85fe93b1c527b9c349b8b36" Apr 17 17:11:37.322075 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:11:37.322060 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"65513fb7b4188088101a4ca10683b51305cd8c51c85fe93b1c527b9c349b8b36\": container with ID starting with 65513fb7b4188088101a4ca10683b51305cd8c51c85fe93b1c527b9c349b8b36 not found: ID does not exist" containerID="65513fb7b4188088101a4ca10683b51305cd8c51c85fe93b1c527b9c349b8b36" Apr 17 17:11:37.322123 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.322078 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"65513fb7b4188088101a4ca10683b51305cd8c51c85fe93b1c527b9c349b8b36"} err="failed to get container status \"65513fb7b4188088101a4ca10683b51305cd8c51c85fe93b1c527b9c349b8b36\": rpc error: code = NotFound desc = 
could not find container \"65513fb7b4188088101a4ca10683b51305cd8c51c85fe93b1c527b9c349b8b36\": container with ID starting with 65513fb7b4188088101a4ca10683b51305cd8c51c85fe93b1c527b9c349b8b36 not found: ID does not exist" Apr 17 17:11:37.322123 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.322091 2574 scope.go:117] "RemoveContainer" containerID="fff3396d097886f7243cae00f1d492c652740e64918f08ee66b98c3bd5b5af72" Apr 17 17:11:37.322268 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:11:37.322254 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fff3396d097886f7243cae00f1d492c652740e64918f08ee66b98c3bd5b5af72\": container with ID starting with fff3396d097886f7243cae00f1d492c652740e64918f08ee66b98c3bd5b5af72 not found: ID does not exist" containerID="fff3396d097886f7243cae00f1d492c652740e64918f08ee66b98c3bd5b5af72" Apr 17 17:11:37.322305 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.322271 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fff3396d097886f7243cae00f1d492c652740e64918f08ee66b98c3bd5b5af72"} err="failed to get container status \"fff3396d097886f7243cae00f1d492c652740e64918f08ee66b98c3bd5b5af72\": rpc error: code = NotFound desc = could not find container \"fff3396d097886f7243cae00f1d492c652740e64918f08ee66b98c3bd5b5af72\": container with ID starting with fff3396d097886f7243cae00f1d492c652740e64918f08ee66b98c3bd5b5af72 not found: ID does not exist" Apr 17 17:11:37.322305 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.322282 2574 scope.go:117] "RemoveContainer" containerID="f0ba25f4f6ade2542733242654f9a0097bdce8d09b5ba311a28271a97bc7d99a" Apr 17 17:11:37.322457 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.322440 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0ba25f4f6ade2542733242654f9a0097bdce8d09b5ba311a28271a97bc7d99a"} err="failed to get container status 
\"f0ba25f4f6ade2542733242654f9a0097bdce8d09b5ba311a28271a97bc7d99a\": rpc error: code = NotFound desc = could not find container \"f0ba25f4f6ade2542733242654f9a0097bdce8d09b5ba311a28271a97bc7d99a\": container with ID starting with f0ba25f4f6ade2542733242654f9a0097bdce8d09b5ba311a28271a97bc7d99a not found: ID does not exist" Apr 17 17:11:37.322492 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.322458 2574 scope.go:117] "RemoveContainer" containerID="4741fda2bfdad0a02488914aa820475b6494a1720a6fac6d9f53543148e6787d" Apr 17 17:11:37.322649 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.322631 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4741fda2bfdad0a02488914aa820475b6494a1720a6fac6d9f53543148e6787d"} err="failed to get container status \"4741fda2bfdad0a02488914aa820475b6494a1720a6fac6d9f53543148e6787d\": rpc error: code = NotFound desc = could not find container \"4741fda2bfdad0a02488914aa820475b6494a1720a6fac6d9f53543148e6787d\": container with ID starting with 4741fda2bfdad0a02488914aa820475b6494a1720a6fac6d9f53543148e6787d not found: ID does not exist" Apr 17 17:11:37.322708 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.322648 2574 scope.go:117] "RemoveContainer" containerID="eafaf0381f4b7f26027cdfc842f4bad767319599bac44e7093b2ebe12a62c7af" Apr 17 17:11:37.322850 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.322835 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eafaf0381f4b7f26027cdfc842f4bad767319599bac44e7093b2ebe12a62c7af"} err="failed to get container status \"eafaf0381f4b7f26027cdfc842f4bad767319599bac44e7093b2ebe12a62c7af\": rpc error: code = NotFound desc = could not find container \"eafaf0381f4b7f26027cdfc842f4bad767319599bac44e7093b2ebe12a62c7af\": container with ID starting with eafaf0381f4b7f26027cdfc842f4bad767319599bac44e7093b2ebe12a62c7af not found: ID does not exist" Apr 17 17:11:37.322892 ip-10-0-132-98 
kubenswrapper[2574]: I0417 17:11:37.322850 2574 scope.go:117] "RemoveContainer" containerID="c61a8f9730dc163756caff9ae7af1a98aa0616c269ec252da06b677cdc3104f8" Apr 17 17:11:37.323058 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.323038 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c61a8f9730dc163756caff9ae7af1a98aa0616c269ec252da06b677cdc3104f8"} err="failed to get container status \"c61a8f9730dc163756caff9ae7af1a98aa0616c269ec252da06b677cdc3104f8\": rpc error: code = NotFound desc = could not find container \"c61a8f9730dc163756caff9ae7af1a98aa0616c269ec252da06b677cdc3104f8\": container with ID starting with c61a8f9730dc163756caff9ae7af1a98aa0616c269ec252da06b677cdc3104f8 not found: ID does not exist" Apr 17 17:11:37.323103 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.323059 2574 scope.go:117] "RemoveContainer" containerID="551251db672395fbb22708bca0ca7e8d9b83023320ee9df6bdf2442085a4a7e0" Apr 17 17:11:37.323232 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.323217 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"551251db672395fbb22708bca0ca7e8d9b83023320ee9df6bdf2442085a4a7e0"} err="failed to get container status \"551251db672395fbb22708bca0ca7e8d9b83023320ee9df6bdf2442085a4a7e0\": rpc error: code = NotFound desc = could not find container \"551251db672395fbb22708bca0ca7e8d9b83023320ee9df6bdf2442085a4a7e0\": container with ID starting with 551251db672395fbb22708bca0ca7e8d9b83023320ee9df6bdf2442085a4a7e0 not found: ID does not exist" Apr 17 17:11:37.323267 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.323233 2574 scope.go:117] "RemoveContainer" containerID="65513fb7b4188088101a4ca10683b51305cd8c51c85fe93b1c527b9c349b8b36" Apr 17 17:11:37.323382 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.323368 2574 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"65513fb7b4188088101a4ca10683b51305cd8c51c85fe93b1c527b9c349b8b36"} err="failed to get container status \"65513fb7b4188088101a4ca10683b51305cd8c51c85fe93b1c527b9c349b8b36\": rpc error: code = NotFound desc = could not find container \"65513fb7b4188088101a4ca10683b51305cd8c51c85fe93b1c527b9c349b8b36\": container with ID starting with 65513fb7b4188088101a4ca10683b51305cd8c51c85fe93b1c527b9c349b8b36 not found: ID does not exist" Apr 17 17:11:37.323422 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.323382 2574 scope.go:117] "RemoveContainer" containerID="fff3396d097886f7243cae00f1d492c652740e64918f08ee66b98c3bd5b5af72" Apr 17 17:11:37.323564 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.323549 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fff3396d097886f7243cae00f1d492c652740e64918f08ee66b98c3bd5b5af72"} err="failed to get container status \"fff3396d097886f7243cae00f1d492c652740e64918f08ee66b98c3bd5b5af72\": rpc error: code = NotFound desc = could not find container \"fff3396d097886f7243cae00f1d492c652740e64918f08ee66b98c3bd5b5af72\": container with ID starting with fff3396d097886f7243cae00f1d492c652740e64918f08ee66b98c3bd5b5af72 not found: ID does not exist" Apr 17 17:11:37.326500 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.326480 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:11:37.326843 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.326830 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2197f45c-d49b-4455-816f-9fb39452dd65" containerName="config-reloader" Apr 17 17:11:37.326892 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.326847 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2197f45c-d49b-4455-816f-9fb39452dd65" containerName="config-reloader" Apr 17 17:11:37.326892 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.326860 2574 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2197f45c-d49b-4455-816f-9fb39452dd65" containerName="kube-rbac-proxy" Apr 17 17:11:37.326892 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.326866 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2197f45c-d49b-4455-816f-9fb39452dd65" containerName="kube-rbac-proxy" Apr 17 17:11:37.326892 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.326875 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2197f45c-d49b-4455-816f-9fb39452dd65" containerName="kube-rbac-proxy-metric" Apr 17 17:11:37.326892 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.326880 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2197f45c-d49b-4455-816f-9fb39452dd65" containerName="kube-rbac-proxy-metric" Apr 17 17:11:37.326892 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.326887 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2197f45c-d49b-4455-816f-9fb39452dd65" containerName="alertmanager" Apr 17 17:11:37.326892 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.326892 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2197f45c-d49b-4455-816f-9fb39452dd65" containerName="alertmanager" Apr 17 17:11:37.327089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.326900 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd0c121c-18b4-4637-ab1a-edd22fb56a23" containerName="console" Apr 17 17:11:37.327089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.326905 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd0c121c-18b4-4637-ab1a-edd22fb56a23" containerName="console" Apr 17 17:11:37.327089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.326913 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2197f45c-d49b-4455-816f-9fb39452dd65" containerName="prom-label-proxy" Apr 17 17:11:37.327089 ip-10-0-132-98 kubenswrapper[2574]: I0417 
17:11:37.326918 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2197f45c-d49b-4455-816f-9fb39452dd65" containerName="prom-label-proxy" Apr 17 17:11:37.327089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.326942 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4adf7a08-e39c-4ebd-9fb8-8989bbdd8960" containerName="registry" Apr 17 17:11:37.327089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.326947 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="4adf7a08-e39c-4ebd-9fb8-8989bbdd8960" containerName="registry" Apr 17 17:11:37.327089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.326954 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2197f45c-d49b-4455-816f-9fb39452dd65" containerName="kube-rbac-proxy-web" Apr 17 17:11:37.327089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.326959 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2197f45c-d49b-4455-816f-9fb39452dd65" containerName="kube-rbac-proxy-web" Apr 17 17:11:37.327089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.326968 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2197f45c-d49b-4455-816f-9fb39452dd65" containerName="init-config-reloader" Apr 17 17:11:37.327089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.326974 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="2197f45c-d49b-4455-816f-9fb39452dd65" containerName="init-config-reloader" Apr 17 17:11:37.327089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.327026 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="4adf7a08-e39c-4ebd-9fb8-8989bbdd8960" containerName="registry" Apr 17 17:11:37.327089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.327036 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd0c121c-18b4-4637-ab1a-edd22fb56a23" containerName="console" Apr 17 17:11:37.327089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.327043 
2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2197f45c-d49b-4455-816f-9fb39452dd65" containerName="kube-rbac-proxy-metric" Apr 17 17:11:37.327089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.327051 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2197f45c-d49b-4455-816f-9fb39452dd65" containerName="prom-label-proxy" Apr 17 17:11:37.327089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.327058 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2197f45c-d49b-4455-816f-9fb39452dd65" containerName="kube-rbac-proxy" Apr 17 17:11:37.327089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.327065 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2197f45c-d49b-4455-816f-9fb39452dd65" containerName="kube-rbac-proxy-web" Apr 17 17:11:37.327089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.327071 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2197f45c-d49b-4455-816f-9fb39452dd65" containerName="config-reloader" Apr 17 17:11:37.327089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.327077 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="2197f45c-d49b-4455-816f-9fb39452dd65" containerName="alertmanager" Apr 17 17:11:37.332102 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.332085 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.334797 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.334765 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 17 17:11:37.334895 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.334767 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 17 17:11:37.334895 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.334767 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 17 17:11:37.335021 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.334927 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 17 17:11:37.335165 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.335149 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 17 17:11:37.335336 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.335322 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-2r8lw\"" Apr 17 17:11:37.335384 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.335343 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 17 17:11:37.335384 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.335373 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 17 17:11:37.335481 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.335461 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 17 17:11:37.342583 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.342560 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 17 17:11:37.344201 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.344183 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:11:37.404672 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.404631 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/03c6c110-7db0-4d40-950f-63e925f382eb-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.404672 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.404672 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/03c6c110-7db0-4d40-950f-63e925f382eb-web-config\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.404916 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.404694 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/03c6c110-7db0-4d40-950f-63e925f382eb-config-out\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.404916 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.404726 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/03c6c110-7db0-4d40-950f-63e925f382eb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.404916 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.404742 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxp7n\" (UniqueName: \"kubernetes.io/projected/03c6c110-7db0-4d40-950f-63e925f382eb-kube-api-access-mxp7n\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.404916 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.404795 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03c6c110-7db0-4d40-950f-63e925f382eb-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.404916 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.404835 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/03c6c110-7db0-4d40-950f-63e925f382eb-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.404916 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.404863 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/03c6c110-7db0-4d40-950f-63e925f382eb-config-volume\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.404916 ip-10-0-132-98 kubenswrapper[2574]: I0417 
17:11:37.404898 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/03c6c110-7db0-4d40-950f-63e925f382eb-tls-assets\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.405205 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.404940 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/03c6c110-7db0-4d40-950f-63e925f382eb-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.405205 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.404995 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/03c6c110-7db0-4d40-950f-63e925f382eb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.405205 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.405022 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03c6c110-7db0-4d40-950f-63e925f382eb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.405205 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.405102 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/03c6c110-7db0-4d40-950f-63e925f382eb-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.506214 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.506170 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/03c6c110-7db0-4d40-950f-63e925f382eb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.506214 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.506216 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03c6c110-7db0-4d40-950f-63e925f382eb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.506882 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.506281 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/03c6c110-7db0-4d40-950f-63e925f382eb-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.506882 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.506311 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/03c6c110-7db0-4d40-950f-63e925f382eb-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.506882 ip-10-0-132-98 
kubenswrapper[2574]: I0417 17:11:37.506345 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/03c6c110-7db0-4d40-950f-63e925f382eb-web-config\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.506882 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.506470 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/03c6c110-7db0-4d40-950f-63e925f382eb-config-out\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.506882 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.506516 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/03c6c110-7db0-4d40-950f-63e925f382eb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.506882 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.506541 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxp7n\" (UniqueName: \"kubernetes.io/projected/03c6c110-7db0-4d40-950f-63e925f382eb-kube-api-access-mxp7n\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.506882 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.506572 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03c6c110-7db0-4d40-950f-63e925f382eb-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 
17:11:37.506882 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.506599 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/03c6c110-7db0-4d40-950f-63e925f382eb-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.506882 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.506655 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/03c6c110-7db0-4d40-950f-63e925f382eb-config-volume\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.506882 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.506689 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/03c6c110-7db0-4d40-950f-63e925f382eb-tls-assets\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.506882 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.506724 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/03c6c110-7db0-4d40-950f-63e925f382eb-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.507632 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.507406 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/03c6c110-7db0-4d40-950f-63e925f382eb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.507740 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.507712 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/03c6c110-7db0-4d40-950f-63e925f382eb-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.508519 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.508493 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/03c6c110-7db0-4d40-950f-63e925f382eb-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.509548 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.509524 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/03c6c110-7db0-4d40-950f-63e925f382eb-web-config\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.509660 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.509554 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/03c6c110-7db0-4d40-950f-63e925f382eb-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.509719 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.509662 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/03c6c110-7db0-4d40-950f-63e925f382eb-config-out\") pod \"alertmanager-main-0\" (UID: 
\"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.509909 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.509884 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/03c6c110-7db0-4d40-950f-63e925f382eb-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.510093 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.510068 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/03c6c110-7db0-4d40-950f-63e925f382eb-config-volume\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.510297 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.510279 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/03c6c110-7db0-4d40-950f-63e925f382eb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.510508 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.510488 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/03c6c110-7db0-4d40-950f-63e925f382eb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.510743 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.510725 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/03c6c110-7db0-4d40-950f-63e925f382eb-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.510866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.510850 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/03c6c110-7db0-4d40-950f-63e925f382eb-tls-assets\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.516054 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.516033 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxp7n\" (UniqueName: \"kubernetes.io/projected/03c6c110-7db0-4d40-950f-63e925f382eb-kube-api-access-mxp7n\") pod \"alertmanager-main-0\" (UID: \"03c6c110-7db0-4d40-950f-63e925f382eb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.643352 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.643273 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 17 17:11:37.806271 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:37.806249 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 17 17:11:37.808373 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:11:37.808339 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03c6c110_7db0_4d40_950f_63e925f382eb.slice/crio-2b3e2905f2e49002f7586ed8a27db42e375a881e5b2819b10f43a48b7302c25b WatchSource:0}: Error finding container 2b3e2905f2e49002f7586ed8a27db42e375a881e5b2819b10f43a48b7302c25b: Status 404 returned error can't find the container with id 2b3e2905f2e49002f7586ed8a27db42e375a881e5b2819b10f43a48b7302c25b Apr 17 17:11:38.274855 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:38.274820 2574 generic.go:358] "Generic (PLEG): container finished" podID="03c6c110-7db0-4d40-950f-63e925f382eb" containerID="07583bd699c5a8f9734c17af19cd4a15db2cd0455f6b6f9970892a5769af448c" exitCode=0 Apr 17 17:11:38.275259 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:38.274908 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03c6c110-7db0-4d40-950f-63e925f382eb","Type":"ContainerDied","Data":"07583bd699c5a8f9734c17af19cd4a15db2cd0455f6b6f9970892a5769af448c"} Apr 17 17:11:38.275259 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:38.274942 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03c6c110-7db0-4d40-950f-63e925f382eb","Type":"ContainerStarted","Data":"2b3e2905f2e49002f7586ed8a27db42e375a881e5b2819b10f43a48b7302c25b"} Apr 17 17:11:38.436617 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:38.436574 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2197f45c-d49b-4455-816f-9fb39452dd65" 
path="/var/lib/kubelet/pods/2197f45c-d49b-4455-816f-9fb39452dd65/volumes" Apr 17 17:11:39.282523 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.282483 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03c6c110-7db0-4d40-950f-63e925f382eb","Type":"ContainerStarted","Data":"bacbf94f82a81b0846124c0c374faa1023b8e952d9b140038ef46d14ae2e65ba"} Apr 17 17:11:39.282523 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.282522 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03c6c110-7db0-4d40-950f-63e925f382eb","Type":"ContainerStarted","Data":"800bee5473cf5dcccad75d723e9277cdb25a06c9f0384d2db0ec05a4b3805d67"} Apr 17 17:11:39.282523 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.282533 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03c6c110-7db0-4d40-950f-63e925f382eb","Type":"ContainerStarted","Data":"b7a66abeda8d00d0b77971e3cdecbb809477eed263bda7f740f6b33a6adfeba9"} Apr 17 17:11:39.283088 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.282541 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03c6c110-7db0-4d40-950f-63e925f382eb","Type":"ContainerStarted","Data":"521caa2ba54fc91e7ce1e153f0133bf92290340c35442752c2614aded10e3e05"} Apr 17 17:11:39.283088 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.282548 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"03c6c110-7db0-4d40-950f-63e925f382eb","Type":"ContainerStarted","Data":"b47eb7f3abe505d507196de942422ba258c0e89bcac0db9e168859410fbdf8e0"} Apr 17 17:11:39.283088 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.282556 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"03c6c110-7db0-4d40-950f-63e925f382eb","Type":"ContainerStarted","Data":"33d11961335264ae5feb8a340a07b0f22043cb94e806232a648b2bc67d1ae10d"} Apr 17 17:11:39.309358 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.309301 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.309281845 podStartE2EDuration="2.309281845s" podCreationTimestamp="2026-04-17 17:11:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:11:39.306842601 +0000 UTC m=+237.379429795" watchObservedRunningTime="2026-04-17 17:11:39.309281845 +0000 UTC m=+237.381869040" Apr 17 17:11:39.646798 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.646712 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-748749ff56-4pjhb"] Apr 17 17:11:39.650537 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.650515 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" Apr 17 17:11:39.653172 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.653145 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 17 17:11:39.653273 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.653182 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 17 17:11:39.653273 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.653190 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-w6xfj\"" Apr 17 17:11:39.653273 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.653226 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 17 17:11:39.653523 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.653508 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 17 17:11:39.653591 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.653555 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 17 17:11:39.658279 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.658260 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 17 17:11:39.662673 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.662654 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-748749ff56-4pjhb"] Apr 17 17:11:39.727807 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.727771 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47318264-9cd9-4f06-b7d4-17db7406977f-serving-certs-ca-bundle\") pod \"telemeter-client-748749ff56-4pjhb\" (UID: \"47318264-9cd9-4f06-b7d4-17db7406977f\") " pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" Apr 17 17:11:39.727807 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.727817 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/47318264-9cd9-4f06-b7d4-17db7406977f-federate-client-tls\") pod \"telemeter-client-748749ff56-4pjhb\" (UID: \"47318264-9cd9-4f06-b7d4-17db7406977f\") " pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" Apr 17 17:11:39.728071 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.727841 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/47318264-9cd9-4f06-b7d4-17db7406977f-metrics-client-ca\") pod \"telemeter-client-748749ff56-4pjhb\" (UID: \"47318264-9cd9-4f06-b7d4-17db7406977f\") " pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" Apr 17 17:11:39.728071 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.727859 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjt89\" (UniqueName: \"kubernetes.io/projected/47318264-9cd9-4f06-b7d4-17db7406977f-kube-api-access-vjt89\") pod \"telemeter-client-748749ff56-4pjhb\" (UID: \"47318264-9cd9-4f06-b7d4-17db7406977f\") " pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" Apr 17 17:11:39.728071 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.727895 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/47318264-9cd9-4f06-b7d4-17db7406977f-secret-telemeter-client\") pod 
\"telemeter-client-748749ff56-4pjhb\" (UID: \"47318264-9cd9-4f06-b7d4-17db7406977f\") " pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" Apr 17 17:11:39.728071 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.727925 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/47318264-9cd9-4f06-b7d4-17db7406977f-telemeter-client-tls\") pod \"telemeter-client-748749ff56-4pjhb\" (UID: \"47318264-9cd9-4f06-b7d4-17db7406977f\") " pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" Apr 17 17:11:39.728071 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.728014 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/47318264-9cd9-4f06-b7d4-17db7406977f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-748749ff56-4pjhb\" (UID: \"47318264-9cd9-4f06-b7d4-17db7406977f\") " pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" Apr 17 17:11:39.728254 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.728069 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47318264-9cd9-4f06-b7d4-17db7406977f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-748749ff56-4pjhb\" (UID: \"47318264-9cd9-4f06-b7d4-17db7406977f\") " pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" Apr 17 17:11:39.829136 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.829105 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/47318264-9cd9-4f06-b7d4-17db7406977f-telemeter-client-tls\") pod \"telemeter-client-748749ff56-4pjhb\" (UID: \"47318264-9cd9-4f06-b7d4-17db7406977f\") " 
pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" Apr 17 17:11:39.829325 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.829145 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/47318264-9cd9-4f06-b7d4-17db7406977f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-748749ff56-4pjhb\" (UID: \"47318264-9cd9-4f06-b7d4-17db7406977f\") " pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" Apr 17 17:11:39.829325 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.829173 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47318264-9cd9-4f06-b7d4-17db7406977f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-748749ff56-4pjhb\" (UID: \"47318264-9cd9-4f06-b7d4-17db7406977f\") " pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" Apr 17 17:11:39.829325 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.829209 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47318264-9cd9-4f06-b7d4-17db7406977f-serving-certs-ca-bundle\") pod \"telemeter-client-748749ff56-4pjhb\" (UID: \"47318264-9cd9-4f06-b7d4-17db7406977f\") " pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" Apr 17 17:11:39.829325 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.829231 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/47318264-9cd9-4f06-b7d4-17db7406977f-federate-client-tls\") pod \"telemeter-client-748749ff56-4pjhb\" (UID: \"47318264-9cd9-4f06-b7d4-17db7406977f\") " pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" Apr 17 17:11:39.829325 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.829253 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/47318264-9cd9-4f06-b7d4-17db7406977f-metrics-client-ca\") pod \"telemeter-client-748749ff56-4pjhb\" (UID: \"47318264-9cd9-4f06-b7d4-17db7406977f\") " pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" Apr 17 17:11:39.829572 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.829421 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjt89\" (UniqueName: \"kubernetes.io/projected/47318264-9cd9-4f06-b7d4-17db7406977f-kube-api-access-vjt89\") pod \"telemeter-client-748749ff56-4pjhb\" (UID: \"47318264-9cd9-4f06-b7d4-17db7406977f\") " pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" Apr 17 17:11:39.829572 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.829490 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/47318264-9cd9-4f06-b7d4-17db7406977f-secret-telemeter-client\") pod \"telemeter-client-748749ff56-4pjhb\" (UID: \"47318264-9cd9-4f06-b7d4-17db7406977f\") " pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" Apr 17 17:11:39.830150 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.830121 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/47318264-9cd9-4f06-b7d4-17db7406977f-metrics-client-ca\") pod \"telemeter-client-748749ff56-4pjhb\" (UID: \"47318264-9cd9-4f06-b7d4-17db7406977f\") " pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" Apr 17 17:11:39.830258 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.830237 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47318264-9cd9-4f06-b7d4-17db7406977f-serving-certs-ca-bundle\") pod \"telemeter-client-748749ff56-4pjhb\" (UID: 
\"47318264-9cd9-4f06-b7d4-17db7406977f\") " pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" Apr 17 17:11:39.830387 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.830362 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/47318264-9cd9-4f06-b7d4-17db7406977f-telemeter-trusted-ca-bundle\") pod \"telemeter-client-748749ff56-4pjhb\" (UID: \"47318264-9cd9-4f06-b7d4-17db7406977f\") " pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" Apr 17 17:11:39.831888 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.831865 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/47318264-9cd9-4f06-b7d4-17db7406977f-federate-client-tls\") pod \"telemeter-client-748749ff56-4pjhb\" (UID: \"47318264-9cd9-4f06-b7d4-17db7406977f\") " pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" Apr 17 17:11:39.831979 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.831957 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/47318264-9cd9-4f06-b7d4-17db7406977f-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-748749ff56-4pjhb\" (UID: \"47318264-9cd9-4f06-b7d4-17db7406977f\") " pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" Apr 17 17:11:39.832074 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.832055 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/47318264-9cd9-4f06-b7d4-17db7406977f-telemeter-client-tls\") pod \"telemeter-client-748749ff56-4pjhb\" (UID: \"47318264-9cd9-4f06-b7d4-17db7406977f\") " pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" Apr 17 17:11:39.832112 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.832084 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/47318264-9cd9-4f06-b7d4-17db7406977f-secret-telemeter-client\") pod \"telemeter-client-748749ff56-4pjhb\" (UID: \"47318264-9cd9-4f06-b7d4-17db7406977f\") " pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" Apr 17 17:11:39.837578 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.837559 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjt89\" (UniqueName: \"kubernetes.io/projected/47318264-9cd9-4f06-b7d4-17db7406977f-kube-api-access-vjt89\") pod \"telemeter-client-748749ff56-4pjhb\" (UID: \"47318264-9cd9-4f06-b7d4-17db7406977f\") " pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" Apr 17 17:11:39.928662 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.928630 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:11:39.929304 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.929209 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerName="prometheus" containerID="cri-o://0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110" gracePeriod=600 Apr 17 17:11:39.929304 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.929245 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerName="config-reloader" containerID="cri-o://c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1" gracePeriod=600 Apr 17 17:11:39.929304 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.929254 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerName="thanos-sidecar" 
containerID="cri-o://0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5" gracePeriod=600 Apr 17 17:11:39.929552 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.929258 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerName="kube-rbac-proxy" containerID="cri-o://5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62" gracePeriod=600 Apr 17 17:11:39.929552 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.929212 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerName="kube-rbac-proxy-thanos" containerID="cri-o://dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3" gracePeriod=600 Apr 17 17:11:39.929552 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.929233 2574 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerName="kube-rbac-proxy-web" containerID="cri-o://20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e" gracePeriod=600 Apr 17 17:11:39.960776 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:39.960751 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" Apr 17 17:11:40.093591 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.093493 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-748749ff56-4pjhb"] Apr 17 17:11:40.096165 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:11:40.096138 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47318264_9cd9_4f06_b7d4_17db7406977f.slice/crio-cf7d6b6b90b4f1c738efeac167b06a4f2bc08943ec8a74777930c8b27c4f3d95 WatchSource:0}: Error finding container cf7d6b6b90b4f1c738efeac167b06a4f2bc08943ec8a74777930c8b27c4f3d95: Status 404 returned error can't find the container with id cf7d6b6b90b4f1c738efeac167b06a4f2bc08943ec8a74777930c8b27c4f3d95 Apr 17 17:11:40.173533 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.173511 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:40.232770 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.232673 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xznl7\" (UniqueName: \"kubernetes.io/projected/36d3d403-fa9a-4679-afd9-a6aee6c4b292-kube-api-access-xznl7\") pod \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " Apr 17 17:11:40.232770 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.232725 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-thanos-prometheus-http-client-file\") pod \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " Apr 17 17:11:40.232770 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.232760 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for 
volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-web-config\") pod \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " Apr 17 17:11:40.233010 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.232886 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-prometheus-trusted-ca-bundle\") pod \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " Apr 17 17:11:40.233010 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.232927 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " Apr 17 17:11:40.233010 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.232956 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " Apr 17 17:11:40.233159 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.233079 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/36d3d403-fa9a-4679-afd9-a6aee6c4b292-prometheus-k8s-db\") pod \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " Apr 17 17:11:40.233159 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.233139 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-prometheus-k8s-rulefiles-0\") pod \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " Apr 17 17:11:40.233243 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.233190 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-kube-rbac-proxy\") pod \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " Apr 17 17:11:40.233243 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.233216 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36d3d403-fa9a-4679-afd9-a6aee6c4b292-tls-assets\") pod \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " Apr 17 17:11:40.233336 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.233247 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-config\") pod \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " Apr 17 17:11:40.233336 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.233276 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-configmap-serving-certs-ca-bundle\") pod \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " Apr 17 17:11:40.233514 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.233437 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-configmap-kubelet-serving-ca-bundle\") pod \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " Apr 17 17:11:40.233514 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.233523 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-configmap-metrics-client-ca\") pod \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " Apr 17 17:11:40.234807 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.233579 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-prometheus-k8s-tls\") pod \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " Apr 17 17:11:40.234807 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.233623 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-grpc-tls\") pod \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " Apr 17 17:11:40.234807 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.233654 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-metrics-client-certs\") pod \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " Apr 17 17:11:40.234807 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.233687 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: 
\"kubernetes.io/empty-dir/36d3d403-fa9a-4679-afd9-a6aee6c4b292-config-out\") pod \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\" (UID: \"36d3d403-fa9a-4679-afd9-a6aee6c4b292\") " Apr 17 17:11:40.234807 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.234399 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "36d3d403-fa9a-4679-afd9-a6aee6c4b292" (UID: "36d3d403-fa9a-4679-afd9-a6aee6c4b292"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:11:40.234807 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.234434 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "36d3d403-fa9a-4679-afd9-a6aee6c4b292" (UID: "36d3d403-fa9a-4679-afd9-a6aee6c4b292"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:11:40.235486 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.235456 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36d3d403-fa9a-4679-afd9-a6aee6c4b292-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "36d3d403-fa9a-4679-afd9-a6aee6c4b292" (UID: "36d3d403-fa9a-4679-afd9-a6aee6c4b292"). InnerVolumeSpecName "prometheus-k8s-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:11:40.236367 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.235595 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "36d3d403-fa9a-4679-afd9-a6aee6c4b292" (UID: "36d3d403-fa9a-4679-afd9-a6aee6c4b292"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:11:40.236367 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.236177 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "36d3d403-fa9a-4679-afd9-a6aee6c4b292" (UID: "36d3d403-fa9a-4679-afd9-a6aee6c4b292"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:11:40.236367 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.236218 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "36d3d403-fa9a-4679-afd9-a6aee6c4b292" (UID: "36d3d403-fa9a-4679-afd9-a6aee6c4b292"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:11:40.236367 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.236286 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "36d3d403-fa9a-4679-afd9-a6aee6c4b292" (UID: "36d3d403-fa9a-4679-afd9-a6aee6c4b292"). 
InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:11:40.236695 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.236647 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "36d3d403-fa9a-4679-afd9-a6aee6c4b292" (UID: "36d3d403-fa9a-4679-afd9-a6aee6c4b292"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:11:40.236794 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.236709 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36d3d403-fa9a-4679-afd9-a6aee6c4b292-kube-api-access-xznl7" (OuterVolumeSpecName: "kube-api-access-xznl7") pod "36d3d403-fa9a-4679-afd9-a6aee6c4b292" (UID: "36d3d403-fa9a-4679-afd9-a6aee6c4b292"). InnerVolumeSpecName "kube-api-access-xznl7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:11:40.236976 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.236946 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "36d3d403-fa9a-4679-afd9-a6aee6c4b292" (UID: "36d3d403-fa9a-4679-afd9-a6aee6c4b292"). InnerVolumeSpecName "thanos-prometheus-http-client-file". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:11:40.238341 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.238313 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "36d3d403-fa9a-4679-afd9-a6aee6c4b292" (UID: "36d3d403-fa9a-4679-afd9-a6aee6c4b292"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:11:40.238531 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.238506 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "36d3d403-fa9a-4679-afd9-a6aee6c4b292" (UID: "36d3d403-fa9a-4679-afd9-a6aee6c4b292"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:11:40.238651 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.238565 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-config" (OuterVolumeSpecName: "config") pod "36d3d403-fa9a-4679-afd9-a6aee6c4b292" (UID: "36d3d403-fa9a-4679-afd9-a6aee6c4b292"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:11:40.238720 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.238691 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "36d3d403-fa9a-4679-afd9-a6aee6c4b292" (UID: "36d3d403-fa9a-4679-afd9-a6aee6c4b292"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:11:40.239192 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.239165 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36d3d403-fa9a-4679-afd9-a6aee6c4b292-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "36d3d403-fa9a-4679-afd9-a6aee6c4b292" (UID: "36d3d403-fa9a-4679-afd9-a6aee6c4b292"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:11:40.239401 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.239378 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/36d3d403-fa9a-4679-afd9-a6aee6c4b292-config-out" (OuterVolumeSpecName: "config-out") pod "36d3d403-fa9a-4679-afd9-a6aee6c4b292" (UID: "36d3d403-fa9a-4679-afd9-a6aee6c4b292"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:11:40.239856 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.239837 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "36d3d403-fa9a-4679-afd9-a6aee6c4b292" (UID: "36d3d403-fa9a-4679-afd9-a6aee6c4b292"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:11:40.248104 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.248082 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-web-config" (OuterVolumeSpecName: "web-config") pod "36d3d403-fa9a-4679-afd9-a6aee6c4b292" (UID: "36d3d403-fa9a-4679-afd9-a6aee6c4b292"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:11:40.288313 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.288283 2574 generic.go:358] "Generic (PLEG): container finished" podID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerID="dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3" exitCode=0 Apr 17 17:11:40.288313 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.288307 2574 generic.go:358] "Generic (PLEG): container finished" podID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerID="5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62" exitCode=0 Apr 17 17:11:40.288313 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.288316 2574 generic.go:358] "Generic (PLEG): container finished" podID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerID="20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e" exitCode=0 Apr 17 17:11:40.288313 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.288323 2574 generic.go:358] "Generic (PLEG): container finished" podID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerID="0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5" exitCode=0 Apr 17 17:11:40.288313 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.288329 2574 generic.go:358] "Generic (PLEG): container finished" podID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerID="c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1" exitCode=0 Apr 17 17:11:40.288313 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.288334 2574 generic.go:358] "Generic (PLEG): container finished" podID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerID="0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110" exitCode=0 Apr 17 17:11:40.288984 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.288414 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.288984 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.288410 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"36d3d403-fa9a-4679-afd9-a6aee6c4b292","Type":"ContainerDied","Data":"dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3"}
Apr 17 17:11:40.288984 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.288526 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"36d3d403-fa9a-4679-afd9-a6aee6c4b292","Type":"ContainerDied","Data":"5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62"}
Apr 17 17:11:40.288984 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.288546 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"36d3d403-fa9a-4679-afd9-a6aee6c4b292","Type":"ContainerDied","Data":"20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e"}
Apr 17 17:11:40.288984 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.288560 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"36d3d403-fa9a-4679-afd9-a6aee6c4b292","Type":"ContainerDied","Data":"0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5"}
Apr 17 17:11:40.288984 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.288562 2574 scope.go:117] "RemoveContainer" containerID="dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3"
Apr 17 17:11:40.288984 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.288572 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"36d3d403-fa9a-4679-afd9-a6aee6c4b292","Type":"ContainerDied","Data":"c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1"}
Apr 17 17:11:40.288984 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.288586 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"36d3d403-fa9a-4679-afd9-a6aee6c4b292","Type":"ContainerDied","Data":"0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110"}
Apr 17 17:11:40.288984 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.288599 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"36d3d403-fa9a-4679-afd9-a6aee6c4b292","Type":"ContainerDied","Data":"55e84bfb53fb04dbd0970800db403c6a87806fdbcd77b2782472386e811ee3f3"}
Apr 17 17:11:40.289628 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.289551 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" event={"ID":"47318264-9cd9-4f06-b7d4-17db7406977f","Type":"ContainerStarted","Data":"cf7d6b6b90b4f1c738efeac167b06a4f2bc08943ec8a74777930c8b27c4f3d95"}
Apr 17 17:11:40.299439 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.299357 2574 scope.go:117] "RemoveContainer" containerID="5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62"
Apr 17 17:11:40.306574 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.306554 2574 scope.go:117] "RemoveContainer" containerID="20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e"
Apr 17 17:11:40.311317 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.311296 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 17:11:40.314757 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.314673 2574 scope.go:117] "RemoveContainer" containerID="0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5"
Apr 17 17:11:40.315010 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.314992 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 17:11:40.321159 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.321143 2574 scope.go:117] "RemoveContainer" containerID="c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1"
Apr 17 17:11:40.327372 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.327355 2574 scope.go:117] "RemoveContainer" containerID="0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110"
Apr 17 17:11:40.334346 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.334331 2574 scope.go:117] "RemoveContainer" containerID="0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca"
Apr 17 17:11:40.334632 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.334580 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-prometheus-k8s-tls\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\""
Apr 17 17:11:40.334632 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.334621 2574 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-grpc-tls\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\""
Apr 17 17:11:40.334632 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.334635 2574 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-metrics-client-certs\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\""
Apr 17 17:11:40.334793 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.334644 2574 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/36d3d403-fa9a-4679-afd9-a6aee6c4b292-config-out\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\""
Apr 17 17:11:40.334793 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.334653 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xznl7\" (UniqueName: \"kubernetes.io/projected/36d3d403-fa9a-4679-afd9-a6aee6c4b292-kube-api-access-xznl7\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\""
Apr 17 17:11:40.334793 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.334661 2574 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-thanos-prometheus-http-client-file\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\""
Apr 17 17:11:40.334793 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.334670 2574 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-web-config\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\""
Apr 17 17:11:40.334793 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.334679 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-prometheus-trusted-ca-bundle\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\""
Apr 17 17:11:40.334793 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.334689 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\""
Apr 17 17:11:40.334793 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.334698 2574 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\""
Apr 17 17:11:40.334793 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.334707 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/36d3d403-fa9a-4679-afd9-a6aee6c4b292-prometheus-k8s-db\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\""
Apr 17 17:11:40.334793 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.334715 2574 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\""
Apr 17 17:11:40.334793 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.334728 2574 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-secret-kube-rbac-proxy\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\""
Apr 17 17:11:40.334793 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.334742 2574 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/36d3d403-fa9a-4679-afd9-a6aee6c4b292-tls-assets\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\""
Apr 17 17:11:40.334793 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.334754 2574 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/36d3d403-fa9a-4679-afd9-a6aee6c4b292-config\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\""
Apr 17 17:11:40.334793 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.334767 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\""
Apr 17 17:11:40.334793 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.334782 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\""
Apr 17 17:11:40.334793 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.334795 2574 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/36d3d403-fa9a-4679-afd9-a6aee6c4b292-configmap-metrics-client-ca\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\""
Apr 17 17:11:40.339067 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.339045 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 17:11:40.339385 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.339371 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerName="init-config-reloader"
Apr 17 17:11:40.339449 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.339389 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerName="init-config-reloader"
Apr 17 17:11:40.339449 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.339402 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerName="config-reloader"
Apr 17 17:11:40.339449 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.339410 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerName="config-reloader"
Apr 17 17:11:40.339449 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.339421 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerName="kube-rbac-proxy"
Apr 17 17:11:40.339449 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.339430 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerName="kube-rbac-proxy"
Apr 17 17:11:40.339722 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.339451 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerName="kube-rbac-proxy-thanos"
Apr 17 17:11:40.339722 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.339460 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerName="kube-rbac-proxy-thanos"
Apr 17 17:11:40.339722 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.339472 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerName="prometheus"
Apr 17 17:11:40.339722 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.339479 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerName="prometheus"
Apr 17 17:11:40.339722 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.339495 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerName="thanos-sidecar"
Apr 17 17:11:40.339722 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.339503 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerName="thanos-sidecar"
Apr 17 17:11:40.339722 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.339513 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerName="kube-rbac-proxy-web"
Apr 17 17:11:40.339722 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.339521 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerName="kube-rbac-proxy-web"
Apr 17 17:11:40.339722 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.339650 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerName="config-reloader"
Apr 17 17:11:40.339722 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.339665 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerName="kube-rbac-proxy-thanos"
Apr 17 17:11:40.339722 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.339678 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerName="kube-rbac-proxy-web"
Apr 17 17:11:40.339722 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.339688 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerName="thanos-sidecar"
Apr 17 17:11:40.339722 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.339698 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerName="kube-rbac-proxy"
Apr 17 17:11:40.339722 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.339710 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" containerName="prometheus"
Apr 17 17:11:40.344284 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.341708 2574 scope.go:117] "RemoveContainer" containerID="dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3"
Apr 17 17:11:40.344284 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:11:40.342002 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3\": container with ID starting with dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3 not found: ID does not exist" containerID="dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3"
Apr 17 17:11:40.344284 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.342031 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3"} err="failed to get container status \"dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3\": rpc error: code = NotFound desc = could not find container \"dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3\": container with ID starting with dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3 not found: ID does not exist"
Apr 17 17:11:40.344284 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.342056 2574 scope.go:117] "RemoveContainer" containerID="5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62"
Apr 17 17:11:40.344284 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:11:40.342291 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62\": container with ID starting with 5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62 not found: ID does not exist" containerID="5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62"
Apr 17 17:11:40.344284 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.342317 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62"} err="failed to get container status \"5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62\": rpc error: code = NotFound desc = could not find container \"5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62\": container with ID starting with 5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62 not found: ID does not exist"
Apr 17 17:11:40.344284 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.342344 2574 scope.go:117] "RemoveContainer" containerID="20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e"
Apr 17 17:11:40.344284 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:11:40.342547 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e\": container with ID starting with 20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e not found: ID does not exist" containerID="20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e"
Apr 17 17:11:40.344284 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.342577 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e"} err="failed to get container status \"20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e\": rpc error: code = NotFound desc = could not find container \"20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e\": container with ID starting with 20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e not found: ID does not exist"
Apr 17 17:11:40.344284 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.342598 2574 scope.go:117] "RemoveContainer" containerID="0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5"
Apr 17 17:11:40.344284 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:11:40.342837 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5\": container with ID starting with 0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5 not found: ID does not exist" containerID="0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5"
Apr 17 17:11:40.344284 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.342861 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5"} err="failed to get container status \"0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5\": rpc error: code = NotFound desc = could not find container \"0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5\": container with ID starting with 0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5 not found: ID does not exist"
Apr 17 17:11:40.344284 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.342881 2574 scope.go:117] "RemoveContainer" containerID="c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1"
Apr 17 17:11:40.344284 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:11:40.343097 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1\": container with ID starting with c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1 not found: ID does not exist" containerID="c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1"
Apr 17 17:11:40.344284 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.343119 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1"} err="failed to get container status \"c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1\": rpc error: code = NotFound desc = could not find container \"c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1\": container with ID starting with c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1 not found: ID does not exist"
Apr 17 17:11:40.344284 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.343136 2574 scope.go:117] "RemoveContainer" containerID="0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110"
Apr 17 17:11:40.345029 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:11:40.343346 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110\": container with ID starting with 0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110 not found: ID does not exist" containerID="0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110"
Apr 17 17:11:40.345029 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.343377 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110"} err="failed to get container status \"0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110\": rpc error: code = NotFound desc = could not find container \"0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110\": container with ID starting with 0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110 not found: ID does not exist"
Apr 17 17:11:40.345029 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.343395 2574 scope.go:117] "RemoveContainer" containerID="0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca"
Apr 17 17:11:40.345029 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:11:40.343626 2574 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca\": container with ID starting with 0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca not found: ID does not exist" containerID="0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca"
Apr 17 17:11:40.345029 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.343658 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca"} err="failed to get container status \"0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca\": rpc error: code = NotFound desc = could not find container \"0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca\": container with ID starting with 0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca not found: ID does not exist"
Apr 17 17:11:40.345029 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.343676 2574 scope.go:117] "RemoveContainer" containerID="dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3"
Apr 17 17:11:40.345029 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.343896 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3"} err="failed to get container status \"dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3\": rpc error: code = NotFound desc = could not find container \"dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3\": container with ID starting with dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3 not found: ID does not exist"
Apr 17 17:11:40.345029 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.343917 2574 scope.go:117] "RemoveContainer" containerID="5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62"
Apr 17 17:11:40.345029 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.344145 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62"} err="failed to get container status \"5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62\": rpc error: code = NotFound desc = could not find container \"5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62\": container with ID starting with 5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62 not found: ID does not exist"
Apr 17 17:11:40.345029 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.344164 2574 scope.go:117] "RemoveContainer" containerID="20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e"
Apr 17 17:11:40.345029 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.344400 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e"} err="failed to get container status \"20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e\": rpc error: code = NotFound desc = could not find container \"20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e\": container with ID starting with 20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e not found: ID does not exist"
Apr 17 17:11:40.345029 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.344422 2574 scope.go:117] "RemoveContainer" containerID="0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5"
Apr 17 17:11:40.345029 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.344673 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5"} err="failed to get container status \"0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5\": rpc error: code = NotFound desc = could not find container \"0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5\": container with ID starting with 0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5 not found: ID does not exist"
Apr 17 17:11:40.345029 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.344709 2574 scope.go:117] "RemoveContainer" containerID="c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1"
Apr 17 17:11:40.345029 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.344972 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1"} err="failed to get container status \"c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1\": rpc error: code = NotFound desc = could not find container \"c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1\": container with ID starting with c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1 not found: ID does not exist"
Apr 17 17:11:40.345029 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.344994 2574 scope.go:117] "RemoveContainer" containerID="0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110"
Apr 17 17:11:40.345705 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.345303 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110"} err="failed to get container status \"0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110\": rpc error: code = NotFound desc = could not find container \"0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110\": container with ID starting with 0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110 not found: ID does not exist"
Apr 17 17:11:40.345705 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.345339 2574 scope.go:117] "RemoveContainer" containerID="0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca"
Apr 17 17:11:40.345705 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.345593 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca"} err="failed to get container status \"0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca\": rpc error: code = NotFound desc = could not find container \"0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca\": container with ID starting with 0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca not found: ID does not exist"
Apr 17 17:11:40.345705 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.345634 2574 scope.go:117] "RemoveContainer" containerID="dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3"
Apr 17 17:11:40.346235 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.345870 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3"} err="failed to get container status \"dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3\": rpc error: code = NotFound desc = could not find container \"dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3\": container with ID starting with dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3 not found: ID does not exist"
Apr 17 17:11:40.346235 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.345892 2574 scope.go:117] "RemoveContainer" containerID="5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62"
Apr 17 17:11:40.346235 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.346094 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62"} err="failed to get container status \"5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62\": rpc error: code = NotFound desc = could not find container \"5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62\": container with ID starting with 5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62 not found: ID does not exist"
Apr 17 17:11:40.346235 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.346113 2574 scope.go:117] "RemoveContainer" containerID="20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e"
Apr 17 17:11:40.346827 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.346807 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e"} err="failed to get container status \"20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e\": rpc error: code = NotFound desc = could not find container \"20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e\": container with ID starting with 20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e not found: ID does not exist"
Apr 17 17:11:40.346896 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.346828 2574 scope.go:117] "RemoveContainer" containerID="0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5"
Apr 17 17:11:40.347083 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.347065 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5"} err="failed to get container status \"0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5\": rpc error: code = NotFound desc = could not find container \"0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5\": container with ID starting with 0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5 not found: ID does not exist"
Apr 17 17:11:40.347127 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.347092 2574 scope.go:117] "RemoveContainer" containerID="c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1"
Apr 17 17:11:40.347322 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.347304 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1"} err="failed to get container status \"c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1\": rpc error: code = NotFound desc = could not find container \"c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1\": container with ID starting with c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1 not found: ID does not exist"
Apr 17 17:11:40.347387 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.347325 2574 scope.go:117] "RemoveContainer" containerID="0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110"
Apr 17 17:11:40.347564 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.347540 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110"} err="failed to get container status \"0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110\": rpc error: code = NotFound desc = could not find container \"0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110\": container with ID starting with 0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110 not found: ID does not exist"
Apr 17 17:11:40.347564 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.347563 2574 scope.go:117] "RemoveContainer" containerID="0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca"
Apr 17 17:11:40.347765 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.347750 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca"} err="failed to get container status \"0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca\": rpc error: code = NotFound desc = could not find container \"0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca\": container with ID starting with 0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca not found: ID does not exist"
Apr 17 17:11:40.347802 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.347765 2574 scope.go:117] "RemoveContainer" containerID="dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3"
Apr 17 17:11:40.347947 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.347932 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3"} err="failed to get container status \"dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3\": rpc error: code = NotFound desc = could not find container \"dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3\": container with ID starting with dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3 not found: ID does not exist"
Apr 17 17:11:40.347988 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.347947 2574 scope.go:117] "RemoveContainer" containerID="5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62"
Apr 17 17:11:40.348157 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.348127 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62"} err="failed to get container status \"5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62\": rpc error: code = NotFound desc = could not find container \"5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62\": container with ID starting with 5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62 not found: ID does not exist"
Apr 17 17:11:40.348221 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.348162 2574 scope.go:117] "RemoveContainer" containerID="20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e"
Apr 17 17:11:40.348467 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.348442 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e"} err="failed to get container status \"20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e\": rpc error: code = NotFound desc = could not find container \"20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e\": container with ID starting with 20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e not found: ID does not exist"
Apr 17 17:11:40.348515 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.348472 2574 scope.go:117] "RemoveContainer" containerID="0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5"
Apr 17 17:11:40.348906 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.348887 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5"} err="failed to get container status \"0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5\": rpc error: code = NotFound desc = could not find container \"0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5\": container with ID starting with 0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5 not found: ID does not exist"
Apr 17 17:11:40.348991 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.348906 2574 scope.go:117] "RemoveContainer" containerID="c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1"
Apr 17 17:11:40.349169 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.349152 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1"} err="failed to get container status \"c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1\": rpc error: code = NotFound desc = could not find container \"c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1\": container with ID starting with c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1 not found: ID does not exist"
Apr 17 17:11:40.349238 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.349171 2574 scope.go:117] "RemoveContainer" containerID="0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110"
Apr 17 17:11:40.349238 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.349173 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.349566 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.349407 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110"} err="failed to get container status \"0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110\": rpc error: code = NotFound desc = could not find container \"0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110\": container with ID starting with 0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110 not found: ID does not exist"
Apr 17 17:11:40.349566 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.349433 2574 scope.go:117] "RemoveContainer" containerID="0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca"
Apr 17 17:11:40.349837 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.349816 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca"} err="failed to get container status \"0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca\": rpc error: code = NotFound desc = could not find container \"0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca\": container with ID starting with 0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca not found: ID does not exist"
Apr 17 17:11:40.349911 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.349838 2574 scope.go:117] "RemoveContainer" containerID="dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3"
Apr 17 17:11:40.350096 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.350075 2574 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3"} err="failed to get container status \"dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3\": rpc error: code = NotFound desc = could not find container \"dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3\": container with ID starting with dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3 not found: ID does not exist" Apr 17 17:11:40.350096 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.350096 2574 scope.go:117] "RemoveContainer" containerID="5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62" Apr 17 17:11:40.350348 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.350311 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62"} err="failed to get container status \"5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62\": rpc error: code = NotFound desc = could not find container \"5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62\": container with ID starting with 5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62 not found: ID does not exist" Apr 17 17:11:40.350439 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.350350 2574 scope.go:117] "RemoveContainer" containerID="20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e" Apr 17 17:11:40.350573 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.350552 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e"} err="failed to get container status \"20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e\": rpc error: code = NotFound desc = could not find container 
\"20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e\": container with ID starting with 20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e not found: ID does not exist" Apr 17 17:11:40.350648 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.350574 2574 scope.go:117] "RemoveContainer" containerID="0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5" Apr 17 17:11:40.350824 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.350805 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5"} err="failed to get container status \"0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5\": rpc error: code = NotFound desc = could not find container \"0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5\": container with ID starting with 0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5 not found: ID does not exist" Apr 17 17:11:40.350870 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.350825 2574 scope.go:117] "RemoveContainer" containerID="c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1" Apr 17 17:11:40.351011 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.350992 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1"} err="failed to get container status \"c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1\": rpc error: code = NotFound desc = could not find container \"c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1\": container with ID starting with c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1 not found: ID does not exist" Apr 17 17:11:40.351066 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.351011 2574 scope.go:117] "RemoveContainer" 
containerID="0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110" Apr 17 17:11:40.351221 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.351199 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110"} err="failed to get container status \"0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110\": rpc error: code = NotFound desc = could not find container \"0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110\": container with ID starting with 0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110 not found: ID does not exist" Apr 17 17:11:40.351257 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.351223 2574 scope.go:117] "RemoveContainer" containerID="0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca" Apr 17 17:11:40.351436 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.351418 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca"} err="failed to get container status \"0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca\": rpc error: code = NotFound desc = could not find container \"0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca\": container with ID starting with 0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca not found: ID does not exist" Apr 17 17:11:40.351498 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.351438 2574 scope.go:117] "RemoveContainer" containerID="dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3" Apr 17 17:11:40.351670 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.351649 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3"} err="failed to get container status 
\"dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3\": rpc error: code = NotFound desc = could not find container \"dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3\": container with ID starting with dcd1a87bdea7e4cfce828546d3db4e6e0190b83179411dbbb0d724335e48daf3 not found: ID does not exist" Apr 17 17:11:40.351730 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.351670 2574 scope.go:117] "RemoveContainer" containerID="5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62" Apr 17 17:11:40.351945 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.351930 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 17 17:11:40.352021 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.351919 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62"} err="failed to get container status \"5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62\": rpc error: code = NotFound desc = could not find container \"5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62\": container with ID starting with 5eed2f8a3094cc49c2d2904427592a585eeb256b3ff048819b744a3ceefecc62 not found: ID does not exist" Apr 17 17:11:40.352021 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.351969 2574 scope.go:117] "RemoveContainer" containerID="20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e" Apr 17 17:11:40.352021 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.351992 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-rs8sk\"" Apr 17 17:11:40.352158 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.352068 2574 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-6tftur1in7ftc\"" Apr 17 17:11:40.352158 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.352087 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 17 17:11:40.352251 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.352231 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 17 17:11:40.352400 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.352375 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e"} err="failed to get container status \"20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e\": rpc error: code = NotFound desc = could not find container \"20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e\": container with ID starting with 20c7034ee11177051752be13420494cbdcd439c13ecc8fc9d05008001bf9494e not found: ID does not exist" Apr 17 17:11:40.352466 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.352402 2574 scope.go:117] "RemoveContainer" containerID="0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5" Apr 17 17:11:40.352466 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.352436 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 17 17:11:40.352556 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.352461 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 17 17:11:40.352556 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.352473 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 17 17:11:40.352761 
ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.352652 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 17 17:11:40.352808 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.352750 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5"} err="failed to get container status \"0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5\": rpc error: code = NotFound desc = could not find container \"0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5\": container with ID starting with 0b32e9d6f3c6de0483e8ce9129571d15d889670d61f44852f41104419dcd4af5 not found: ID does not exist" Apr 17 17:11:40.352808 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.352772 2574 scope.go:117] "RemoveContainer" containerID="c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1" Apr 17 17:11:40.352881 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.352811 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 17 17:11:40.352881 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.352847 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 17 17:11:40.353122 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.353053 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1"} err="failed to get container status \"c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1\": rpc error: code = NotFound desc = could not find container \"c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1\": container with ID starting with 
c28c573414b85d326d0b368d268c605aa00a9e2938b74d3743cbdccb9e1dd2f1 not found: ID does not exist" Apr 17 17:11:40.353122 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.353121 2574 scope.go:117] "RemoveContainer" containerID="0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110" Apr 17 17:11:40.353300 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.353283 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 17 17:11:40.353772 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.353732 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110"} err="failed to get container status \"0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110\": rpc error: code = NotFound desc = could not find container \"0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110\": container with ID starting with 0234c21f1edfab7722a854f0230d31a9e20c44a840b8a5ac5043c53f8c5cc110 not found: ID does not exist" Apr 17 17:11:40.353772 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.353757 2574 scope.go:117] "RemoveContainer" containerID="0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca" Apr 17 17:11:40.355290 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.354964 2574 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca"} err="failed to get container status \"0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca\": rpc error: code = NotFound desc = could not find container \"0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca\": container with ID starting with 0f52b8edae6e1cdbd8ab0e3d5ec1009a6a15bb6c3389e0e6a14d469f068422ca not found: ID does not exist" Apr 17 17:11:40.356853 ip-10-0-132-98 kubenswrapper[2574]: I0417 
17:11:40.356823 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 17 17:11:40.357220 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.357201 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 17 17:11:40.358342 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.358040 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 17 17:11:40.435187 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.435157 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/996f172c-64c2-4d8e-a0da-d47aa9247ebf-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:40.435187 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.435188 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/996f172c-64c2-4d8e-a0da-d47aa9247ebf-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:40.435440 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.435238 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/996f172c-64c2-4d8e-a0da-d47aa9247ebf-web-config\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:40.435440 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.435280 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/996f172c-64c2-4d8e-a0da-d47aa9247ebf-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:40.435440 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.435300 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/996f172c-64c2-4d8e-a0da-d47aa9247ebf-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:40.435440 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.435329 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/996f172c-64c2-4d8e-a0da-d47aa9247ebf-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:40.435440 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.435367 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/996f172c-64c2-4d8e-a0da-d47aa9247ebf-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:40.435440 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.435392 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-88slz\" (UniqueName: \"kubernetes.io/projected/996f172c-64c2-4d8e-a0da-d47aa9247ebf-kube-api-access-88slz\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:40.435440 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.435421 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/996f172c-64c2-4d8e-a0da-d47aa9247ebf-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:40.435842 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.435563 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/996f172c-64c2-4d8e-a0da-d47aa9247ebf-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:40.435842 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.435589 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/996f172c-64c2-4d8e-a0da-d47aa9247ebf-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:40.435842 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.435641 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/996f172c-64c2-4d8e-a0da-d47aa9247ebf-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:40.435842 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.435670 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" 
(UniqueName: \"kubernetes.io/projected/996f172c-64c2-4d8e-a0da-d47aa9247ebf-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:40.435842 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.435796 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/996f172c-64c2-4d8e-a0da-d47aa9247ebf-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:40.436071 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.435855 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/996f172c-64c2-4d8e-a0da-d47aa9247ebf-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:40.436071 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.435886 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/996f172c-64c2-4d8e-a0da-d47aa9247ebf-config\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:40.436071 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.435948 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/996f172c-64c2-4d8e-a0da-d47aa9247ebf-config-out\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:40.436071 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.435979 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/996f172c-64c2-4d8e-a0da-d47aa9247ebf-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:40.436632 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.436575 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36d3d403-fa9a-4679-afd9-a6aee6c4b292" path="/var/lib/kubelet/pods/36d3d403-fa9a-4679-afd9-a6aee6c4b292/volumes" Apr 17 17:11:40.537454 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.537365 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/996f172c-64c2-4d8e-a0da-d47aa9247ebf-config-out\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:40.537454 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.537406 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/996f172c-64c2-4d8e-a0da-d47aa9247ebf-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:40.537454 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.537447 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/996f172c-64c2-4d8e-a0da-d47aa9247ebf-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:40.537750 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.537468 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/996f172c-64c2-4d8e-a0da-d47aa9247ebf-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:40.537750 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.537490 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/996f172c-64c2-4d8e-a0da-d47aa9247ebf-web-config\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:40.537750 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.537517 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/996f172c-64c2-4d8e-a0da-d47aa9247ebf-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:40.537750 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.537536 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/996f172c-64c2-4d8e-a0da-d47aa9247ebf-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:40.537750 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.537554 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/996f172c-64c2-4d8e-a0da-d47aa9247ebf-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 17 17:11:40.537750 ip-10-0-132-98 
kubenswrapper[2574]: I0417 17:11:40.537748 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/996f172c-64c2-4d8e-a0da-d47aa9247ebf-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.538026 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.537779 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-88slz\" (UniqueName: \"kubernetes.io/projected/996f172c-64c2-4d8e-a0da-d47aa9247ebf-kube-api-access-88slz\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.538026 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.537864 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/996f172c-64c2-4d8e-a0da-d47aa9247ebf-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.538026 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.537938 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/996f172c-64c2-4d8e-a0da-d47aa9247ebf-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.538026 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.537993 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/996f172c-64c2-4d8e-a0da-d47aa9247ebf-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.538214 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.538036 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/996f172c-64c2-4d8e-a0da-d47aa9247ebf-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.538214 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.538090 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/996f172c-64c2-4d8e-a0da-d47aa9247ebf-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.538214 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.538121 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/996f172c-64c2-4d8e-a0da-d47aa9247ebf-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.538214 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.538194 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/996f172c-64c2-4d8e-a0da-d47aa9247ebf-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.538388 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.538231 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/996f172c-64c2-4d8e-a0da-d47aa9247ebf-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.539563 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.539173 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/996f172c-64c2-4d8e-a0da-d47aa9247ebf-config\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.540081 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.540054 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/996f172c-64c2-4d8e-a0da-d47aa9247ebf-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.540440 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.540417 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/996f172c-64c2-4d8e-a0da-d47aa9247ebf-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.540544 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.538927 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/996f172c-64c2-4d8e-a0da-d47aa9247ebf-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.541003 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.540939 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/996f172c-64c2-4d8e-a0da-d47aa9247ebf-config-out\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.541273 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.541226 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/996f172c-64c2-4d8e-a0da-d47aa9247ebf-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.541471 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.541447 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/996f172c-64c2-4d8e-a0da-d47aa9247ebf-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.541899 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.541875 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/996f172c-64c2-4d8e-a0da-d47aa9247ebf-web-config\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.542285 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.542261 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/996f172c-64c2-4d8e-a0da-d47aa9247ebf-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.542468 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.542424 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/996f172c-64c2-4d8e-a0da-d47aa9247ebf-config\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.542899 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.542877 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/996f172c-64c2-4d8e-a0da-d47aa9247ebf-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.543046 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.543027 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/996f172c-64c2-4d8e-a0da-d47aa9247ebf-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.543215 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.543195 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/996f172c-64c2-4d8e-a0da-d47aa9247ebf-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.543270 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.543257 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/996f172c-64c2-4d8e-a0da-d47aa9247ebf-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.543305 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.543283 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/996f172c-64c2-4d8e-a0da-d47aa9247ebf-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.543747 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.543735 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/996f172c-64c2-4d8e-a0da-d47aa9247ebf-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.544369 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.544349 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/996f172c-64c2-4d8e-a0da-d47aa9247ebf-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.546325 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.546304 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-88slz\" (UniqueName: \"kubernetes.io/projected/996f172c-64c2-4d8e-a0da-d47aa9247ebf-kube-api-access-88slz\") pod \"prometheus-k8s-0\" (UID: \"996f172c-64c2-4d8e-a0da-d47aa9247ebf\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.661214 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.661175 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:11:40.792937 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:40.792905 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 17 17:11:40.795957 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:11:40.795926 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod996f172c_64c2_4d8e_a0da_d47aa9247ebf.slice/crio-c1a0af840bc0f377f22f3b63c400b64bc99b5ee1f71563d544517612b11c67bd WatchSource:0}: Error finding container c1a0af840bc0f377f22f3b63c400b64bc99b5ee1f71563d544517612b11c67bd: Status 404 returned error can't find the container with id c1a0af840bc0f377f22f3b63c400b64bc99b5ee1f71563d544517612b11c67bd
Apr 17 17:11:41.293680 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:41.293645 2574 generic.go:358] "Generic (PLEG): container finished" podID="996f172c-64c2-4d8e-a0da-d47aa9247ebf" containerID="8908139f0264e3c7c10062e2e8b622441ce4605817898cde5b898c72c8377701" exitCode=0
Apr 17 17:11:41.294172 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:41.293732 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"996f172c-64c2-4d8e-a0da-d47aa9247ebf","Type":"ContainerDied","Data":"8908139f0264e3c7c10062e2e8b622441ce4605817898cde5b898c72c8377701"}
Apr 17 17:11:41.294172 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:41.293769 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"996f172c-64c2-4d8e-a0da-d47aa9247ebf","Type":"ContainerStarted","Data":"c1a0af840bc0f377f22f3b63c400b64bc99b5ee1f71563d544517612b11c67bd"}
Apr 17 17:11:42.300376 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:42.300347 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"996f172c-64c2-4d8e-a0da-d47aa9247ebf","Type":"ContainerStarted","Data":"d089614379b5213fc3344c47861cd88c3d96ec74c4af18ac2204368cc93a77b8"}
Apr 17 17:11:42.300376 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:42.300379 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"996f172c-64c2-4d8e-a0da-d47aa9247ebf","Type":"ContainerStarted","Data":"870564ebb804fcf69123c0e3754f1de3cd73f4680a099b7b6d1e53ad636f62b1"}
Apr 17 17:11:42.300766 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:42.300390 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"996f172c-64c2-4d8e-a0da-d47aa9247ebf","Type":"ContainerStarted","Data":"e31c5d56272c1b177153cd53cc9b0a9cb39dc6a36fdb574a8342ff155650979e"}
Apr 17 17:11:42.300766 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:42.300398 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"996f172c-64c2-4d8e-a0da-d47aa9247ebf","Type":"ContainerStarted","Data":"5bed2056068c13fbeda9fde4a66aa463266512e1f0b0c8ee4e94e0aecd4b2f44"}
Apr 17 17:11:42.300766 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:42.300407 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"996f172c-64c2-4d8e-a0da-d47aa9247ebf","Type":"ContainerStarted","Data":"3fb962bbd643e90c16b77ea1ed5be25eadabb633dd2ae78e0faea42c96130ee2"}
Apr 17 17:11:42.300766 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:42.300414 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"996f172c-64c2-4d8e-a0da-d47aa9247ebf","Type":"ContainerStarted","Data":"59a1334d39ed606a1f8a2c395a319fba37f05feec9e644d03ff764a3720d18ae"}
Apr 17 17:11:42.330976 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:42.330930 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.330915323 podStartE2EDuration="2.330915323s" podCreationTimestamp="2026-04-17 17:11:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:11:42.328645335 +0000 UTC m=+240.401232542" watchObservedRunningTime="2026-04-17 17:11:42.330915323 +0000 UTC m=+240.403502516"
Apr 17 17:11:43.304799 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:43.304760 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" event={"ID":"47318264-9cd9-4f06-b7d4-17db7406977f","Type":"ContainerStarted","Data":"c1c2b63f8b954e60cd54ed03f4abbfa085ba927fa57929b89870c23f72eeaf95"}
Apr 17 17:11:43.304799 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:43.304803 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" event={"ID":"47318264-9cd9-4f06-b7d4-17db7406977f","Type":"ContainerStarted","Data":"ddb133939abbc1323eccab71a36cbc0985efe86ccb7da8589768b2031d5628a1"}
Apr 17 17:11:43.305243 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:43.304812 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" event={"ID":"47318264-9cd9-4f06-b7d4-17db7406977f","Type":"ContainerStarted","Data":"bbdb3eb7abacf3105808ff8a72cfe29ff57a247e2b8140281c41c5e02be111d1"}
Apr 17 17:11:43.327821 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:43.327774 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-748749ff56-4pjhb" podStartSLOduration=2.094054667 podStartE2EDuration="4.327760539s" podCreationTimestamp="2026-04-17 17:11:39 +0000 UTC" firstStartedPulling="2026-04-17 17:11:40.098080024 +0000 UTC m=+238.170667196" lastFinishedPulling="2026-04-17 17:11:42.331785895 +0000 UTC m=+240.404373068" observedRunningTime="2026-04-17 17:11:43.325513762 +0000 UTC m=+241.398100956" watchObservedRunningTime="2026-04-17 17:11:43.327760539 +0000 UTC m=+241.400347732"
Apr 17 17:11:45.662135 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:11:45.662099 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:12:40.661618 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:12:40.661579 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:12:40.677911 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:12:40.677882 2574 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:12:41.496459 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:12:41.496431 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 17 17:12:42.337068 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:12:42.337004 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kvtg4_81fbead9-b55d-41ce-9182-c35ba0bc0eb2/console-operator/1.log"
Apr 17 17:12:42.338192 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:12:42.338170 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kvtg4_81fbead9-b55d-41ce-9182-c35ba0bc0eb2/console-operator/1.log"
Apr 17 17:12:42.343306 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:12:42.343285 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vbndw_fb032f94-bfb7-47c8-b2bb-9e3c7a412058/ovn-acl-logging/0.log"
Apr 17 17:12:42.343641 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:12:42.343623 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vbndw_fb032f94-bfb7-47c8-b2bb-9e3c7a412058/ovn-acl-logging/0.log"
Apr 17 17:12:42.346972 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:12:42.346956 2574 kubelet.go:1628] "Image garbage collection succeeded"
Apr 17 17:14:27.112426 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:14:27.112393 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-xzn2j"]
Apr 17 17:14:27.114868 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:14:27.114851 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-xzn2j"
Apr 17 17:14:27.117405 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:14:27.117380 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 17 17:14:27.117512 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:14:27.117495 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 17 17:14:27.118597 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:14:27.118579 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-2bmt6\""
Apr 17 17:14:27.122885 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:14:27.122862 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-xzn2j"]
Apr 17 17:14:27.280226 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:14:27.280183 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwwj8\" (UniqueName: \"kubernetes.io/projected/6599bffa-b1ac-4ac4-9a3b-bba78c8d9b9d-kube-api-access-nwwj8\") pod \"openshift-lws-operator-bfc7f696d-xzn2j\" (UID: \"6599bffa-b1ac-4ac4-9a3b-bba78c8d9b9d\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-xzn2j"
Apr 17 17:14:27.280405 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:14:27.280255 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6599bffa-b1ac-4ac4-9a3b-bba78c8d9b9d-tmp\") pod \"openshift-lws-operator-bfc7f696d-xzn2j\" (UID: \"6599bffa-b1ac-4ac4-9a3b-bba78c8d9b9d\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-xzn2j"
Apr 17 17:14:27.381502 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:14:27.381411 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwwj8\" (UniqueName: \"kubernetes.io/projected/6599bffa-b1ac-4ac4-9a3b-bba78c8d9b9d-kube-api-access-nwwj8\") pod \"openshift-lws-operator-bfc7f696d-xzn2j\" (UID: \"6599bffa-b1ac-4ac4-9a3b-bba78c8d9b9d\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-xzn2j"
Apr 17 17:14:27.381502 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:14:27.381473 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6599bffa-b1ac-4ac4-9a3b-bba78c8d9b9d-tmp\") pod \"openshift-lws-operator-bfc7f696d-xzn2j\" (UID: \"6599bffa-b1ac-4ac4-9a3b-bba78c8d9b9d\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-xzn2j"
Apr 17 17:14:27.381839 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:14:27.381821 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/6599bffa-b1ac-4ac4-9a3b-bba78c8d9b9d-tmp\") pod \"openshift-lws-operator-bfc7f696d-xzn2j\" (UID: \"6599bffa-b1ac-4ac4-9a3b-bba78c8d9b9d\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-xzn2j"
Apr 17 17:14:27.390241 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:14:27.390217 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwwj8\" (UniqueName: \"kubernetes.io/projected/6599bffa-b1ac-4ac4-9a3b-bba78c8d9b9d-kube-api-access-nwwj8\") pod \"openshift-lws-operator-bfc7f696d-xzn2j\" (UID: \"6599bffa-b1ac-4ac4-9a3b-bba78c8d9b9d\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-xzn2j"
Apr 17 17:14:27.434737 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:14:27.434691 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-xzn2j"
Apr 17 17:14:27.556986 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:14:27.556928 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-xzn2j"]
Apr 17 17:14:27.563234 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:14:27.563205 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6599bffa_b1ac_4ac4_9a3b_bba78c8d9b9d.slice/crio-98cbc55045dcfcffc5dec4e0b3e4bf446729e865eeccd2cd79fb8e2030cd036c WatchSource:0}: Error finding container 98cbc55045dcfcffc5dec4e0b3e4bf446729e865eeccd2cd79fb8e2030cd036c: Status 404 returned error can't find the container with id 98cbc55045dcfcffc5dec4e0b3e4bf446729e865eeccd2cd79fb8e2030cd036c
Apr 17 17:14:27.564790 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:14:27.564769 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 17:14:27.809584 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:14:27.809548 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-xzn2j" event={"ID":"6599bffa-b1ac-4ac4-9a3b-bba78c8d9b9d","Type":"ContainerStarted","Data":"98cbc55045dcfcffc5dec4e0b3e4bf446729e865eeccd2cd79fb8e2030cd036c"}
Apr 17 17:14:38.097547 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:14:38.097451 2574 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://registry.redhat.io/leader-worker-set/lws-rhel9-operator@sha256:c202bfa15626262ff22682b64ac57539d28dd35f5960c490f5afea75cef34309: reading manifest sha256:e454f0147070fc62779834d1fe721eadc038d8aa746cd37ec328d6d60d485c78 in registry.redhat.io/leader-worker-set/lws-rhel9-operator: received unexpected HTTP status: 504 Gateway Timeout; artifact err: provided artifact is a container image" image="registry.redhat.io/leader-worker-set/lws-rhel9-operator@sha256:c202bfa15626262ff22682b64ac57539d28dd35f5960c490f5afea75cef34309"
Apr 17 17:14:38.097933 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:14:38.097672 2574 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:openshift-lws-operator,Image:registry.redhat.io/leader-worker-set/lws-rhel9-operator@sha256:c202bfa15626262ff22682b64ac57539d28dd35f5960c490f5afea75cef34309,Command:[lws-operator],Args:[operator],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAME,Value:openshift-lws-operator,ValueFrom:nil,},EnvVar{Name:RELATED_IMAGE_OPERAND_IMAGE,Value:registry.redhat.io/leader-worker-set/lws-rhel9@sha256:affb303b1173c273231bb50fef07310b0e220d2f08bfc0aa5912d0825e3e0d4f,ValueFrom:nil,},EnvVar{Name:OPERATOR_CONDITION_NAME,Value:leader-worker-set.v1.0.0,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tmp,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nwwj8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*true,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openshift-lws-operator-bfc7f696d-xzn2j_openshift-lws-operator(6599bffa-b1ac-4ac4-9a3b-bba78c8d9b9d): ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://registry.redhat.io/leader-worker-set/lws-rhel9-operator@sha256:c202bfa15626262ff22682b64ac57539d28dd35f5960c490f5afea75cef34309: reading manifest sha256:e454f0147070fc62779834d1fe721eadc038d8aa746cd37ec328d6d60d485c78 in registry.redhat.io/leader-worker-set/lws-rhel9-operator: received unexpected HTTP status: 504 Gateway Timeout; artifact err: provided artifact is a container image" logger="UnhandledError"
Apr 17 17:14:38.098874 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:14:38.098838 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-lws-operator\" with ErrImagePull: \"unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://registry.redhat.io/leader-worker-set/lws-rhel9-operator@sha256:c202bfa15626262ff22682b64ac57539d28dd35f5960c490f5afea75cef34309: reading manifest sha256:e454f0147070fc62779834d1fe721eadc038d8aa746cd37ec328d6d60d485c78 in registry.redhat.io/leader-worker-set/lws-rhel9-operator: received unexpected HTTP status: 504 Gateway Timeout; artifact err: provided artifact is a container image\"" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-xzn2j" podUID="6599bffa-b1ac-4ac4-9a3b-bba78c8d9b9d"
Apr 17 17:14:38.848965 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:14:38.848927 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-lws-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/leader-worker-set/lws-rhel9-operator@sha256:c202bfa15626262ff22682b64ac57539d28dd35f5960c490f5afea75cef34309\\\": ErrImagePull: unable to pull image or OCI artifact: pull image err: copying system image from manifest list: determining manifest MIME type for docker://registry.redhat.io/leader-worker-set/lws-rhel9-operator@sha256:c202bfa15626262ff22682b64ac57539d28dd35f5960c490f5afea75cef34309: reading manifest sha256:e454f0147070fc62779834d1fe721eadc038d8aa746cd37ec328d6d60d485c78 in registry.redhat.io/leader-worker-set/lws-rhel9-operator: received unexpected HTTP status: 504 Gateway Timeout; artifact err: provided artifact is a container image\"" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-xzn2j" podUID="6599bffa-b1ac-4ac4-9a3b-bba78c8d9b9d"
Apr 17 17:14:51.891911 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:14:51.891870 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-xzn2j" event={"ID":"6599bffa-b1ac-4ac4-9a3b-bba78c8d9b9d","Type":"ContainerStarted","Data":"ac39b67e75b18f3a32417e57d4eeec9ab97800b4e8fadf975135512b95c12cbd"}
Apr 17 17:14:51.908600 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:14:51.908547 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-xzn2j" podStartSLOduration=0.696547727 podStartE2EDuration="24.908530595s" podCreationTimestamp="2026-04-17 17:14:27 +0000 UTC" firstStartedPulling="2026-04-17 17:14:27.564956482 +0000 UTC m=+405.637543664" lastFinishedPulling="2026-04-17 17:14:51.77693936 +0000 UTC m=+429.849526532" observedRunningTime="2026-04-17 17:14:51.906912486 +0000 UTC m=+429.979499693" watchObservedRunningTime="2026-04-17 17:14:51.908530595 +0000 UTC m=+429.981117792"
Apr 17 17:15:03.070931 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:03.070848 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5448568df4-p7tbn"]
Apr 17 17:15:03.074655 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:03.074629 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5448568df4-p7tbn"
Apr 17 17:15:03.078582 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:03.078559 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-r66kf\""
Apr 17 17:15:03.078582 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:03.078575 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 17 17:15:03.078783 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:03.078575 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 17 17:15:03.078783 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:03.078655 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 17 17:15:03.083049 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:03.083025 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5448568df4-p7tbn"]
Apr 17 17:15:03.193377 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:03.193341 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8dfb7f4f-f640-4460-90b4-4e593f96f973-cert\") pod \"lws-controller-manager-5448568df4-p7tbn\" (UID: \"8dfb7f4f-f640-4460-90b4-4e593f96f973\") " pod="openshift-lws-operator/lws-controller-manager-5448568df4-p7tbn"
Apr 17 17:15:03.193377 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:03.193377 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/8dfb7f4f-f640-4460-90b4-4e593f96f973-metrics-cert\") pod \"lws-controller-manager-5448568df4-p7tbn\" (UID: \"8dfb7f4f-f640-4460-90b4-4e593f96f973\") " pod="openshift-lws-operator/lws-controller-manager-5448568df4-p7tbn"
Apr 17 17:15:03.193581 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:03.193509 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsd4g\" (UniqueName: \"kubernetes.io/projected/8dfb7f4f-f640-4460-90b4-4e593f96f973-kube-api-access-zsd4g\") pod \"lws-controller-manager-5448568df4-p7tbn\" (UID: \"8dfb7f4f-f640-4460-90b4-4e593f96f973\") " pod="openshift-lws-operator/lws-controller-manager-5448568df4-p7tbn"
Apr 17 17:15:03.193642 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:03.193574 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/8dfb7f4f-f640-4460-90b4-4e593f96f973-manager-config\") pod \"lws-controller-manager-5448568df4-p7tbn\" (UID: \"8dfb7f4f-f640-4460-90b4-4e593f96f973\") " pod="openshift-lws-operator/lws-controller-manager-5448568df4-p7tbn"
Apr 17 17:15:03.294075 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:03.294039 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zsd4g\" (UniqueName: \"kubernetes.io/projected/8dfb7f4f-f640-4460-90b4-4e593f96f973-kube-api-access-zsd4g\") pod \"lws-controller-manager-5448568df4-p7tbn\" (UID: \"8dfb7f4f-f640-4460-90b4-4e593f96f973\") " pod="openshift-lws-operator/lws-controller-manager-5448568df4-p7tbn"
Apr 17 17:15:03.294252 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:03.294096 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/8dfb7f4f-f640-4460-90b4-4e593f96f973-manager-config\") pod \"lws-controller-manager-5448568df4-p7tbn\" (UID: \"8dfb7f4f-f640-4460-90b4-4e593f96f973\") " pod="openshift-lws-operator/lws-controller-manager-5448568df4-p7tbn"
Apr 17 17:15:03.294252 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:03.294135 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8dfb7f4f-f640-4460-90b4-4e593f96f973-cert\") pod \"lws-controller-manager-5448568df4-p7tbn\" (UID: \"8dfb7f4f-f640-4460-90b4-4e593f96f973\") " pod="openshift-lws-operator/lws-controller-manager-5448568df4-p7tbn"
Apr 17 17:15:03.294252 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:03.294160 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/8dfb7f4f-f640-4460-90b4-4e593f96f973-metrics-cert\") pod \"lws-controller-manager-5448568df4-p7tbn\" (UID: \"8dfb7f4f-f640-4460-90b4-4e593f96f973\") " pod="openshift-lws-operator/lws-controller-manager-5448568df4-p7tbn"
Apr 17 17:15:03.294962 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:03.294935 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/8dfb7f4f-f640-4460-90b4-4e593f96f973-manager-config\") pod \"lws-controller-manager-5448568df4-p7tbn\" (UID: \"8dfb7f4f-f640-4460-90b4-4e593f96f973\") " pod="openshift-lws-operator/lws-controller-manager-5448568df4-p7tbn"
Apr 17 17:15:03.296867 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:03.296842 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8dfb7f4f-f640-4460-90b4-4e593f96f973-cert\") pod \"lws-controller-manager-5448568df4-p7tbn\" (UID: \"8dfb7f4f-f640-4460-90b4-4e593f96f973\") " pod="openshift-lws-operator/lws-controller-manager-5448568df4-p7tbn"
Apr 17 17:15:03.296969 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:03.296886 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/8dfb7f4f-f640-4460-90b4-4e593f96f973-metrics-cert\") pod \"lws-controller-manager-5448568df4-p7tbn\" (UID: \"8dfb7f4f-f640-4460-90b4-4e593f96f973\") " pod="openshift-lws-operator/lws-controller-manager-5448568df4-p7tbn"
Apr 17 17:15:03.302797 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:03.302771 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsd4g\" (UniqueName: \"kubernetes.io/projected/8dfb7f4f-f640-4460-90b4-4e593f96f973-kube-api-access-zsd4g\") pod \"lws-controller-manager-5448568df4-p7tbn\" (UID: \"8dfb7f4f-f640-4460-90b4-4e593f96f973\") " pod="openshift-lws-operator/lws-controller-manager-5448568df4-p7tbn"
Apr 17 17:15:03.384880 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:03.384787 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5448568df4-p7tbn"
Apr 17 17:15:03.513922 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:03.513898 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5448568df4-p7tbn"]
Apr 17 17:15:03.516045 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:15:03.516011 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dfb7f4f_f640_4460_90b4_4e593f96f973.slice/crio-d4414f726cf81d21eb98b1d2b86d97bdbd41ec4cc983e00732817452e19b57df WatchSource:0}: Error finding container d4414f726cf81d21eb98b1d2b86d97bdbd41ec4cc983e00732817452e19b57df: Status 404 returned error can't find the container with id d4414f726cf81d21eb98b1d2b86d97bdbd41ec4cc983e00732817452e19b57df
Apr 17 17:15:03.928335 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:03.928299 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5448568df4-p7tbn" event={"ID":"8dfb7f4f-f640-4460-90b4-4e593f96f973","Type":"ContainerStarted","Data":"d4414f726cf81d21eb98b1d2b86d97bdbd41ec4cc983e00732817452e19b57df"}
Apr 17 17:15:06.939124 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:06.939081 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5448568df4-p7tbn" event={"ID":"8dfb7f4f-f640-4460-90b4-4e593f96f973","Type":"ContainerStarted","Data":"257e27dd1b74153a5a9026f888fa487dd1e339e8651ec25a24110b1dbd503128"}
Apr 17 17:15:06.939517 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:06.939223 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5448568df4-p7tbn"
Apr 17 17:15:06.965885 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:06.965813 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-5448568df4-p7tbn" podStartSLOduration=1.186280397 podStartE2EDuration="3.965789783s" podCreationTimestamp="2026-04-17 17:15:03 +0000 UTC" firstStartedPulling="2026-04-17 17:15:03.517873956 +0000 UTC m=+441.590461128" lastFinishedPulling="2026-04-17 17:15:06.297383328 +0000 UTC m=+444.369970514" observedRunningTime="2026-04-17 17:15:06.962813253 +0000 UTC m=+445.035400446" watchObservedRunningTime="2026-04-17 17:15:06.965789783 +0000 UTC m=+445.038377023"
Apr 17 17:15:09.573751 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:09.573713 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6569445fb5-wmwjx"]
Apr 17 17:15:09.577452 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:09.577430 2574 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-wmwjx" Apr 17 17:15:09.580653 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:09.580630 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 17 17:15:09.581112 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:09.581094 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 17 17:15:09.581318 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:09.581302 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-m57m4\"" Apr 17 17:15:09.581529 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:09.581509 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 17 17:15:09.581623 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:09.581590 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 17 17:15:09.608715 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:09.608681 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6569445fb5-wmwjx"] Apr 17 17:15:09.655158 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:09.655122 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/72c5c5ce-fef7-4e45-9974-fa71676312a2-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6569445fb5-wmwjx\" (UID: \"72c5c5ce-fef7-4e45-9974-fa71676312a2\") " pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-wmwjx" Apr 17 17:15:09.655336 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:09.655186 2574 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/72c5c5ce-fef7-4e45-9974-fa71676312a2-webhook-cert\") pod \"opendatahub-operator-controller-manager-6569445fb5-wmwjx\" (UID: \"72c5c5ce-fef7-4e45-9974-fa71676312a2\") " pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-wmwjx" Apr 17 17:15:09.655336 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:09.655208 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nhb6\" (UniqueName: \"kubernetes.io/projected/72c5c5ce-fef7-4e45-9974-fa71676312a2-kube-api-access-4nhb6\") pod \"opendatahub-operator-controller-manager-6569445fb5-wmwjx\" (UID: \"72c5c5ce-fef7-4e45-9974-fa71676312a2\") " pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-wmwjx" Apr 17 17:15:09.756443 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:09.756386 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/72c5c5ce-fef7-4e45-9974-fa71676312a2-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6569445fb5-wmwjx\" (UID: \"72c5c5ce-fef7-4e45-9974-fa71676312a2\") " pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-wmwjx" Apr 17 17:15:09.756683 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:09.756523 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/72c5c5ce-fef7-4e45-9974-fa71676312a2-webhook-cert\") pod \"opendatahub-operator-controller-manager-6569445fb5-wmwjx\" (UID: \"72c5c5ce-fef7-4e45-9974-fa71676312a2\") " pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-wmwjx" Apr 17 17:15:09.756917 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:09.756572 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4nhb6\" 
(UniqueName: \"kubernetes.io/projected/72c5c5ce-fef7-4e45-9974-fa71676312a2-kube-api-access-4nhb6\") pod \"opendatahub-operator-controller-manager-6569445fb5-wmwjx\" (UID: \"72c5c5ce-fef7-4e45-9974-fa71676312a2\") " pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-wmwjx" Apr 17 17:15:09.760365 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:09.760330 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/72c5c5ce-fef7-4e45-9974-fa71676312a2-webhook-cert\") pod \"opendatahub-operator-controller-manager-6569445fb5-wmwjx\" (UID: \"72c5c5ce-fef7-4e45-9974-fa71676312a2\") " pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-wmwjx" Apr 17 17:15:09.766561 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:09.766528 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/72c5c5ce-fef7-4e45-9974-fa71676312a2-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6569445fb5-wmwjx\" (UID: \"72c5c5ce-fef7-4e45-9974-fa71676312a2\") " pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-wmwjx" Apr 17 17:15:09.767393 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:09.767369 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nhb6\" (UniqueName: \"kubernetes.io/projected/72c5c5ce-fef7-4e45-9974-fa71676312a2-kube-api-access-4nhb6\") pod \"opendatahub-operator-controller-manager-6569445fb5-wmwjx\" (UID: \"72c5c5ce-fef7-4e45-9974-fa71676312a2\") " pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-wmwjx" Apr 17 17:15:09.888326 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:09.888238 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-wmwjx" Apr 17 17:15:10.026218 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:10.026190 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6569445fb5-wmwjx"] Apr 17 17:15:10.028972 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:15:10.028939 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72c5c5ce_fef7_4e45_9974_fa71676312a2.slice/crio-1c14522faabe790ab3ce0a3a12d76c3834f2f4b3b61c2fa4336622c843702289 WatchSource:0}: Error finding container 1c14522faabe790ab3ce0a3a12d76c3834f2f4b3b61c2fa4336622c843702289: Status 404 returned error can't find the container with id 1c14522faabe790ab3ce0a3a12d76c3834f2f4b3b61c2fa4336622c843702289 Apr 17 17:15:10.956508 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:10.956467 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-wmwjx" event={"ID":"72c5c5ce-fef7-4e45-9974-fa71676312a2","Type":"ContainerStarted","Data":"1c14522faabe790ab3ce0a3a12d76c3834f2f4b3b61c2fa4336622c843702289"} Apr 17 17:15:12.967207 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:12.967169 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-wmwjx" event={"ID":"72c5c5ce-fef7-4e45-9974-fa71676312a2","Type":"ContainerStarted","Data":"36ad3f2c5572dfc9a7d758146b04d4d51c0451057b71f5e2a4d83edcc65cf4b9"} Apr 17 17:15:12.967670 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:12.967279 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-wmwjx" Apr 17 17:15:12.990185 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:12.990134 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-wmwjx" podStartSLOduration=1.529898109 podStartE2EDuration="3.990117323s" podCreationTimestamp="2026-04-17 17:15:09 +0000 UTC" firstStartedPulling="2026-04-17 17:15:10.030829574 +0000 UTC m=+448.103416760" lastFinishedPulling="2026-04-17 17:15:12.491048798 +0000 UTC m=+450.563635974" observedRunningTime="2026-04-17 17:15:12.988035902 +0000 UTC m=+451.060623094" watchObservedRunningTime="2026-04-17 17:15:12.990117323 +0000 UTC m=+451.062704516" Apr 17 17:15:17.944860 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:17.944824 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5448568df4-p7tbn" Apr 17 17:15:23.972540 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:23.972505 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6569445fb5-wmwjx" Apr 17 17:15:28.312648 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:28.312592 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-db5457dbf-85xkn"] Apr 17 17:15:28.319768 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:28.319738 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-db5457dbf-85xkn" Apr 17 17:15:28.322546 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:28.322520 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 17 17:15:28.322702 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:28.322558 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 17 17:15:28.323658 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:28.323554 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 17 17:15:28.323797 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:28.323662 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-kp745\"" Apr 17 17:15:28.324014 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:28.323879 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 17 17:15:28.333050 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:28.333025 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-db5457dbf-85xkn"] Apr 17 17:15:28.422957 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:28.422918 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/808fa94c-d3e1-4c23-8e61-3597a88a1144-tmp\") pod \"kube-auth-proxy-db5457dbf-85xkn\" (UID: \"808fa94c-d3e1-4c23-8e61-3597a88a1144\") " pod="openshift-ingress/kube-auth-proxy-db5457dbf-85xkn" Apr 17 17:15:28.423127 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:28.422980 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qm9vt\" (UniqueName: 
\"kubernetes.io/projected/808fa94c-d3e1-4c23-8e61-3597a88a1144-kube-api-access-qm9vt\") pod \"kube-auth-proxy-db5457dbf-85xkn\" (UID: \"808fa94c-d3e1-4c23-8e61-3597a88a1144\") " pod="openshift-ingress/kube-auth-proxy-db5457dbf-85xkn" Apr 17 17:15:28.423127 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:28.423034 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/808fa94c-d3e1-4c23-8e61-3597a88a1144-tls-certs\") pod \"kube-auth-proxy-db5457dbf-85xkn\" (UID: \"808fa94c-d3e1-4c23-8e61-3597a88a1144\") " pod="openshift-ingress/kube-auth-proxy-db5457dbf-85xkn" Apr 17 17:15:28.523701 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:28.523666 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qm9vt\" (UniqueName: \"kubernetes.io/projected/808fa94c-d3e1-4c23-8e61-3597a88a1144-kube-api-access-qm9vt\") pod \"kube-auth-proxy-db5457dbf-85xkn\" (UID: \"808fa94c-d3e1-4c23-8e61-3597a88a1144\") " pod="openshift-ingress/kube-auth-proxy-db5457dbf-85xkn" Apr 17 17:15:28.523882 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:28.523752 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/808fa94c-d3e1-4c23-8e61-3597a88a1144-tls-certs\") pod \"kube-auth-proxy-db5457dbf-85xkn\" (UID: \"808fa94c-d3e1-4c23-8e61-3597a88a1144\") " pod="openshift-ingress/kube-auth-proxy-db5457dbf-85xkn" Apr 17 17:15:28.523882 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:28.523811 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/808fa94c-d3e1-4c23-8e61-3597a88a1144-tmp\") pod \"kube-auth-proxy-db5457dbf-85xkn\" (UID: \"808fa94c-d3e1-4c23-8e61-3597a88a1144\") " pod="openshift-ingress/kube-auth-proxy-db5457dbf-85xkn" Apr 17 17:15:28.526301 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:28.526273 2574 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/808fa94c-d3e1-4c23-8e61-3597a88a1144-tmp\") pod \"kube-auth-proxy-db5457dbf-85xkn\" (UID: \"808fa94c-d3e1-4c23-8e61-3597a88a1144\") " pod="openshift-ingress/kube-auth-proxy-db5457dbf-85xkn" Apr 17 17:15:28.526520 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:28.526502 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/808fa94c-d3e1-4c23-8e61-3597a88a1144-tls-certs\") pod \"kube-auth-proxy-db5457dbf-85xkn\" (UID: \"808fa94c-d3e1-4c23-8e61-3597a88a1144\") " pod="openshift-ingress/kube-auth-proxy-db5457dbf-85xkn" Apr 17 17:15:28.533701 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:28.533679 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qm9vt\" (UniqueName: \"kubernetes.io/projected/808fa94c-d3e1-4c23-8e61-3597a88a1144-kube-api-access-qm9vt\") pod \"kube-auth-proxy-db5457dbf-85xkn\" (UID: \"808fa94c-d3e1-4c23-8e61-3597a88a1144\") " pod="openshift-ingress/kube-auth-proxy-db5457dbf-85xkn" Apr 17 17:15:28.636338 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:28.636240 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-db5457dbf-85xkn" Apr 17 17:15:28.759571 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:28.759546 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-db5457dbf-85xkn"] Apr 17 17:15:28.762002 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:15:28.761968 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod808fa94c_d3e1_4c23_8e61_3597a88a1144.slice/crio-957331835ac28878fc869294b26059de5f4cf40229847e290bb2d1e897ee0172 WatchSource:0}: Error finding container 957331835ac28878fc869294b26059de5f4cf40229847e290bb2d1e897ee0172: Status 404 returned error can't find the container with id 957331835ac28878fc869294b26059de5f4cf40229847e290bb2d1e897ee0172 Apr 17 17:15:29.025537 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:29.025502 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-db5457dbf-85xkn" event={"ID":"808fa94c-d3e1-4c23-8e61-3597a88a1144","Type":"ContainerStarted","Data":"957331835ac28878fc869294b26059de5f4cf40229847e290bb2d1e897ee0172"} Apr 17 17:15:32.038297 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:15:32.038262 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-db5457dbf-85xkn" event={"ID":"808fa94c-d3e1-4c23-8e61-3597a88a1144","Type":"ContainerStarted","Data":"c555752a3f618685b645d31e2b4beb9e92405fe2c1f9853336606df6496edf3d"} Apr 17 17:17:08.449699 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:08.448848 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-db5457dbf-85xkn" podStartSLOduration=97.315509656 podStartE2EDuration="1m40.448828942s" podCreationTimestamp="2026-04-17 17:15:28 +0000 UTC" firstStartedPulling="2026-04-17 17:15:28.76378786 +0000 UTC m=+466.836375038" lastFinishedPulling="2026-04-17 17:15:31.897107151 +0000 UTC 
m=+469.969694324" observedRunningTime="2026-04-17 17:15:32.061537098 +0000 UTC m=+470.134124291" watchObservedRunningTime="2026-04-17 17:17:08.448828942 +0000 UTC m=+566.521416138" Apr 17 17:17:08.449699 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:08.449004 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-kgw8s"] Apr 17 17:17:08.456537 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:08.456505 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-kgw8s" Apr 17 17:17:08.459304 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:08.459279 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 17 17:17:08.459437 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:08.459301 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 17 17:17:08.459437 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:08.459287 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 17:17:08.460258 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:08.460243 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-8n2w7\"" Apr 17 17:17:08.460318 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:08.460274 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 17:17:08.464770 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:08.464751 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-kgw8s"] Apr 17 17:17:08.519623 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:08.519568 2574 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dsft\" (UniqueName: \"kubernetes.io/projected/612828c4-17db-4d92-9807-9aa7b727ac00-kube-api-access-5dsft\") pod \"kuadrant-console-plugin-6cb54b5c86-kgw8s\" (UID: \"612828c4-17db-4d92-9807-9aa7b727ac00\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-kgw8s" Apr 17 17:17:08.519795 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:08.519753 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/612828c4-17db-4d92-9807-9aa7b727ac00-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-kgw8s\" (UID: \"612828c4-17db-4d92-9807-9aa7b727ac00\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-kgw8s" Apr 17 17:17:08.519856 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:08.519810 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/612828c4-17db-4d92-9807-9aa7b727ac00-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-kgw8s\" (UID: \"612828c4-17db-4d92-9807-9aa7b727ac00\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-kgw8s" Apr 17 17:17:08.620586 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:08.620542 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/612828c4-17db-4d92-9807-9aa7b727ac00-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-kgw8s\" (UID: \"612828c4-17db-4d92-9807-9aa7b727ac00\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-kgw8s" Apr 17 17:17:08.620776 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:08.620621 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/612828c4-17db-4d92-9807-9aa7b727ac00-nginx-conf\") pod 
\"kuadrant-console-plugin-6cb54b5c86-kgw8s\" (UID: \"612828c4-17db-4d92-9807-9aa7b727ac00\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-kgw8s" Apr 17 17:17:08.620776 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:08.620662 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dsft\" (UniqueName: \"kubernetes.io/projected/612828c4-17db-4d92-9807-9aa7b727ac00-kube-api-access-5dsft\") pod \"kuadrant-console-plugin-6cb54b5c86-kgw8s\" (UID: \"612828c4-17db-4d92-9807-9aa7b727ac00\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-kgw8s" Apr 17 17:17:08.620776 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:17:08.620712 2574 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 17 17:17:08.620890 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:17:08.620806 2574 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/612828c4-17db-4d92-9807-9aa7b727ac00-plugin-serving-cert podName:612828c4-17db-4d92-9807-9aa7b727ac00 nodeName:}" failed. No retries permitted until 2026-04-17 17:17:09.120784154 +0000 UTC m=+567.193371335 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/612828c4-17db-4d92-9807-9aa7b727ac00-plugin-serving-cert") pod "kuadrant-console-plugin-6cb54b5c86-kgw8s" (UID: "612828c4-17db-4d92-9807-9aa7b727ac00") : secret "plugin-serving-cert" not found Apr 17 17:17:08.621251 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:08.621234 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/612828c4-17db-4d92-9807-9aa7b727ac00-nginx-conf\") pod \"kuadrant-console-plugin-6cb54b5c86-kgw8s\" (UID: \"612828c4-17db-4d92-9807-9aa7b727ac00\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-kgw8s" Apr 17 17:17:08.629790 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:08.629762 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dsft\" (UniqueName: \"kubernetes.io/projected/612828c4-17db-4d92-9807-9aa7b727ac00-kube-api-access-5dsft\") pod \"kuadrant-console-plugin-6cb54b5c86-kgw8s\" (UID: \"612828c4-17db-4d92-9807-9aa7b727ac00\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-kgw8s" Apr 17 17:17:09.125280 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:09.125241 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/612828c4-17db-4d92-9807-9aa7b727ac00-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-kgw8s\" (UID: \"612828c4-17db-4d92-9807-9aa7b727ac00\") " pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-kgw8s" Apr 17 17:17:09.127790 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:09.127762 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/612828c4-17db-4d92-9807-9aa7b727ac00-plugin-serving-cert\") pod \"kuadrant-console-plugin-6cb54b5c86-kgw8s\" (UID: \"612828c4-17db-4d92-9807-9aa7b727ac00\") " 
pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-kgw8s" Apr 17 17:17:09.367084 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:09.367050 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-kgw8s" Apr 17 17:17:09.505021 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:09.504950 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6cb54b5c86-kgw8s"] Apr 17 17:17:09.507187 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:17:09.507157 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod612828c4_17db_4d92_9807_9aa7b727ac00.slice/crio-8765532ce83544f1cab2f26d4cfcd01a2af4752affc02e9f0418a3b6a9cbeed6 WatchSource:0}: Error finding container 8765532ce83544f1cab2f26d4cfcd01a2af4752affc02e9f0418a3b6a9cbeed6: Status 404 returned error can't find the container with id 8765532ce83544f1cab2f26d4cfcd01a2af4752affc02e9f0418a3b6a9cbeed6 Apr 17 17:17:10.368982 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:10.368940 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-kgw8s" event={"ID":"612828c4-17db-4d92-9807-9aa7b727ac00","Type":"ContainerStarted","Data":"8765532ce83544f1cab2f26d4cfcd01a2af4752affc02e9f0418a3b6a9cbeed6"} Apr 17 17:17:34.472261 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:34.472220 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-kgw8s" event={"ID":"612828c4-17db-4d92-9807-9aa7b727ac00","Type":"ContainerStarted","Data":"9bfbf7725705894f15fa61cb66fb7abceb23aff1d2bfd1f0bf369b4891020999"} Apr 17 17:17:34.492409 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:34.492271 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6cb54b5c86-kgw8s" podStartSLOduration=1.976207904 
podStartE2EDuration="26.492257806s" podCreationTimestamp="2026-04-17 17:17:08 +0000 UTC" firstStartedPulling="2026-04-17 17:17:09.508370726 +0000 UTC m=+567.580957898" lastFinishedPulling="2026-04-17 17:17:34.024420625 +0000 UTC m=+592.097007800" observedRunningTime="2026-04-17 17:17:34.491812771 +0000 UTC m=+592.564399975" watchObservedRunningTime="2026-04-17 17:17:34.492257806 +0000 UTC m=+592.564845000" Apr 17 17:17:42.364333 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:42.364305 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kvtg4_81fbead9-b55d-41ce-9182-c35ba0bc0eb2/console-operator/1.log" Apr 17 17:17:42.365277 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:42.365257 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kvtg4_81fbead9-b55d-41ce-9182-c35ba0bc0eb2/console-operator/1.log" Apr 17 17:17:42.369780 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:42.369762 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vbndw_fb032f94-bfb7-47c8-b2bb-9e3c7a412058/ovn-acl-logging/0.log" Apr 17 17:17:42.370378 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:42.370363 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vbndw_fb032f94-bfb7-47c8-b2bb-9e3c7a412058/ovn-acl-logging/0.log" Apr 17 17:17:55.760752 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:55.760718 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:17:55.764967 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:55.764934 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-fjgsv" Apr 17 17:17:55.768550 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:55.768528 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 17 17:17:55.774106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:55.774043 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:17:55.807174 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:55.807142 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:17:55.858793 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:55.858748 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8tsxf\" (UniqueName: \"kubernetes.io/projected/bdbc83fb-2b4c-451a-a8ce-36c2de936677-kube-api-access-8tsxf\") pod \"limitador-limitador-78c99df468-fjgsv\" (UID: \"bdbc83fb-2b4c-451a-a8ce-36c2de936677\") " pod="kuadrant-system/limitador-limitador-78c99df468-fjgsv" Apr 17 17:17:55.858962 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:55.858888 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/bdbc83fb-2b4c-451a-a8ce-36c2de936677-config-file\") pod \"limitador-limitador-78c99df468-fjgsv\" (UID: \"bdbc83fb-2b4c-451a-a8ce-36c2de936677\") " pod="kuadrant-system/limitador-limitador-78c99df468-fjgsv" Apr 17 17:17:55.960326 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:55.960289 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/bdbc83fb-2b4c-451a-a8ce-36c2de936677-config-file\") pod \"limitador-limitador-78c99df468-fjgsv\" (UID: \"bdbc83fb-2b4c-451a-a8ce-36c2de936677\") " 
pod="kuadrant-system/limitador-limitador-78c99df468-fjgsv" Apr 17 17:17:55.960492 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:55.960348 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8tsxf\" (UniqueName: \"kubernetes.io/projected/bdbc83fb-2b4c-451a-a8ce-36c2de936677-kube-api-access-8tsxf\") pod \"limitador-limitador-78c99df468-fjgsv\" (UID: \"bdbc83fb-2b4c-451a-a8ce-36c2de936677\") " pod="kuadrant-system/limitador-limitador-78c99df468-fjgsv" Apr 17 17:17:55.960916 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:55.960897 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/bdbc83fb-2b4c-451a-a8ce-36c2de936677-config-file\") pod \"limitador-limitador-78c99df468-fjgsv\" (UID: \"bdbc83fb-2b4c-451a-a8ce-36c2de936677\") " pod="kuadrant-system/limitador-limitador-78c99df468-fjgsv" Apr 17 17:17:55.968297 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:55.968275 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8tsxf\" (UniqueName: \"kubernetes.io/projected/bdbc83fb-2b4c-451a-a8ce-36c2de936677-kube-api-access-8tsxf\") pod \"limitador-limitador-78c99df468-fjgsv\" (UID: \"bdbc83fb-2b4c-451a-a8ce-36c2de936677\") " pod="kuadrant-system/limitador-limitador-78c99df468-fjgsv" Apr 17 17:17:56.077795 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:56.077706 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-fjgsv" Apr 17 17:17:56.204974 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:56.204820 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:17:56.207837 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:17:56.207803 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbdbc83fb_2b4c_451a_a8ce_36c2de936677.slice/crio-e96b067e8bf84fa985a556143a3d1ea8c884a60460be30ebaee8a814bf3070b0 WatchSource:0}: Error finding container e96b067e8bf84fa985a556143a3d1ea8c884a60460be30ebaee8a814bf3070b0: Status 404 returned error can't find the container with id e96b067e8bf84fa985a556143a3d1ea8c884a60460be30ebaee8a814bf3070b0 Apr 17 17:17:56.548064 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:56.548028 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-fjgsv" event={"ID":"bdbc83fb-2b4c-451a-a8ce-36c2de936677","Type":"ContainerStarted","Data":"e96b067e8bf84fa985a556143a3d1ea8c884a60460be30ebaee8a814bf3070b0"} Apr 17 17:17:59.564083 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:59.564049 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-fjgsv" event={"ID":"bdbc83fb-2b4c-451a-a8ce-36c2de936677","Type":"ContainerStarted","Data":"ff78d0e2a408485ff66dbd0c870cfe036b0598a6866736b145100ac19e0869b6"} Apr 17 17:17:59.564442 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:59.564137 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-fjgsv" Apr 17 17:17:59.584009 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:17:59.583957 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-fjgsv" podStartSLOduration=2.168303085 
podStartE2EDuration="4.583942755s" podCreationTimestamp="2026-04-17 17:17:55 +0000 UTC" firstStartedPulling="2026-04-17 17:17:56.209651615 +0000 UTC m=+614.282238802" lastFinishedPulling="2026-04-17 17:17:58.625291299 +0000 UTC m=+616.697878472" observedRunningTime="2026-04-17 17:17:59.582726858 +0000 UTC m=+617.655314053" watchObservedRunningTime="2026-04-17 17:17:59.583942755 +0000 UTC m=+617.656529983" Apr 17 17:18:10.567974 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:18:10.567943 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-fjgsv" Apr 17 17:18:45.062271 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:18:45.062228 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:19:21.046286 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:21.046247 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:19:33.642888 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:33.642807 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:19:46.149928 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:46.149888 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:19:47.246331 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.246301 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4"] Apr 17 17:19:47.250052 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.250033 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4" Apr 17 17:19:47.252818 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.252796 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-gnhc5\"" Apr 17 17:19:47.252940 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.252819 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 17 17:19:47.253880 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.253865 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 17 17:19:47.253963 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.253943 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 17 17:19:47.258989 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.258965 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4"] Apr 17 17:19:47.385458 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.385428 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5998575a-bd82-45fd-9d19-a2c615fdfe67-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4\" (UID: \"5998575a-bd82-45fd-9d19-a2c615fdfe67\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4" Apr 17 17:19:47.385660 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.385485 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5998575a-bd82-45fd-9d19-a2c615fdfe67-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4\" (UID: 
\"5998575a-bd82-45fd-9d19-a2c615fdfe67\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4" Apr 17 17:19:47.385942 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.385912 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5998575a-bd82-45fd-9d19-a2c615fdfe67-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4\" (UID: \"5998575a-bd82-45fd-9d19-a2c615fdfe67\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4" Apr 17 17:19:47.389860 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.389818 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5998575a-bd82-45fd-9d19-a2c615fdfe67-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4\" (UID: \"5998575a-bd82-45fd-9d19-a2c615fdfe67\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4" Apr 17 17:19:47.390023 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.389902 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5998575a-bd82-45fd-9d19-a2c615fdfe67-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4\" (UID: \"5998575a-bd82-45fd-9d19-a2c615fdfe67\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4" Apr 17 17:19:47.390080 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.390027 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wfcfq\" (UniqueName: \"kubernetes.io/projected/5998575a-bd82-45fd-9d19-a2c615fdfe67-kube-api-access-wfcfq\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4\" (UID: \"5998575a-bd82-45fd-9d19-a2c615fdfe67\") " 
pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4" Apr 17 17:19:47.491154 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.491101 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wfcfq\" (UniqueName: \"kubernetes.io/projected/5998575a-bd82-45fd-9d19-a2c615fdfe67-kube-api-access-wfcfq\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4\" (UID: \"5998575a-bd82-45fd-9d19-a2c615fdfe67\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4" Apr 17 17:19:47.491154 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.491156 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5998575a-bd82-45fd-9d19-a2c615fdfe67-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4\" (UID: \"5998575a-bd82-45fd-9d19-a2c615fdfe67\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4" Apr 17 17:19:47.491401 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.491180 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5998575a-bd82-45fd-9d19-a2c615fdfe67-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4\" (UID: \"5998575a-bd82-45fd-9d19-a2c615fdfe67\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4" Apr 17 17:19:47.491401 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.491196 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5998575a-bd82-45fd-9d19-a2c615fdfe67-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4\" (UID: \"5998575a-bd82-45fd-9d19-a2c615fdfe67\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4" Apr 17 17:19:47.491401 
ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.491226 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5998575a-bd82-45fd-9d19-a2c615fdfe67-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4\" (UID: \"5998575a-bd82-45fd-9d19-a2c615fdfe67\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4" Apr 17 17:19:47.491401 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.491259 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5998575a-bd82-45fd-9d19-a2c615fdfe67-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4\" (UID: \"5998575a-bd82-45fd-9d19-a2c615fdfe67\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4" Apr 17 17:19:47.491708 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.491654 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5998575a-bd82-45fd-9d19-a2c615fdfe67-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4\" (UID: \"5998575a-bd82-45fd-9d19-a2c615fdfe67\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4" Apr 17 17:19:47.491708 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.491667 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5998575a-bd82-45fd-9d19-a2c615fdfe67-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4\" (UID: \"5998575a-bd82-45fd-9d19-a2c615fdfe67\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4" Apr 17 17:19:47.491905 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.491743 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/5998575a-bd82-45fd-9d19-a2c615fdfe67-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4\" (UID: \"5998575a-bd82-45fd-9d19-a2c615fdfe67\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4" Apr 17 17:19:47.493685 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.493658 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5998575a-bd82-45fd-9d19-a2c615fdfe67-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4\" (UID: \"5998575a-bd82-45fd-9d19-a2c615fdfe67\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4" Apr 17 17:19:47.493822 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.493806 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5998575a-bd82-45fd-9d19-a2c615fdfe67-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4\" (UID: \"5998575a-bd82-45fd-9d19-a2c615fdfe67\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4" Apr 17 17:19:47.499183 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.499130 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wfcfq\" (UniqueName: \"kubernetes.io/projected/5998575a-bd82-45fd-9d19-a2c615fdfe67-kube-api-access-wfcfq\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4\" (UID: \"5998575a-bd82-45fd-9d19-a2c615fdfe67\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4" Apr 17 17:19:47.561405 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.561369 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4" Apr 17 17:19:47.686910 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.686756 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4"] Apr 17 17:19:47.689742 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:19:47.689707 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5998575a_bd82_45fd_9d19_a2c615fdfe67.slice/crio-54a178732242851b3cd2554b3213e31b81c10fd0473ab41e4239d9fade91f793 WatchSource:0}: Error finding container 54a178732242851b3cd2554b3213e31b81c10fd0473ab41e4239d9fade91f793: Status 404 returned error can't find the container with id 54a178732242851b3cd2554b3213e31b81c10fd0473ab41e4239d9fade91f793 Apr 17 17:19:47.691453 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.691434 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:19:47.937242 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:47.937206 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4" event={"ID":"5998575a-bd82-45fd-9d19-a2c615fdfe67","Type":"ContainerStarted","Data":"54a178732242851b3cd2554b3213e31b81c10fd0473ab41e4239d9fade91f793"} Apr 17 17:19:51.046788 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:51.046752 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:19:52.960594 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:52.960551 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4" event={"ID":"5998575a-bd82-45fd-9d19-a2c615fdfe67","Type":"ContainerStarted","Data":"cf2658fb8b2a232251fa0f9caaba5c85b61194dd188e050b997b3269d6e9c1f2"} Apr 17 17:19:57.243820 
ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:57.243783 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:19:58.982669 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:58.982635 2574 generic.go:358] "Generic (PLEG): container finished" podID="5998575a-bd82-45fd-9d19-a2c615fdfe67" containerID="cf2658fb8b2a232251fa0f9caaba5c85b61194dd188e050b997b3269d6e9c1f2" exitCode=0 Apr 17 17:19:58.983043 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:19:58.982642 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4" event={"ID":"5998575a-bd82-45fd-9d19-a2c615fdfe67","Type":"ContainerDied","Data":"cf2658fb8b2a232251fa0f9caaba5c85b61194dd188e050b997b3269d6e9c1f2"} Apr 17 17:20:00.994766 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:20:00.994729 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4" event={"ID":"5998575a-bd82-45fd-9d19-a2c615fdfe67","Type":"ContainerStarted","Data":"b7588959f0bf43d81321a63fc38cc7bbd6b353b8c4520df47774e42be64ecc32"} Apr 17 17:20:00.995155 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:20:00.994959 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4" Apr 17 17:20:01.013826 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:20:01.013781 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4" podStartSLOduration=1.652386965 podStartE2EDuration="14.013767414s" podCreationTimestamp="2026-04-17 17:19:47 +0000 UTC" firstStartedPulling="2026-04-17 17:19:47.691588082 +0000 UTC m=+725.764175255" lastFinishedPulling="2026-04-17 17:20:00.052968529 +0000 UTC m=+738.125555704" observedRunningTime="2026-04-17 17:20:01.011445103 +0000 UTC m=+739.084032321" 
watchObservedRunningTime="2026-04-17 17:20:01.013767414 +0000 UTC m=+739.086354586" Apr 17 17:20:06.954917 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:20:06.954878 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:20:12.010835 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:20:12.010802 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4" Apr 17 17:20:52.266469 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:20:52.266437 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:21:02.450467 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:21:02.450391 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:21:11.643874 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:21:11.643840 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:21:21.654330 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:21:21.654293 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:21:31.347534 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:21:31.347492 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:21:41.277666 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:21:41.277623 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:22:42.390206 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:22:42.390177 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kvtg4_81fbead9-b55d-41ce-9182-c35ba0bc0eb2/console-operator/1.log" Apr 17 17:22:42.391789 ip-10-0-132-98 
kubenswrapper[2574]: I0417 17:22:42.391766 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kvtg4_81fbead9-b55d-41ce-9182-c35ba0bc0eb2/console-operator/1.log" Apr 17 17:22:42.395195 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:22:42.395177 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vbndw_fb032f94-bfb7-47c8-b2bb-9e3c7a412058/ovn-acl-logging/0.log" Apr 17 17:22:42.396778 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:22:42.396761 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vbndw_fb032f94-bfb7-47c8-b2bb-9e3c7a412058/ovn-acl-logging/0.log" Apr 17 17:22:44.345046 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:22:44.345006 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:22:59.151750 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:22:59.151713 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:23:38.139050 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:23:38.139012 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:23:54.646170 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:23:54.646096 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:24:10.536062 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:24:10.536025 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:24:25.636101 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:24:25.636071 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:25:16.641041 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:25:16.641010 2574 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:25:26.144550 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:25:26.144469 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:25:42.044461 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:25:42.044419 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:25:51.843094 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:25:51.843058 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:26:07.547251 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:26:07.547210 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:26:16.539763 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:26:16.539726 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:26:49.736049 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:26:49.736016 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:26:57.336462 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:26:57.336378 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:27:06.342216 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:27:06.342180 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:27:14.542751 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:27:14.542722 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:27:22.935437 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:27:22.935401 2574 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:27:39.845386 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:27:39.845350 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:27:42.419686 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:27:42.419659 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kvtg4_81fbead9-b55d-41ce-9182-c35ba0bc0eb2/console-operator/1.log" Apr 17 17:27:42.422103 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:27:42.422076 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kvtg4_81fbead9-b55d-41ce-9182-c35ba0bc0eb2/console-operator/1.log" Apr 17 17:27:42.425263 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:27:42.425238 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vbndw_fb032f94-bfb7-47c8-b2bb-9e3c7a412058/ovn-acl-logging/0.log" Apr 17 17:27:42.427191 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:27:42.427162 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vbndw_fb032f94-bfb7-47c8-b2bb-9e3c7a412058/ovn-acl-logging/0.log" Apr 17 17:27:53.049731 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:27:53.049696 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:28:40.144400 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:28:40.144366 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:28:48.246199 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:28:48.246161 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:28:57.145898 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:28:57.145862 2574 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:29:05.643970 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:29:05.643936 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:29:14.739509 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:29:14.739468 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:29:22.846406 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:29:22.846369 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:29:31.650040 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:29:31.650003 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:29:40.438701 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:29:40.438659 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:29:49.442567 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:29:49.442533 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:29:58.037842 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:29:58.037758 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:30:00.172000 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:00.171969 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29607450-n5tz8"] Apr 17 17:30:00.176270 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:00.176236 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607450-n5tz8" Apr 17 17:30:00.179908 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:00.179884 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-7k4fn\"" Apr 17 17:30:00.192493 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:00.192470 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607450-n5tz8"] Apr 17 17:30:00.305521 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:00.305489 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5cxz\" (UniqueName: \"kubernetes.io/projected/5df88f35-2f26-459a-89fd-5ec651130d99-kube-api-access-g5cxz\") pod \"maas-api-key-cleanup-29607450-n5tz8\" (UID: \"5df88f35-2f26-459a-89fd-5ec651130d99\") " pod="opendatahub/maas-api-key-cleanup-29607450-n5tz8" Apr 17 17:30:00.406400 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:00.406368 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g5cxz\" (UniqueName: \"kubernetes.io/projected/5df88f35-2f26-459a-89fd-5ec651130d99-kube-api-access-g5cxz\") pod \"maas-api-key-cleanup-29607450-n5tz8\" (UID: \"5df88f35-2f26-459a-89fd-5ec651130d99\") " pod="opendatahub/maas-api-key-cleanup-29607450-n5tz8" Apr 17 17:30:00.414787 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:00.414756 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5cxz\" (UniqueName: \"kubernetes.io/projected/5df88f35-2f26-459a-89fd-5ec651130d99-kube-api-access-g5cxz\") pod \"maas-api-key-cleanup-29607450-n5tz8\" (UID: \"5df88f35-2f26-459a-89fd-5ec651130d99\") " pod="opendatahub/maas-api-key-cleanup-29607450-n5tz8" Apr 17 17:30:00.488715 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:00.488633 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607450-n5tz8" Apr 17 17:30:00.609952 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:00.609929 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607450-n5tz8"] Apr 17 17:30:00.612256 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:30:00.612227 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5df88f35_2f26_459a_89fd_5ec651130d99.slice/crio-ee0dd2a9d39776f720a1824fc969363355f858b1818efe629755e9c5a5204cf2 WatchSource:0}: Error finding container ee0dd2a9d39776f720a1824fc969363355f858b1818efe629755e9c5a5204cf2: Status 404 returned error can't find the container with id ee0dd2a9d39776f720a1824fc969363355f858b1818efe629755e9c5a5204cf2 Apr 17 17:30:00.614318 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:00.614304 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:30:01.076664 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:01.076628 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607450-n5tz8" event={"ID":"5df88f35-2f26-459a-89fd-5ec651130d99","Type":"ContainerStarted","Data":"ee0dd2a9d39776f720a1824fc969363355f858b1818efe629755e9c5a5204cf2"} Apr 17 17:30:04.091145 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:04.091110 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607450-n5tz8" event={"ID":"5df88f35-2f26-459a-89fd-5ec651130d99","Type":"ContainerStarted","Data":"5699d52749ddf5d116dabca6d8df5531f968cead0aff8bba43af9a2d52baf50a"} Apr 17 17:30:04.118146 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:04.118054 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29607450-n5tz8" podStartSLOduration=0.876240917 podStartE2EDuration="4.118041263s" podCreationTimestamp="2026-04-17 
17:30:00 +0000 UTC" firstStartedPulling="2026-04-17 17:30:00.614424279 +0000 UTC m=+1338.687011452" lastFinishedPulling="2026-04-17 17:30:03.856224607 +0000 UTC m=+1341.928811798" observedRunningTime="2026-04-17 17:30:04.116165173 +0000 UTC m=+1342.188752370" watchObservedRunningTime="2026-04-17 17:30:04.118041263 +0000 UTC m=+1342.190628457" Apr 17 17:30:07.249197 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:07.249165 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:30:15.642248 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:15.642211 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:30:24.643681 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:24.643641 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:30:25.166700 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:25.166660 2574 generic.go:358] "Generic (PLEG): container finished" podID="5df88f35-2f26-459a-89fd-5ec651130d99" containerID="5699d52749ddf5d116dabca6d8df5531f968cead0aff8bba43af9a2d52baf50a" exitCode=6 Apr 17 17:30:25.166873 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:25.166739 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607450-n5tz8" event={"ID":"5df88f35-2f26-459a-89fd-5ec651130d99","Type":"ContainerDied","Data":"5699d52749ddf5d116dabca6d8df5531f968cead0aff8bba43af9a2d52baf50a"} Apr 17 17:30:25.167118 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:25.167097 2574 scope.go:117] "RemoveContainer" containerID="5699d52749ddf5d116dabca6d8df5531f968cead0aff8bba43af9a2d52baf50a" Apr 17 17:30:26.171781 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:26.171748 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607450-n5tz8" 
event={"ID":"5df88f35-2f26-459a-89fd-5ec651130d99","Type":"ContainerStarted","Data":"6c89ba62929c5f66d7cd3f5993ceea8a2726cbe800c9e1f9f5f1e9857c3762f9"} Apr 17 17:30:32.439036 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:32.439002 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:30:41.442622 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:41.442584 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:30:46.241940 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:46.241909 2574 generic.go:358] "Generic (PLEG): container finished" podID="5df88f35-2f26-459a-89fd-5ec651130d99" containerID="6c89ba62929c5f66d7cd3f5993ceea8a2726cbe800c9e1f9f5f1e9857c3762f9" exitCode=6 Apr 17 17:30:46.241940 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:46.241948 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607450-n5tz8" event={"ID":"5df88f35-2f26-459a-89fd-5ec651130d99","Type":"ContainerDied","Data":"6c89ba62929c5f66d7cd3f5993ceea8a2726cbe800c9e1f9f5f1e9857c3762f9"} Apr 17 17:30:46.242419 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:46.241980 2574 scope.go:117] "RemoveContainer" containerID="5699d52749ddf5d116dabca6d8df5531f968cead0aff8bba43af9a2d52baf50a" Apr 17 17:30:46.242472 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:46.242437 2574 scope.go:117] "RemoveContainer" containerID="6c89ba62929c5f66d7cd3f5993ceea8a2726cbe800c9e1f9f5f1e9857c3762f9" Apr 17 17:30:46.242732 ip-10-0-132-98 kubenswrapper[2574]: E0417 17:30:46.242709 2574 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29607450-n5tz8_opendatahub(5df88f35-2f26-459a-89fd-5ec651130d99)\"" pod="opendatahub/maas-api-key-cleanup-29607450-n5tz8" 
podUID="5df88f35-2f26-459a-89fd-5ec651130d99" Apr 17 17:30:50.244939 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:50.244901 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:30:59.535935 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:30:59.535898 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:31:00.010598 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:31:00.010561 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607450-n5tz8"] Apr 17 17:31:00.132648 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:31:00.132621 2574 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607450-n5tz8" Apr 17 17:31:00.213066 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:31:00.213037 2574 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5cxz\" (UniqueName: \"kubernetes.io/projected/5df88f35-2f26-459a-89fd-5ec651130d99-kube-api-access-g5cxz\") pod \"5df88f35-2f26-459a-89fd-5ec651130d99\" (UID: \"5df88f35-2f26-459a-89fd-5ec651130d99\") " Apr 17 17:31:00.215209 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:31:00.215184 2574 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5df88f35-2f26-459a-89fd-5ec651130d99-kube-api-access-g5cxz" (OuterVolumeSpecName: "kube-api-access-g5cxz") pod "5df88f35-2f26-459a-89fd-5ec651130d99" (UID: "5df88f35-2f26-459a-89fd-5ec651130d99"). InnerVolumeSpecName "kube-api-access-g5cxz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:31:00.290086 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:31:00.290010 2574 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29607450-n5tz8" Apr 17 17:31:00.290086 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:31:00.290022 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29607450-n5tz8" event={"ID":"5df88f35-2f26-459a-89fd-5ec651130d99","Type":"ContainerDied","Data":"ee0dd2a9d39776f720a1824fc969363355f858b1818efe629755e9c5a5204cf2"} Apr 17 17:31:00.290086 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:31:00.290061 2574 scope.go:117] "RemoveContainer" containerID="6c89ba62929c5f66d7cd3f5993ceea8a2726cbe800c9e1f9f5f1e9857c3762f9" Apr 17 17:31:00.311053 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:31:00.311020 2574 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607450-n5tz8"] Apr 17 17:31:00.314066 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:31:00.314040 2574 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g5cxz\" (UniqueName: \"kubernetes.io/projected/5df88f35-2f26-459a-89fd-5ec651130d99-kube-api-access-g5cxz\") on node \"ip-10-0-132-98.ec2.internal\" DevicePath \"\"" Apr 17 17:31:00.314459 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:31:00.314441 2574 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29607450-n5tz8"] Apr 17 17:31:00.439908 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:31:00.439875 2574 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5df88f35-2f26-459a-89fd-5ec651130d99" path="/var/lib/kubelet/pods/5df88f35-2f26-459a-89fd-5ec651130d99/volumes" Apr 17 17:31:07.243442 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:31:07.243406 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:32:42.453708 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:32:42.453681 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kvtg4_81fbead9-b55d-41ce-9182-c35ba0bc0eb2/console-operator/1.log" Apr 17 17:32:42.454934 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:32:42.454918 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kvtg4_81fbead9-b55d-41ce-9182-c35ba0bc0eb2/console-operator/1.log" Apr 17 17:32:42.459596 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:32:42.459577 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vbndw_fb032f94-bfb7-47c8-b2bb-9e3c7a412058/ovn-acl-logging/0.log" Apr 17 17:32:42.460292 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:32:42.460273 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vbndw_fb032f94-bfb7-47c8-b2bb-9e3c7a412058/ovn-acl-logging/0.log" Apr 17 17:33:25.743220 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:33:25.743138 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:33:30.836821 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:33:30.836789 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:33:55.840348 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:33:55.840314 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:34:02.649694 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:34:02.649651 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:34:11.639770 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:34:11.639731 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:34:22.051472 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:34:22.051438 2574 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:34:30.845048 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:34:30.844967 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:34:41.541819 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:34:41.541781 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:34:50.774838 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:34:50.774801 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:35:00.546663 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:35:00.546623 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:35:09.740643 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:35:09.740589 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:35:20.045171 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:35:20.045139 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:35:29.036468 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:35:29.036432 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:36:01.848340 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:36:01.848259 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:36:44.754369 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:36:44.754326 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:36:52.749664 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:36:52.749626 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:37:02.246895 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:37:02.246858 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:37:10.337759 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:37:10.337724 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:37:19.541853 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:37:19.541812 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:37:32.047830 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:37:32.047793 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:37:41.442896 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:37:41.442851 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:37:42.480416 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:37:42.480384 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kvtg4_81fbead9-b55d-41ce-9182-c35ba0bc0eb2/console-operator/1.log" Apr 17 17:37:42.482153 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:37:42.482130 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kvtg4_81fbead9-b55d-41ce-9182-c35ba0bc0eb2/console-operator/1.log" Apr 17 17:37:42.485636 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:37:42.485580 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vbndw_fb032f94-bfb7-47c8-b2bb-9e3c7a412058/ovn-acl-logging/0.log" Apr 17 17:37:42.487338 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:37:42.487319 2574 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vbndw_fb032f94-bfb7-47c8-b2bb-9e3c7a412058/ovn-acl-logging/0.log" Apr 17 17:37:46.984437 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:37:46.984394 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:37:57.341572 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:37:57.341538 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:38:05.744744 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:38:05.744707 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:38:13.451921 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:38:13.451886 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:38:23.845991 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:38:23.845954 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:38:42.149009 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:38:42.148976 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:38:50.036673 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:38:50.036639 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:38:58.950951 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:38:58.950914 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:39:07.737521 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:39:07.737485 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:39:23.642537 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:39:23.642505 2574 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:39:33.251627 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:39:33.251570 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:39:41.745026 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:39:41.744987 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:39:49.950674 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:39:49.950637 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:39:59.341802 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:39:59.341769 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:40:07.443380 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:40:07.443341 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:40:16.742285 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:40:16.742244 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:40:29.742270 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:40:29.742192 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:40:38.547261 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:40:38.547229 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:40:51.642074 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:40:51.642035 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:41:00.143565 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:41:00.143524 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:41:09.434228 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:41:09.434192 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:41:16.844752 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:41:16.844718 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:41:24.053047 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:41:24.053006 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:41:41.345501 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:41:41.345470 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:41:50.746412 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:41:50.746379 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:41:59.742754 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:41:59.742677 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:42:07.564851 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:07.564816 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:42:31.946293 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:31.946258 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:42:42.508743 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:42.508710 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kvtg4_81fbead9-b55d-41ce-9182-c35ba0bc0eb2/console-operator/1.log" Apr 17 17:42:42.511894 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:42.511869 2574 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kvtg4_81fbead9-b55d-41ce-9182-c35ba0bc0eb2/console-operator/1.log" Apr 17 17:42:42.514085 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:42.514052 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vbndw_fb032f94-bfb7-47c8-b2bb-9e3c7a412058/ovn-acl-logging/0.log" Apr 17 17:42:42.517050 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:42.517035 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vbndw_fb032f94-bfb7-47c8-b2bb-9e3c7a412058/ovn-acl-logging/0.log" Apr 17 17:42:43.242900 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:43.242867 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-fjgsv"] Apr 17 17:42:49.760192 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:49.760163 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6569445fb5-wmwjx_72c5c5ce-fef7-4e45-9974-fa71676312a2/manager/0.log" Apr 17 17:42:51.440929 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:51.440901 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-kgw8s_612828c4-17db-4d92-9807-9aa7b727ac00/kuadrant-console-plugin/0.log" Apr 17 17:42:51.790387 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:51.790312 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-fjgsv_bdbc83fb-2b4c-451a-a8ce-36c2de936677/limitador/0.log" Apr 17 17:42:52.476853 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:52.476823 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-db5457dbf-85xkn_808fa94c-d3e1-4c23-8e61-3597a88a1144/kube-auth-proxy/0.log" Apr 17 17:42:53.501873 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:53.501847 2574 
log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4_5998575a-bd82-45fd-9d19-a2c615fdfe67/storage-initializer/0.log" Apr 17 17:42:53.509277 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:53.509251 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccb7nv4_5998575a-bd82-45fd-9d19-a2c615fdfe67/main/0.log" Apr 17 17:42:57.232949 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:57.232907 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2ssms/must-gather-c87fp"] Apr 17 17:42:57.233521 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:57.233499 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5df88f35-2f26-459a-89fd-5ec651130d99" containerName="cleanup" Apr 17 17:42:57.233572 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:57.233527 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df88f35-2f26-459a-89fd-5ec651130d99" containerName="cleanup" Apr 17 17:42:57.233654 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:57.233643 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="5df88f35-2f26-459a-89fd-5ec651130d99" containerName="cleanup" Apr 17 17:42:57.233694 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:57.233659 2574 memory_manager.go:356] "RemoveStaleState removing state" podUID="5df88f35-2f26-459a-89fd-5ec651130d99" containerName="cleanup" Apr 17 17:42:57.233739 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:57.233729 2574 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5df88f35-2f26-459a-89fd-5ec651130d99" containerName="cleanup" Apr 17 17:42:57.233739 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:57.233739 2574 state_mem.go:107] "Deleted CPUSet assignment" podUID="5df88f35-2f26-459a-89fd-5ec651130d99" containerName="cleanup" Apr 17 17:42:57.236838 ip-10-0-132-98 kubenswrapper[2574]: I0417 
17:42:57.236822 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2ssms/must-gather-c87fp" Apr 17 17:42:57.241405 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:57.241384 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2ssms\"/\"openshift-service-ca.crt\"" Apr 17 17:42:57.241520 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:57.241423 2574 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2ssms\"/\"kube-root-ca.crt\"" Apr 17 17:42:57.241782 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:57.241768 2574 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-2ssms\"/\"default-dockercfg-l4kdb\"" Apr 17 17:42:57.264368 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:57.264344 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2ssms/must-gather-c87fp"] Apr 17 17:42:57.298106 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:57.298072 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3c8e9c13-1e52-4df3-a548-519ddaf21fd3-must-gather-output\") pod \"must-gather-c87fp\" (UID: \"3c8e9c13-1e52-4df3-a548-519ddaf21fd3\") " pod="openshift-must-gather-2ssms/must-gather-c87fp" Apr 17 17:42:57.298225 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:57.298109 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb2cs\" (UniqueName: \"kubernetes.io/projected/3c8e9c13-1e52-4df3-a548-519ddaf21fd3-kube-api-access-pb2cs\") pod \"must-gather-c87fp\" (UID: \"3c8e9c13-1e52-4df3-a548-519ddaf21fd3\") " pod="openshift-must-gather-2ssms/must-gather-c87fp" Apr 17 17:42:57.398776 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:57.398744 2574 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3c8e9c13-1e52-4df3-a548-519ddaf21fd3-must-gather-output\") pod \"must-gather-c87fp\" (UID: \"3c8e9c13-1e52-4df3-a548-519ddaf21fd3\") " pod="openshift-must-gather-2ssms/must-gather-c87fp" Apr 17 17:42:57.398776 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:57.398779 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pb2cs\" (UniqueName: \"kubernetes.io/projected/3c8e9c13-1e52-4df3-a548-519ddaf21fd3-kube-api-access-pb2cs\") pod \"must-gather-c87fp\" (UID: \"3c8e9c13-1e52-4df3-a548-519ddaf21fd3\") " pod="openshift-must-gather-2ssms/must-gather-c87fp" Apr 17 17:42:57.399108 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:57.399088 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3c8e9c13-1e52-4df3-a548-519ddaf21fd3-must-gather-output\") pod \"must-gather-c87fp\" (UID: \"3c8e9c13-1e52-4df3-a548-519ddaf21fd3\") " pod="openshift-must-gather-2ssms/must-gather-c87fp" Apr 17 17:42:57.409197 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:57.409174 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb2cs\" (UniqueName: \"kubernetes.io/projected/3c8e9c13-1e52-4df3-a548-519ddaf21fd3-kube-api-access-pb2cs\") pod \"must-gather-c87fp\" (UID: \"3c8e9c13-1e52-4df3-a548-519ddaf21fd3\") " pod="openshift-must-gather-2ssms/must-gather-c87fp" Apr 17 17:42:57.545431 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:57.545357 2574 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2ssms/must-gather-c87fp" Apr 17 17:42:57.674709 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:57.674684 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2ssms/must-gather-c87fp"] Apr 17 17:42:57.677442 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:42:57.677416 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c8e9c13_1e52_4df3_a548_519ddaf21fd3.slice/crio-dd527d9a59bdc944547d63627f431ee64a60055d2a99e54a04ca36443edf6e1d WatchSource:0}: Error finding container dd527d9a59bdc944547d63627f431ee64a60055d2a99e54a04ca36443edf6e1d: Status 404 returned error can't find the container with id dd527d9a59bdc944547d63627f431ee64a60055d2a99e54a04ca36443edf6e1d Apr 17 17:42:57.679151 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:57.679130 2574 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:42:57.799163 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:57.799078 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2ssms/must-gather-c87fp" event={"ID":"3c8e9c13-1e52-4df3-a548-519ddaf21fd3","Type":"ContainerStarted","Data":"dd527d9a59bdc944547d63627f431ee64a60055d2a99e54a04ca36443edf6e1d"} Apr 17 17:42:58.812846 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:58.810526 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2ssms/must-gather-c87fp" event={"ID":"3c8e9c13-1e52-4df3-a548-519ddaf21fd3","Type":"ContainerStarted","Data":"071a064fd53fbbfbe96b80e64e1dabae218c1a66859983ecb6b6c653f0005dcc"} Apr 17 17:42:58.812846 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:58.810568 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2ssms/must-gather-c87fp" 
event={"ID":"3c8e9c13-1e52-4df3-a548-519ddaf21fd3","Type":"ContainerStarted","Data":"7a8567b9f3a097afbfe5a80802faead2e2370cce7f7af3103431a2d0ea7228cc"}
Apr 17 17:42:58.833887 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:42:58.833066 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2ssms/must-gather-c87fp" podStartSLOduration=1.029750611 podStartE2EDuration="1.833044759s" podCreationTimestamp="2026-04-17 17:42:57 +0000 UTC" firstStartedPulling="2026-04-17 17:42:57.679257555 +0000 UTC m=+2115.751844741" lastFinishedPulling="2026-04-17 17:42:58.482551717 +0000 UTC m=+2116.555138889" observedRunningTime="2026-04-17 17:42:58.829862131 +0000 UTC m=+2116.902449337" watchObservedRunningTime="2026-04-17 17:42:58.833044759 +0000 UTC m=+2116.905631954"
Apr 17 17:43:00.071926 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:00.071891 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-mqsj4_3f7f5ebc-482a-4d47-b381-626eaf721f89/global-pull-secret-syncer/0.log"
Apr 17 17:43:00.151198 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:00.151172 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-hrljp_3be1e497-a7c7-4b5d-be1e-1fd8df6acc62/konnectivity-agent/0.log"
Apr 17 17:43:00.237094 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:00.237066 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-98.ec2.internal_1953ee34347c78a8ab12e5fc6254beb6/haproxy/0.log"
Apr 17 17:43:04.505627 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:04.505532 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6cb54b5c86-kgw8s_612828c4-17db-4d92-9807-9aa7b727ac00/kuadrant-console-plugin/0.log"
Apr 17 17:43:04.666331 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:04.666299 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-fjgsv_bdbc83fb-2b4c-451a-a8ce-36c2de936677/limitador/0.log"
Apr 17 17:43:06.173584 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:06.173545 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03c6c110-7db0-4d40-950f-63e925f382eb/alertmanager/0.log"
Apr 17 17:43:06.201797 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:06.201767 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03c6c110-7db0-4d40-950f-63e925f382eb/config-reloader/0.log"
Apr 17 17:43:06.232361 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:06.232333 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03c6c110-7db0-4d40-950f-63e925f382eb/kube-rbac-proxy-web/0.log"
Apr 17 17:43:06.256531 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:06.256405 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03c6c110-7db0-4d40-950f-63e925f382eb/kube-rbac-proxy/0.log"
Apr 17 17:43:06.292996 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:06.292974 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03c6c110-7db0-4d40-950f-63e925f382eb/kube-rbac-proxy-metric/0.log"
Apr 17 17:43:06.317766 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:06.317730 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03c6c110-7db0-4d40-950f-63e925f382eb/prom-label-proxy/0.log"
Apr 17 17:43:06.339997 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:06.339947 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_03c6c110-7db0-4d40-950f-63e925f382eb/init-config-reloader/0.log"
Apr 17 17:43:06.418662 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:06.418631 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-rldj4_11ea47a9-7c97-4dce-8a6c-04d5bd7b930b/kube-state-metrics/0.log"
Apr 17 17:43:06.438950 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:06.438921 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-rldj4_11ea47a9-7c97-4dce-8a6c-04d5bd7b930b/kube-rbac-proxy-main/0.log"
Apr 17 17:43:06.462097 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:06.462070 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-rldj4_11ea47a9-7c97-4dce-8a6c-04d5bd7b930b/kube-rbac-proxy-self/0.log"
Apr 17 17:43:06.498226 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:06.498201 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6c68877994-q8bps_4465ec50-e774-47fa-a778-c5e02c74b6ec/metrics-server/0.log"
Apr 17 17:43:06.523002 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:06.522933 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-sccc2_5dad46c4-fb79-4503-9293-2ff17aa1410d/monitoring-plugin/0.log"
Apr 17 17:43:06.552134 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:06.552103 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cgfvj_9384a44b-c862-49d8-8220-6aa55e030cd5/node-exporter/0.log"
Apr 17 17:43:06.574420 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:06.574371 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cgfvj_9384a44b-c862-49d8-8220-6aa55e030cd5/kube-rbac-proxy/0.log"
Apr 17 17:43:06.596321 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:06.596292 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-cgfvj_9384a44b-c862-49d8-8220-6aa55e030cd5/init-textfile/0.log"
Apr 17 17:43:06.750240 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:06.750204 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-rcls2_117d903d-3f00-4f06-9bf0-fac54cc8d1c4/kube-rbac-proxy-main/0.log"
Apr 17 17:43:06.773945 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:06.773863 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-rcls2_117d903d-3f00-4f06-9bf0-fac54cc8d1c4/kube-rbac-proxy-self/0.log"
Apr 17 17:43:06.795397 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:06.795362 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-rcls2_117d903d-3f00-4f06-9bf0-fac54cc8d1c4/openshift-state-metrics/0.log"
Apr 17 17:43:06.833578 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:06.833550 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_996f172c-64c2-4d8e-a0da-d47aa9247ebf/prometheus/0.log"
Apr 17 17:43:06.852813 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:06.852787 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_996f172c-64c2-4d8e-a0da-d47aa9247ebf/config-reloader/0.log"
Apr 17 17:43:06.883859 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:06.883831 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_996f172c-64c2-4d8e-a0da-d47aa9247ebf/thanos-sidecar/0.log"
Apr 17 17:43:06.904128 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:06.904101 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_996f172c-64c2-4d8e-a0da-d47aa9247ebf/kube-rbac-proxy-web/0.log"
Apr 17 17:43:06.924035 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:06.923991 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_996f172c-64c2-4d8e-a0da-d47aa9247ebf/kube-rbac-proxy/0.log"
Apr 17 17:43:06.951595 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:06.951561 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_996f172c-64c2-4d8e-a0da-d47aa9247ebf/kube-rbac-proxy-thanos/0.log"
Apr 17 17:43:06.973254 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:06.973226 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_996f172c-64c2-4d8e-a0da-d47aa9247ebf/init-config-reloader/0.log"
Apr 17 17:43:07.084374 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:07.084301 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-748749ff56-4pjhb_47318264-9cd9-4f06-b7d4-17db7406977f/telemeter-client/0.log"
Apr 17 17:43:07.110370 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:07.110344 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-748749ff56-4pjhb_47318264-9cd9-4f06-b7d4-17db7406977f/reload/0.log"
Apr 17 17:43:07.130884 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:07.130850 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-748749ff56-4pjhb_47318264-9cd9-4f06-b7d4-17db7406977f/kube-rbac-proxy/0.log"
Apr 17 17:43:07.161307 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:07.161267 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-85d87b4479-nqbnz_d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd/thanos-query/0.log"
Apr 17 17:43:07.181866 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:07.181832 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-85d87b4479-nqbnz_d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd/kube-rbac-proxy-web/0.log"
Apr 17 17:43:07.201518 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:07.201494 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-85d87b4479-nqbnz_d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd/kube-rbac-proxy/0.log"
Apr 17 17:43:07.222434 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:07.222394 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-85d87b4479-nqbnz_d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd/prom-label-proxy/0.log"
Apr 17 17:43:07.244839 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:07.244811 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-85d87b4479-nqbnz_d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd/kube-rbac-proxy-rules/0.log"
Apr 17 17:43:07.266471 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:07.266445 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-85d87b4479-nqbnz_d8d765b1-f40e-4d3e-8cc1-a854f9ad75cd/kube-rbac-proxy-metrics/0.log"
Apr 17 17:43:08.186507 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:08.186477 2574 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2ssms/perf-node-gather-daemonset-2hmdb"]
Apr 17 17:43:08.192332 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:08.192308 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-2hmdb"
Apr 17 17:43:08.198128 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:08.198101 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2ssms/perf-node-gather-daemonset-2hmdb"]
Apr 17 17:43:08.314823 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:08.314785 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/976118de-a0a9-4cb5-94bf-25d18e9b13ac-sys\") pod \"perf-node-gather-daemonset-2hmdb\" (UID: \"976118de-a0a9-4cb5-94bf-25d18e9b13ac\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-2hmdb"
Apr 17 17:43:08.315009 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:08.314953 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5km7f\" (UniqueName: \"kubernetes.io/projected/976118de-a0a9-4cb5-94bf-25d18e9b13ac-kube-api-access-5km7f\") pod \"perf-node-gather-daemonset-2hmdb\" (UID: \"976118de-a0a9-4cb5-94bf-25d18e9b13ac\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-2hmdb"
Apr 17 17:43:08.315073 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:08.315015 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/976118de-a0a9-4cb5-94bf-25d18e9b13ac-lib-modules\") pod \"perf-node-gather-daemonset-2hmdb\" (UID: \"976118de-a0a9-4cb5-94bf-25d18e9b13ac\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-2hmdb"
Apr 17 17:43:08.315073 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:08.315047 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/976118de-a0a9-4cb5-94bf-25d18e9b13ac-proc\") pod \"perf-node-gather-daemonset-2hmdb\" (UID: \"976118de-a0a9-4cb5-94bf-25d18e9b13ac\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-2hmdb"
Apr 17 17:43:08.315166 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:08.315074 2574 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/976118de-a0a9-4cb5-94bf-25d18e9b13ac-podres\") pod \"perf-node-gather-daemonset-2hmdb\" (UID: \"976118de-a0a9-4cb5-94bf-25d18e9b13ac\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-2hmdb"
Apr 17 17:43:08.415799 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:08.415760 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/976118de-a0a9-4cb5-94bf-25d18e9b13ac-sys\") pod \"perf-node-gather-daemonset-2hmdb\" (UID: \"976118de-a0a9-4cb5-94bf-25d18e9b13ac\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-2hmdb"
Apr 17 17:43:08.415971 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:08.415861 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5km7f\" (UniqueName: \"kubernetes.io/projected/976118de-a0a9-4cb5-94bf-25d18e9b13ac-kube-api-access-5km7f\") pod \"perf-node-gather-daemonset-2hmdb\" (UID: \"976118de-a0a9-4cb5-94bf-25d18e9b13ac\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-2hmdb"
Apr 17 17:43:08.415971 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:08.415896 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/976118de-a0a9-4cb5-94bf-25d18e9b13ac-sys\") pod \"perf-node-gather-daemonset-2hmdb\" (UID: \"976118de-a0a9-4cb5-94bf-25d18e9b13ac\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-2hmdb"
Apr 17 17:43:08.416087 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:08.415900 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/976118de-a0a9-4cb5-94bf-25d18e9b13ac-lib-modules\") pod \"perf-node-gather-daemonset-2hmdb\" (UID: \"976118de-a0a9-4cb5-94bf-25d18e9b13ac\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-2hmdb"
Apr 17 17:43:08.416087 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:08.416014 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/976118de-a0a9-4cb5-94bf-25d18e9b13ac-proc\") pod \"perf-node-gather-daemonset-2hmdb\" (UID: \"976118de-a0a9-4cb5-94bf-25d18e9b13ac\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-2hmdb"
Apr 17 17:43:08.416087 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:08.416039 2574 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/976118de-a0a9-4cb5-94bf-25d18e9b13ac-podres\") pod \"perf-node-gather-daemonset-2hmdb\" (UID: \"976118de-a0a9-4cb5-94bf-25d18e9b13ac\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-2hmdb"
Apr 17 17:43:08.416202 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:08.416107 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/976118de-a0a9-4cb5-94bf-25d18e9b13ac-lib-modules\") pod \"perf-node-gather-daemonset-2hmdb\" (UID: \"976118de-a0a9-4cb5-94bf-25d18e9b13ac\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-2hmdb"
Apr 17 17:43:08.416202 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:08.416149 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/976118de-a0a9-4cb5-94bf-25d18e9b13ac-proc\") pod \"perf-node-gather-daemonset-2hmdb\" (UID: \"976118de-a0a9-4cb5-94bf-25d18e9b13ac\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-2hmdb"
Apr 17 17:43:08.416202 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:08.416168 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/976118de-a0a9-4cb5-94bf-25d18e9b13ac-podres\") pod \"perf-node-gather-daemonset-2hmdb\" (UID: \"976118de-a0a9-4cb5-94bf-25d18e9b13ac\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-2hmdb"
Apr 17 17:43:08.429532 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:08.429501 2574 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5km7f\" (UniqueName: \"kubernetes.io/projected/976118de-a0a9-4cb5-94bf-25d18e9b13ac-kube-api-access-5km7f\") pod \"perf-node-gather-daemonset-2hmdb\" (UID: \"976118de-a0a9-4cb5-94bf-25d18e9b13ac\") " pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-2hmdb"
Apr 17 17:43:08.508086 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:08.508013 2574 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-2hmdb"
Apr 17 17:43:08.672744 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:08.672712 2574 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2ssms/perf-node-gather-daemonset-2hmdb"]
Apr 17 17:43:08.676490 ip-10-0-132-98 kubenswrapper[2574]: W0417 17:43:08.676430 2574 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod976118de_a0a9_4cb5_94bf_25d18e9b13ac.slice/crio-4da324ddec697a6fa1af6994f4760385b6981339cb3b1834e5aa451ffd09f89b WatchSource:0}: Error finding container 4da324ddec697a6fa1af6994f4760385b6981339cb3b1834e5aa451ffd09f89b: Status 404 returned error can't find the container with id 4da324ddec697a6fa1af6994f4760385b6981339cb3b1834e5aa451ffd09f89b
Apr 17 17:43:08.797721 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:08.797643 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kvtg4_81fbead9-b55d-41ce-9182-c35ba0bc0eb2/console-operator/1.log"
Apr 17 17:43:08.803315 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:08.803290 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-kvtg4_81fbead9-b55d-41ce-9182-c35ba0bc0eb2/console-operator/2.log"
Apr 17 17:43:08.858909 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:08.858043 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-2hmdb" event={"ID":"976118de-a0a9-4cb5-94bf-25d18e9b13ac","Type":"ContainerStarted","Data":"0b956997ed20ad10a97524a01be2abd7b2568766a973ab0e851e163065760a62"}
Apr 17 17:43:08.858909 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:08.858089 2574 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-2hmdb" event={"ID":"976118de-a0a9-4cb5-94bf-25d18e9b13ac","Type":"ContainerStarted","Data":"4da324ddec697a6fa1af6994f4760385b6981339cb3b1834e5aa451ffd09f89b"}
Apr 17 17:43:08.858909 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:08.858866 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-2hmdb"
Apr 17 17:43:08.883012 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:08.882951 2574 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-2hmdb" podStartSLOduration=0.882930524 podStartE2EDuration="882.930524ms" podCreationTimestamp="2026-04-17 17:43:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:43:08.879710699 +0000 UTC m=+2126.952297894" watchObservedRunningTime="2026-04-17 17:43:08.882930524 +0000 UTC m=+2126.955517719"
Apr 17 17:43:09.286131 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:09.286097 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-6mgfv_998f90ca-500c-4933-b022-c7fdd401c2e3/download-server/0.log"
Apr 17 17:43:10.550102 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:10.550053 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-m492r_5ec73ca8-f486-4ad3-b44e-4e1b5815d3de/dns/0.log"
Apr 17 17:43:10.570236 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:10.570211 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-m492r_5ec73ca8-f486-4ad3-b44e-4e1b5815d3de/kube-rbac-proxy/0.log"
Apr 17 17:43:10.618303 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:10.618278 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-52mt7_b5a9dd27-c914-41cc-88fc-5a64c1169c04/dns-node-resolver/0.log"
Apr 17 17:43:11.108220 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:11.108188 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-2f5xx_36e5ce0d-a66e-4ad4-bf76-a0b0c1e1890b/node-ca/0.log"
Apr 17 17:43:12.149522 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:12.149492 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-db5457dbf-85xkn_808fa94c-d3e1-4c23-8e61-3597a88a1144/kube-auth-proxy/0.log"
Apr 17 17:43:12.781847 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:12.781821 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-n4m6l_05b2d8d3-5d00-41a8-97ed-8d4aa2d6503b/serve-healthcheck-canary/0.log"
Apr 17 17:43:13.230452 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:13.230422 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-8hcsn_bfd868a7-0269-499a-9b8c-4c2c8d2aba93/insights-operator/0.log"
Apr 17 17:43:13.231618 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:13.231574 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-8hcsn_bfd868a7-0269-499a-9b8c-4c2c8d2aba93/insights-operator/1.log"
Apr 17 17:43:13.313989 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:13.313963 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-s6zkr_d78bb467-37d0-4129-b59e-e8587a2b8ca8/kube-rbac-proxy/0.log"
Apr 17 17:43:13.332929 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:13.332905 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-s6zkr_d78bb467-37d0-4129-b59e-e8587a2b8ca8/exporter/0.log"
Apr 17 17:43:13.354683 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:13.354663 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-s6zkr_d78bb467-37d0-4129-b59e-e8587a2b8ca8/extractor/0.log"
Apr 17 17:43:15.582539 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:15.582507 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6569445fb5-wmwjx_72c5c5ce-fef7-4e45-9974-fa71676312a2/manager/0.log"
Apr 17 17:43:15.876526 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:15.876445 2574 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-2ssms/perf-node-gather-daemonset-2hmdb"
Apr 17 17:43:16.772804 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:16.772769 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5448568df4-p7tbn_8dfb7f4f-f640-4460-90b4-4e593f96f973/manager/0.log"
Apr 17 17:43:16.800859 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:16.800824 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-xzn2j_6599bffa-b1ac-4ac4-9a3b-bba78c8d9b9d/openshift-lws-operator/0.log"
Apr 17 17:43:21.031205 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:21.031173 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-n49zn_ea9e3b3c-f443-44c5-bbb3-fdac356e6b2e/migrator/0.log"
Apr 17 17:43:21.052409 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:21.052386 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-n49zn_ea9e3b3c-f443-44c5-bbb3-fdac356e6b2e/graceful-termination/0.log"
Apr 17 17:43:22.340584 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:22.340554 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9c486_aed051f5-a966-45e4-867e-3841c1814af1/kube-multus-additional-cni-plugins/0.log"
Apr 17 17:43:22.360908 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:22.360882 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9c486_aed051f5-a966-45e4-867e-3841c1814af1/egress-router-binary-copy/0.log"
Apr 17 17:43:22.381556 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:22.381534 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9c486_aed051f5-a966-45e4-867e-3841c1814af1/cni-plugins/0.log"
Apr 17 17:43:22.403094 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:22.403075 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9c486_aed051f5-a966-45e4-867e-3841c1814af1/bond-cni-plugin/0.log"
Apr 17 17:43:22.426089 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:22.426063 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9c486_aed051f5-a966-45e4-867e-3841c1814af1/routeoverride-cni/0.log"
Apr 17 17:43:22.449025 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:22.448996 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9c486_aed051f5-a966-45e4-867e-3841c1814af1/whereabouts-cni-bincopy/0.log"
Apr 17 17:43:22.471132 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:22.471105 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-9c486_aed051f5-a966-45e4-867e-3841c1814af1/whereabouts-cni/0.log"
Apr 17 17:43:22.844550 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:22.844477 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bhrfj_f40d551d-7b2b-4e50-afe2-fa8be6462803/kube-multus/0.log"
Apr 17 17:43:22.974918 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:22.974885 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8zxs7_1dd8584b-d217-441a-a0d1-e1b86328dfe2/network-metrics-daemon/0.log"
Apr 17 17:43:22.997843 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:22.997812 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8zxs7_1dd8584b-d217-441a-a0d1-e1b86328dfe2/kube-rbac-proxy/0.log"
Apr 17 17:43:24.576956 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:24.576853 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vbndw_fb032f94-bfb7-47c8-b2bb-9e3c7a412058/ovn-controller/0.log"
Apr 17 17:43:24.596363 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:24.596316 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vbndw_fb032f94-bfb7-47c8-b2bb-9e3c7a412058/ovn-acl-logging/0.log"
Apr 17 17:43:24.609299 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:24.609265 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vbndw_fb032f94-bfb7-47c8-b2bb-9e3c7a412058/ovn-acl-logging/1.log"
Apr 17 17:43:24.635185 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:24.635148 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vbndw_fb032f94-bfb7-47c8-b2bb-9e3c7a412058/kube-rbac-proxy-node/0.log"
Apr 17 17:43:24.658342 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:24.658291 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vbndw_fb032f94-bfb7-47c8-b2bb-9e3c7a412058/kube-rbac-proxy-ovn-metrics/0.log"
Apr 17 17:43:24.676449 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:24.676421 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vbndw_fb032f94-bfb7-47c8-b2bb-9e3c7a412058/northd/0.log"
Apr 17 17:43:24.700104 ip-10-0-132-98 kubenswrapper[2574]: I0417 17:43:24.700073 2574 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-vbndw_fb032f94-bfb7-47c8-b2bb-9e3c7a412058/nbdb/0.log"