Apr 16 13:11:34.996145 ip-10-0-141-234 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 13:11:35.474758 ip-10-0-141-234 kubenswrapper[2571]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:11:35.474758 ip-10-0-141-234 kubenswrapper[2571]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 13:11:35.474758 ip-10-0-141-234 kubenswrapper[2571]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:11:35.474758 ip-10-0-141-234 kubenswrapper[2571]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 13:11:35.474758 ip-10-0-141-234 kubenswrapper[2571]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:11:35.476302 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.476210 2571 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 13:11:35.479522 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479508 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:11:35.479522 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479522 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:11:35.479589 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479526 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:11:35.479589 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479529 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:11:35.479589 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479532 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:11:35.479589 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479535 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:11:35.479589 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479538 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:11:35.479589 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479540 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:11:35.479589 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479543 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:11:35.479589 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479546 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:11:35.479589 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479549 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:11:35.479589 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479552 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:11:35.479589 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479555 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:11:35.479589 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479558 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:11:35.479589 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479560 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:11:35.479589 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479563 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:11:35.479589 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479566 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:11:35.479589 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479568 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:11:35.479589 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479571 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:11:35.479589 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479574 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:11:35.479589 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479576 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:11:35.479589 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479579 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:11:35.480080 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479583 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:11:35.480080 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479586 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:11:35.480080 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479589 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:11:35.480080 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479592 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:11:35.480080 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479595 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:11:35.480080 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479598 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:11:35.480080 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479601 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:11:35.480080 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479604 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:11:35.480080 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479607 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:11:35.480080 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479610 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:11:35.480080 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479613 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:11:35.480080 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479615 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:11:35.480080 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479618 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:11:35.480080 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479620 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:11:35.480080 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479623 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:11:35.480080 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479626 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:11:35.480080 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479628 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:11:35.480080 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479631 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:11:35.480080 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479633 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:11:35.480549 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479638 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:11:35.480549 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479642 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:11:35.480549 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479645 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:11:35.480549 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479648 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:11:35.480549 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479651 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:11:35.480549 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479653 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:11:35.480549 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479657 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:11:35.480549 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479660 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:11:35.480549 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479663 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:11:35.480549 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479667 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:11:35.480549 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479669 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:11:35.480549 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479672 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:11:35.480549 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479675 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:11:35.480549 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479677 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:11:35.480549 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479681 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:11:35.480549 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479684 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:11:35.480549 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479687 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:11:35.480549 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479690 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:11:35.480549 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479692 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:11:35.480549 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479695 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:11:35.481049 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479698 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:11:35.481049 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479701 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:11:35.481049 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479703 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:11:35.481049 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479706 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:11:35.481049 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479709 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:11:35.481049 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479712 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:11:35.481049 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479715 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:11:35.481049 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479717 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:11:35.481049 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479720 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:11:35.481049 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479722 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:11:35.481049 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479725 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:11:35.481049 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479728 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:11:35.481049 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479731 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:11:35.481049 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479735 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
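[Editor's note] The five Flag deprecation notices at the top of this boot all point the same way: apart from --pod-infra-container-image, these settings belong in the KubeletConfiguration file named by --config (/etc/kubernetes/kubelet.conf in the FLAG dump further down). A minimal sketch of that migration, using the flag values visible later in this log; the node's actual kubelet.conf is not shown here, and the evictionHard threshold is a hypothetical stand-in for the retired --minimum-container-ttl-duration behavior:

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    containerRuntimeEndpoint: /var/run/crio/crio.sock             # was --container-runtime-endpoint
    volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec  # was --volume-plugin-dir
    systemReserved:                                               # was --system-reserved=cpu=500m,ephemeral-storage=1Gi,memory=1Gi
      cpu: 500m
      ephemeral-storage: 1Gi
      memory: 1Gi
    evictionHard:                        # deprecation text says to use eviction instead of --minimum-container-ttl-duration
      memory.available: 100Mi            # hypothetical value, not taken from this log

--pod-infra-container-image itself has no config-file equivalent; per the server.go:212 message above, the sandbox image should also be set in the container runtime (for CRI-O, presumably the pause_image option in crio.conf).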
Apr 16 13:11:35.481049 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479739 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:11:35.481049 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479742 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:11:35.481049 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479745 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:11:35.481049 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479748 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:11:35.481049 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479750 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:11:35.481497 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479753 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:11:35.481497 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479757 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:11:35.481497 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479760 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:11:35.481497 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479763 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:11:35.481497 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479765 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:11:35.481497 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.479768 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:11:35.481497 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480165 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:11:35.481497 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480171 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:11:35.481497 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480174 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:11:35.481497 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480178 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:11:35.481497 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480180 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:11:35.481497 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480183 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:11:35.481497 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480186 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:11:35.481497 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480189 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:11:35.481497 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480191 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:11:35.481497 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480194 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:11:35.481497 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480197 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:11:35.481497 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480200 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:11:35.481497 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480203 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:11:35.481497 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480206 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:11:35.481983 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480209 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:11:35.481983 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480211 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:11:35.481983 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480214 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:11:35.481983 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480217 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:11:35.481983 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480220 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:11:35.481983 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480222 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:11:35.481983 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480225 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:11:35.481983 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480227 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:11:35.481983 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480230 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:11:35.481983 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480233 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:11:35.481983 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480235 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:11:35.481983 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480238 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:11:35.481983 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480240 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:11:35.481983 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480243 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:11:35.481983 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480247 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:11:35.481983 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480250 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:11:35.481983 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480252 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:11:35.481983 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480255 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:11:35.481983 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480257 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:11:35.481983 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480261 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:11:35.482474 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480264 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:11:35.482474 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480266 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:11:35.482474 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480269 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:11:35.482474 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480272 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:11:35.482474 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480274 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:11:35.482474 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480277 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:11:35.482474 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480280 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:11:35.482474 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480283 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:11:35.482474 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480285 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:11:35.482474 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480288 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:11:35.482474 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480291 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:11:35.482474 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480294 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:11:35.482474 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480297 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:11:35.482474 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480299 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:11:35.482474 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480301 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:11:35.482474 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480304 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:11:35.482474 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480306 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:11:35.482474 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480309 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:11:35.482474 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480311 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:11:35.482974 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480314 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:11:35.482974 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480317 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:11:35.482974 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480319 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:11:35.482974 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480322 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:11:35.482974 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480325 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:11:35.482974 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480327 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:11:35.482974 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480330 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:11:35.482974 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480334 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:11:35.482974 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480337 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:11:35.482974 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480340 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:11:35.482974 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480342 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:11:35.482974 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480345 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:11:35.482974 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480347 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:11:35.482974 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480350 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:11:35.482974 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480352 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:11:35.482974 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480357 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:11:35.482974 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480361 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:11:35.482974 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480364 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:11:35.482974 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480367 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:11:35.483448 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480369 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:11:35.483448 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480372 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:11:35.483448 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480375 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:11:35.483448 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480377 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:11:35.483448 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480380 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:11:35.483448 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480382 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:11:35.483448 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480385 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:11:35.483448 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480388 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:11:35.483448 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480390 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:11:35.483448 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480394 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
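[Editor's note] The feature_gate.go:328 runs repeat because the gate map is parsed more than once during startup, and Go map iteration order differs on each pass. The names are OpenShift cluster-level gates (AdminNetworkPolicy, NewOLM, ManagedBootImages, and so on) reaching a kubelet that only registers upstream Kubernetes gates, presumably via the featureGates stanza OpenShift renders into the kubelet config; each unknown name is warned about and skipped (startup continues below), while recognized upstream gates draw the GA (feature_gate.go:351) and deprecated (feature_gate.go:349) warnings instead. A hypothetical stanza that would reproduce this mix of messages:

    apiVersion: kubelet.config.k8s.io/v1beta1
    kind: KubeletConfiguration
    featureGates:
      AdminNetworkPolicy: true               # OpenShift-only name -> "unrecognized feature gate"
      ServiceAccountTokenNodeBinding: true   # GA upstream gate -> "Setting GA feature gate ...=true"
      KMSv1: true                            # deprecated upstream gate -> "Setting deprecated feature gate ...=true"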
Apr 16 13:11:35.483448 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480398 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:11:35.483448 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480401 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:11:35.483448 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480404 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:11:35.483448 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.480406 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:11:35.483448 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481298 2571 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 13:11:35.483448 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481306 2571 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 13:11:35.483448 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481313 2571 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 13:11:35.483448 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481318 2571 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 13:11:35.483448 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481323 2571 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 13:11:35.483448 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481326 2571 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 13:11:35.483448 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481331 2571 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 13:11:35.483980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481336 2571 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 13:11:35.483980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481339 2571 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 13:11:35.483980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481342 2571 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 13:11:35.483980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481346 2571 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 13:11:35.483980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481349 2571 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 13:11:35.483980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481353 2571 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 13:11:35.483980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481356 2571 flags.go:64] FLAG: --cgroup-root=""
Apr 16 13:11:35.483980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481359 2571 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 13:11:35.483980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481362 2571 flags.go:64] FLAG: --client-ca-file=""
Apr 16 13:11:35.483980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481365 2571 flags.go:64] FLAG: --cloud-config=""
Apr 16 13:11:35.483980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481368 2571 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 13:11:35.483980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481371 2571 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 13:11:35.483980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481375 2571 flags.go:64] FLAG: --cluster-domain=""
Apr 16 13:11:35.483980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481377 2571 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 13:11:35.483980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481381 2571 flags.go:64] FLAG: --config-dir=""
Apr 16 13:11:35.483980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481384 2571 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 13:11:35.483980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481387 2571 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 13:11:35.483980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481391 2571 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 13:11:35.483980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481395 2571 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 13:11:35.483980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481398 2571 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 13:11:35.483980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481401 2571 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 13:11:35.483980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481405 2571 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 13:11:35.483980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481408 2571 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 13:11:35.483980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481411 2571 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 13:11:35.484583 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481414 2571 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 13:11:35.484583 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481417 2571 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 13:11:35.484583 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481421 2571 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 13:11:35.484583 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481424 2571 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 13:11:35.484583 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481427 2571 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 13:11:35.484583 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481430 2571 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 13:11:35.484583 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481434 2571 flags.go:64] FLAG: --enable-server="true"
Apr 16 13:11:35.484583 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481437 2571 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 13:11:35.484583 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481441 2571 flags.go:64] FLAG: --event-burst="100"
Apr 16 13:11:35.484583 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481444 2571 flags.go:64] FLAG: --event-qps="50"
Apr 16 13:11:35.484583 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481447 2571 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 13:11:35.484583 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481451 2571 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 13:11:35.484583 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481454 2571 flags.go:64] FLAG: --eviction-hard=""
Apr 16 13:11:35.484583 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481458 2571 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 13:11:35.484583 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481461 2571 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 13:11:35.484583 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481464 2571 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 13:11:35.484583 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481467 2571 flags.go:64] FLAG: --eviction-soft=""
Apr 16 13:11:35.484583 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481470 2571 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 13:11:35.484583 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481473 2571 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 13:11:35.484583 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481476 2571 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 13:11:35.484583 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481479 2571 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 13:11:35.484583 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481482 2571 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 13:11:35.484583 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481485 2571 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 13:11:35.484583 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481487 2571 flags.go:64] FLAG: --feature-gates=""
Apr 16 13:11:35.484583 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481491 2571 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 13:11:35.485230 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481495 2571 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 13:11:35.485230 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481498 2571 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 13:11:35.485230 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481502 2571 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 13:11:35.485230 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481505 2571 flags.go:64] FLAG: --healthz-port="10248"
Apr 16 13:11:35.485230 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481509 2571 flags.go:64] FLAG: --help="false"
Apr 16 13:11:35.485230 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481513 2571 flags.go:64] FLAG: --hostname-override="ip-10-0-141-234.ec2.internal"
Apr 16 13:11:35.485230 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481516 2571 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 16 13:11:35.485230 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481519 2571 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 16 13:11:35.485230 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481522 2571 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 16 13:11:35.485230 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481526 2571 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 16 13:11:35.485230 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481529 2571 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 16 13:11:35.485230 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481532 2571 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 16 13:11:35.485230 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481535 2571 flags.go:64] FLAG: --image-service-endpoint=""
Apr 16 13:11:35.485230 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481538 2571 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 16 13:11:35.485230 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481541 2571 flags.go:64] FLAG: --kube-api-burst="100"
Apr 16 13:11:35.485230 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481544 2571 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 16 13:11:35.485230 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481547 2571 flags.go:64] FLAG: --kube-api-qps="50"
Apr 16 13:11:35.485230 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481550 2571 flags.go:64] FLAG: --kube-reserved=""
Apr 16 13:11:35.485230 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481553 2571 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 16 13:11:35.485230 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481556 2571 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 16 13:11:35.485230 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481559 2571 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 16 13:11:35.485230 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481562 2571 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 16 13:11:35.485230 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481565 2571 flags.go:64] FLAG: --lock-file=""
Apr 16 13:11:35.485230 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481568 2571 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 16 13:11:35.485837 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481571 2571 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 16 13:11:35.485837 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481573 2571 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 16 13:11:35.485837 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481578 2571 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 16 13:11:35.485837 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481581 2571 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 16 13:11:35.485837 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481584 2571 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 16 13:11:35.485837 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481587 2571 flags.go:64] FLAG: --logging-format="text"
Apr 16 13:11:35.485837 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481590 2571 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 16 13:11:35.485837 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481593 2571 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 16 13:11:35.485837 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481600 2571 flags.go:64] FLAG: --manifest-url=""
Apr 16 13:11:35.485837 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481603 2571 flags.go:64] FLAG: --manifest-url-header=""
Apr 16 13:11:35.485837 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481607 2571 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 16 13:11:35.485837 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481610 2571 flags.go:64] FLAG: --max-open-files="1000000"
Apr 16 13:11:35.485837 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481615 2571 flags.go:64] FLAG: --max-pods="110"
Apr 16 13:11:35.485837 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481618 2571 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 16 13:11:35.485837 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481621 2571 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 16 13:11:35.485837 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481624 2571 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 16 13:11:35.485837 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481627 2571 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 16 13:11:35.485837 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481630 2571 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 16 13:11:35.485837 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481633 2571 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 16 13:11:35.485837 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481636 2571 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 16 13:11:35.485837 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481644 2571 flags.go:64] FLAG: --node-status-max-images="50"
Apr 16 13:11:35.485837 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481647 2571 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 16 13:11:35.485837 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481650 2571 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 16 13:11:35.485837 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481653 2571 flags.go:64] FLAG: --pod-cidr=""
Apr 16 13:11:35.486431 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481656 2571 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec"
Apr 16 13:11:35.486431 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481662 2571 flags.go:64] FLAG: --pod-manifest-path=""
Apr 16 13:11:35.486431 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481664 2571 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 16 13:11:35.486431 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481668 2571 flags.go:64] FLAG: --pods-per-core="0"
Apr 16 13:11:35.486431 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481671 2571 flags.go:64] FLAG: --port="10250"
Apr 16 13:11:35.486431 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481674 2571 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 16 13:11:35.486431 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481677 2571 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-030720a7412277dce"
Apr 16 13:11:35.486431 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481680 2571 flags.go:64] FLAG: --qos-reserved=""
Apr 16 13:11:35.486431 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481683 2571 flags.go:64] FLAG: --read-only-port="10255"
Apr 16 13:11:35.486431 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481686 2571 flags.go:64] FLAG: --register-node="true"
Apr 16 13:11:35.486431 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481689 2571 flags.go:64] FLAG: --register-schedulable="true"
Apr 16 13:11:35.486431 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481691 2571 flags.go:64] FLAG: --register-with-taints=""
Apr 16 13:11:35.486431 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481695 2571 flags.go:64] FLAG: --registry-burst="10"
Apr 16 13:11:35.486431 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481698 2571 flags.go:64] FLAG: --registry-qps="5"
Apr 16 13:11:35.486431 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481701 2571 flags.go:64] FLAG: --reserved-cpus=""
Apr 16 13:11:35.486431 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481704 2571 flags.go:64] FLAG: --reserved-memory=""
Apr 16 13:11:35.486431 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481708 2571 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 16 13:11:35.486431 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481713 2571 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 16 13:11:35.486431 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481716 2571 flags.go:64] FLAG: --rotate-certificates="false"
Apr 16 13:11:35.486431 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481719 2571 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 16 13:11:35.486431 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481722 2571 flags.go:64] FLAG: --runonce="false"
Apr 16 13:11:35.486431 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481725 2571 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 16 13:11:35.486431 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481728 2571 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 16 13:11:35.486431 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481731 2571 flags.go:64] FLAG: --seccomp-default="false"
Apr 16 13:11:35.486431 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481734 2571 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 16 13:11:35.487055 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481738 2571 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 16 13:11:35.487055 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481741 2571 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 16 13:11:35.487055 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481744 2571 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 16 13:11:35.487055 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481747 2571 flags.go:64] FLAG: --storage-driver-password="root"
Apr 16 13:11:35.487055 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481750 2571 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 16 13:11:35.487055 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481753 2571 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 16 13:11:35.487055 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481756 2571 flags.go:64] FLAG: --storage-driver-user="root"
Apr 16 13:11:35.487055 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481759 2571 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 16 13:11:35.487055 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481762 2571 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 16 13:11:35.487055 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481765 2571 flags.go:64] FLAG: --system-cgroups=""
Apr 16 13:11:35.487055 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481768 2571 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 16 13:11:35.487055 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481773 2571 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 16 13:11:35.487055 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481776 2571 flags.go:64] FLAG: --tls-cert-file=""
Apr 16 13:11:35.487055 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481779 2571 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 16 13:11:35.487055 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481783 2571 flags.go:64] FLAG: --tls-min-version=""
Apr 16 13:11:35.487055 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481786 2571 flags.go:64] FLAG: --tls-private-key-file=""
Apr 16 13:11:35.487055 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481789 2571 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 16 13:11:35.487055 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481792 2571 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 16 13:11:35.487055 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481794 2571 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 16 13:11:35.487055 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481797 2571 flags.go:64] FLAG: --v="2"
Apr 16 13:11:35.487055 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481801 2571 flags.go:64] FLAG: --version="false"
Apr 16 13:11:35.487055 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481805 2571 flags.go:64] FLAG: --vmodule=""
Apr 16 13:11:35.487055 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481810 2571 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 16 13:11:35.487055 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.481814 2571 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 16 13:11:35.487055 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481917 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:11:35.487686 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481921 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:11:35.487686 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481925 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:11:35.487686 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481929 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:11:35.487686 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481934 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:11:35.487686 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481937 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:11:35.487686 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481940 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:11:35.487686 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481943 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:11:35.487686 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481945 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:11:35.487686 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481948 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:11:35.487686 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481951 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:11:35.487686 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481953 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:11:35.487686 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481956 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:11:35.487686 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481958 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:11:35.487686 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481961 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:11:35.487686 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481964 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:11:35.487686 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481966 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:11:35.487686 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481969 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:11:35.487686 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481971 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:11:35.487686 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481974 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:11:35.488169 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481976 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:11:35.488169 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481979 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:11:35.488169 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481982 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:11:35.488169 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481984 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:11:35.488169 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481987 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:11:35.488169 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481990 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:11:35.488169 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481992 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:11:35.488169 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481995 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:11:35.488169 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.481998 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:11:35.488169 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482000 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:11:35.488169 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482003 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:11:35.488169 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482006 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:11:35.488169 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482010 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:11:35.488169 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482012 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:11:35.488169 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482015 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:11:35.488169 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482018 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:11:35.488169 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482020 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:11:35.488169 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482024 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:11:35.488169 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482027 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:11:35.488169 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482030 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:11:35.488683 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482032 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:11:35.488683 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482035 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:11:35.488683 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482038 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:11:35.488683 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482040 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:11:35.488683 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482043 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:11:35.488683 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482045 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:11:35.488683 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482048 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:11:35.488683 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482052 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:11:35.488683 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482055 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:11:35.488683 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482058 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:11:35.488683 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482060 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:11:35.488683 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482063 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:11:35.488683 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482066 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:11:35.488683 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482068 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:11:35.488683 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482071 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:11:35.488683 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482073 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:11:35.488683 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482076 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:11:35.488683 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482078 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:11:35.488683 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482081 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:11:35.488683 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482084 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:11:35.489222 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482086 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:11:35.489222 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482089 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:11:35.489222 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482092 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:11:35.489222 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482094 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:11:35.489222 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482098 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:11:35.489222 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482101 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:11:35.489222 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482104 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:11:35.489222 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482106 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:11:35.489222 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482109 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:11:35.489222 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482112 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:11:35.489222 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482115 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:11:35.489222 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482118 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:11:35.489222 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482121 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:11:35.489222 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482123 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:11:35.489222 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482126 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:11:35.489222 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482128 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:11:35.489222 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482131 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:11:35.489222 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482134 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:11:35.489222 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482136 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:11:35.489222 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482140 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:11:35.489709 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482143 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:11:35.489709 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482147 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:11:35.489709 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482150 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:11:35.489709 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482154 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:11:35.489709 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482156 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:11:35.489709 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.482159 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:11:35.489709 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.482965 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:11:35.489709 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.489125 2571 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 13:11:35.489709 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.489140 2571 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 13:11:35.489709 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489185 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:11:35.489709 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489190 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:11:35.489709 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489194 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:11:35.489709 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489196 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:11:35.489709 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489199 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:11:35.489709 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489202 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:11:35.490096 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489205 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:11:35.490096 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489207 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:11:35.490096 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489210 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:11:35.490096 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489213 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:11:35.490096 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489216 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:11:35.490096 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489218 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:11:35.490096 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489221 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:11:35.490096 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489224 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:11:35.490096 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489228 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:11:35.490096 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489231 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:11:35.490096 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489234 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:11:35.490096 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489236 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:11:35.490096 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489239 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:11:35.490096 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489242 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:11:35.490096 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489245 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:11:35.490096 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489247 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:11:35.490096 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489250 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:11:35.490096 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489252 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:11:35.490096 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489255 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:11:35.490096 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489258 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:11:35.490575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489260 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:11:35.490575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489264 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:11:35.490575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489268 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:11:35.490575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489271 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:11:35.490575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489274 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:11:35.490575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489277 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:11:35.490575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489280 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:11:35.490575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489283 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:11:35.490575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489285 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:11:35.490575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489288 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:11:35.490575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489291 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:11:35.490575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489294 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:11:35.490575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489296 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:11:35.490575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489299 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:11:35.490575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489302 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:11:35.490575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489304 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:11:35.490575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489307 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:11:35.490575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489309 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:11:35.490575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489312 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:11:35.490575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489315 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:11:35.491119 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489318 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:11:35.491119 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489321 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:11:35.491119 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489323 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:11:35.491119 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489326 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:11:35.491119 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489328 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:11:35.491119 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489331 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:11:35.491119 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489333 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:11:35.491119 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489336 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:11:35.491119 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489338 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:11:35.491119 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489342 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:11:35.491119 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489346 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:11:35.491119 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489349 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:11:35.491119 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489352 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:11:35.491119 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489355 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:11:35.491119 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489358 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:11:35.491119 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489361 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:11:35.491119 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489364 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:11:35.491119 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489368 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:11:35.491119 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489371 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:11:35.491575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489373 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:11:35.491575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489376 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:11:35.491575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489378 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:11:35.491575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489381 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:11:35.491575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489383 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:11:35.491575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489386 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:11:35.491575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489389 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:11:35.491575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489392 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:11:35.491575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489394 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:11:35.491575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489396 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:11:35.491575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489399 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:11:35.491575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489402 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:11:35.491575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489404 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:11:35.491575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489407 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:11:35.491575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489410 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:11:35.491575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489413 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:11:35.491575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489416 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:11:35.491575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489418 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:11:35.491575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489421 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:11:35.491575 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489423 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:11:35.492060 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489426 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:11:35.492060 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.489431 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:11:35.492060 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489525 2571 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:11:35.492060 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489529 2571 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:11:35.492060 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489532 2571 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:11:35.492060 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489535 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:11:35.492060 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489538 2571 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:11:35.492060 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489541 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:11:35.492060 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489544 2571 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:11:35.492060 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489546 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:11:35.492060 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489549 2571 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:11:35.492060 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489552 2571 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:11:35.492060 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489555 2571 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:11:35.492060 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489558 2571 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:11:35.492060 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489560 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:11:35.492060 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489563 2571 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:11:35.492446 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489565 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:11:35.492446 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489568 2571 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:11:35.492446 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489570 2571 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:11:35.492446 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489573 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:11:35.492446 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489575 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:11:35.492446 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489578 2571 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:11:35.492446 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489580 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:11:35.492446 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489583 2571 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:11:35.492446 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489586 2571 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:11:35.492446 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489588 2571 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:11:35.492446 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489590 2571 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:11:35.492446 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489593 2571 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:11:35.492446 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489596 2571 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:11:35.492446 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489599 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:11:35.492446 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489601 2571 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:11:35.492446 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489604 2571 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:11:35.492446 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489607 2571 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:11:35.492446 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489609 2571 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:11:35.492446 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489612 2571 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:11:35.492446 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489615 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:11:35.492946 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489618 2571 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:11:35.492946 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489620 2571 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:11:35.492946 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489623 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:11:35.492946 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489626 2571 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:11:35.492946 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489628 2571 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:11:35.492946 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489632 2571 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:11:35.492946 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489636 2571 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:11:35.492946 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489639 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:11:35.492946 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489642 2571 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:11:35.492946 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489645 2571 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:11:35.492946 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489647 2571 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:11:35.492946 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489650 2571 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:11:35.492946 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489653 2571 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:11:35.492946 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489657 2571 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:11:35.492946 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489660 2571 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:11:35.492946 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489662 2571 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:11:35.492946 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489665 2571 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:11:35.492946 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489668 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:11:35.492946 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489670 2571 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:11:35.493405 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489673 2571 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:11:35.493405 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489675 2571 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:11:35.493405 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489678 2571 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:11:35.493405 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489680 2571 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:11:35.493405 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489683 2571 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:11:35.493405 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489686 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:11:35.493405 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489688 2571 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:11:35.493405 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489690 2571 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:11:35.493405 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489693 2571 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:11:35.493405 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489696 2571 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:11:35.493405 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489699 2571 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:11:35.493405 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489701 2571 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:11:35.493405 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489703 2571 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:11:35.493405 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489707 2571 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:11:35.493405 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489709 2571 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:11:35.493405 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489712 2571 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:11:35.493405 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489714 2571 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:11:35.493405 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489717 2571 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:11:35.493405 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489720 2571 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:11:35.493405 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489722 2571 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:11:35.493901 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489725 2571 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:11:35.493901 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489728 2571 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:11:35.493901 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489730 2571 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:11:35.493901 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489733 2571 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:11:35.493901 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489735 2571 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:11:35.493901 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489738 2571 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:11:35.493901 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489740 2571 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:11:35.493901 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489742 2571 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:11:35.493901 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489745 2571 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:11:35.493901 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489747 2571 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:11:35.493901 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489750 2571 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:11:35.493901 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489752 2571 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:11:35.493901 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:35.489755 2571 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:11:35.493901 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.489759 2571 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:11:35.493901 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.490592 2571 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 13:11:35.494331 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.492634 2571 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 13:11:35.494331 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.493679 2571 server.go:1019] "Starting client certificate rotation"
Apr 16 13:11:35.494331 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.493795 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 13:11:35.494331 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.493838 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 13:11:35.521387 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.521366 2571 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:11:35.524756 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.524702 2571 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:11:35.538192 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.538173 2571 log.go:25] "Validated CRI v1 runtime API"
Apr 16 13:11:35.544780 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.544766 2571 log.go:25] "Validated CRI v1 image API"
Apr 16 13:11:35.546628 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.546608 2571 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 13:11:35.550462 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.550438 2571 fs.go:135] Filesystem UUIDs: map[4fdb0234-4673-47ac-8763-98b3b335ba5e:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 fa87b6cc-b052-4b82-b586-3b3adde8e65a:/dev/nvme0n1p3]
Apr 16 13:11:35.550524 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.550463 2571 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 13:11:35.551749 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.551731 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 13:11:35.557459 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.557339 2571 manager.go:217] Machine: {Timestamp:2026-04-16 13:11:35.554269113 +0000 UTC m=+0.436625214 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100761 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec27b14bc02bb576cb0c9ec756d6ad18 SystemUUID:ec27b14b-c02b-b576-cb0c-9ec756d6ad18 BootID:30d82fc9-2cb6-462f-b4bc-6a8c5ed3e323 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:51:8f:09:df:7f Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:51:8f:09:df:7f Speed:0 Mtu:9001} {Name:ovs-system MacAddress:8e:ba:6f:5a:97:d9 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 13:11:35.557459 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.557452 2571 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 13:11:35.557572 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.557526 2571 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 13:11:35.558587 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.558564 2571 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 13:11:35.558727 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.558589 2571 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-141-234.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 13:11:35.558775 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.558736 2571 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 13:11:35.558775 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.558744 2571 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 13:11:35.558775 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.558757 2571 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 13:11:35.559366 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.559354 2571 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 13:11:35.560743 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.560733 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 13:11:35.561039 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.561029 2571 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 13:11:35.563480 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.563471 2571 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 13:11:35.563514 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.563484 2571 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 13:11:35.563514 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.563495 2571 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 13:11:35.563514 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.563503 2571 kubelet.go:397] "Adding apiserver pod source"
Apr 16 13:11:35.563599 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.563516 2571 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 13:11:35.565376 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.565362 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 13:11:35.565450 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.565389 2571 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 13:11:35.568788 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.568767 2571 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 13:11:35.573293 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.573265 2571 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 13:11:35.574042 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.574015 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-q52rv"
Apr 16 13:11:35.575280 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.575267 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 13:11:35.575347 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.575285 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 13:11:35.575347 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.575291 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 13:11:35.575347 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.575297 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 13:11:35.575347 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.575303 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 13:11:35.575347 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.575309 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 13:11:35.575347 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.575314 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 13:11:35.575347 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.575320 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 13:11:35.575347 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.575334 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 13:11:35.575347 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.575340 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 13:11:35.575582 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.575357 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 13:11:35.575582 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.575366 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 13:11:35.576207 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.576196 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 13:11:35.576259 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.576208 2571 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 13:11:35.576474 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:35.576451 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 13:11:35.576528 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:35.576506 2571 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-141-234.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 13:11:35.578827 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.578809 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-q52rv"
Apr 16 13:11:35.580128 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.580114 2571 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 13:11:35.580196 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.580153 2571 server.go:1295] "Started kubelet"
Apr 16 13:11:35.580796 ip-10-0-141-234 systemd[1]: Started Kubernetes Kubelet.
Apr 16 13:11:35.581035 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.580990 2571 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 13:11:35.581119 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.581017 2571 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 13:11:35.581119 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.581057 2571 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 13:11:35.582200 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.582186 2571 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 13:11:35.583581 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.583568 2571 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 13:11:35.588037 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.588019 2571 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 13:11:35.588037 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.588027 2571 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 13:11:35.589078 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:35.588901 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-234.ec2.internal\" not found"
Apr 16 13:11:35.589078 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.588907 2571 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 13:11:35.589078 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.588944 2571 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 13:11:35.589078 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.588960 2571 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 13:11:35.589078 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.589007 2571 factory.go:55] Registering systemd factory
Apr 16 13:11:35.589078 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.589025 2571 factory.go:223] Registration of the systemd container factory successfully
Apr 16 13:11:35.589078 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.589041 2571 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 13:11:35.589078 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.589049 2571 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 13:11:35.589415 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.589308 2571 factory.go:153] Registering CRI-O factory
Apr 16 13:11:35.589415 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.589322 2571 factory.go:223] Registration of the crio container factory successfully
Apr 16 13:11:35.589415 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.589396 2571 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 13:11:35.589513 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.589423 2571 factory.go:103] Registering Raw factory
Apr 16 13:11:35.589513 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.589437 2571 manager.go:1196] Started watching for new ooms in manager
Apr 16 13:11:35.590262 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.590243 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:11:35.590941 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.590926 2571 manager.go:319] Starting recovery of all containers
Apr 16 13:11:35.591918 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:35.591892 2571 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 13:11:35.595078 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.595058 2571 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-141-234.ec2.internal" not found
Apr 16 13:11:35.595163 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:35.595109 2571 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-141-234.ec2.internal\" not found" node="ip-10-0-141-234.ec2.internal"
Apr 16 13:11:35.600289 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.599505 2571 manager.go:324] Recovery completed
Apr 16 13:11:35.601514 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:35.601484 2571 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 16 13:11:35.604731 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.604719 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:11:35.608456 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.608442 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-234.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:11:35.608525 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.608467 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-234.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:11:35.608525 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.608477 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-234.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:11:35.608988 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.608973 2571 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 13:11:35.608988 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.608986 2571 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 13:11:35.609091 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.609000 2571 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 13:11:35.610905 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.610893 2571 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-141-234.ec2.internal" not found
Apr 16 13:11:35.611578 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.611566 2571 policy_none.go:49] "None policy: Start"
Apr 16 13:11:35.611613 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.611583 2571 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 13:11:35.611613 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.611592 2571 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 13:11:35.649398 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.649385 2571 manager.go:341] "Starting Device Plugin manager"
Apr 16 13:11:35.664222 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:35.649416 2571 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 13:11:35.664222 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.649425 2571 server.go:85] "Starting device plugin registration server"
Apr 16 13:11:35.664222 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.649636 2571 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 13:11:35.664222 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.649649 2571 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 13:11:35.664222 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.649724 2571 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 13:11:35.664222 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.649814 2571 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 13:11:35.664222 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.649825 2571 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 13:11:35.664222 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:35.650274 2571 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 13:11:35.664222 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:35.650310 2571 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-141-234.ec2.internal\" not found"
Apr 16 13:11:35.666509 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.666496 2571 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-141-234.ec2.internal" not found
Apr 16 13:11:35.718243 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.718214 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 13:11:35.719331 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.719317 2571 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 13:11:35.719399 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.719340 2571 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 13:11:35.719399 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.719355 2571 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 13:11:35.719399 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.719362 2571 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 13:11:35.719399 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:35.719390 2571 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 13:11:35.721588 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.721572 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:11:35.750139 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.750093 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:11:35.750841 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.750822 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-234.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:11:35.750938 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.750854 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-234.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:11:35.750938 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.750881 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-234.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:11:35.750938 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.750902 2571 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-141-234.ec2.internal"
Apr 16 13:11:35.759332 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.759317 2571 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-141-234.ec2.internal"
Apr 16 13:11:35.759398 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:35.759336 2571 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-141-234.ec2.internal\": node \"ip-10-0-141-234.ec2.internal\" not found"
Apr 16 13:11:35.770988 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:35.770968 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-234.ec2.internal\" not found"
Apr 16 13:11:35.819734 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.819705 2571 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-234.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-141-234.ec2.internal"]
Apr 16 13:11:35.819824 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.819803 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:11:35.820538 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.820520 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-234.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:11:35.820614 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.820547 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-234.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:11:35.820614 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.820561 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-234.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:11:35.821716 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.821704 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:11:35.821857 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.821843 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-234.ec2.internal"
Apr 16 13:11:35.821927 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.821888 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:11:35.822358 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.822341 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-234.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:11:35.822450 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.822345 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-234.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:11:35.822450 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.822401 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-234.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:11:35.822450 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.822417 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-234.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:11:35.822450 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.822372 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-234.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:11:35.822642 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.822454 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-234.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:11:35.824145 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.824128 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-234.ec2.internal"
Apr 16 13:11:35.824222 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.824160 2571 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:11:35.824807 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.824793 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-234.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:11:35.824900 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.824819 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-234.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:11:35.824900 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.824831 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-234.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:11:35.847557 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:35.847540 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-234.ec2.internal\" not found" node="ip-10-0-141-234.ec2.internal"
Apr 16 13:11:35.851950 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:35.851937 2571 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-141-234.ec2.internal\" not found" node="ip-10-0-141-234.ec2.internal"
Apr 16 13:11:35.871075 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:35.871062 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-234.ec2.internal\" not found"
Apr 16 13:11:35.891270 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.891254 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/beda0b0d4b0cc2f07c0f706b0c22c573-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-234.ec2.internal\" (UID: \"beda0b0d4b0cc2f07c0f706b0c22c573\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-234.ec2.internal"
Apr 16 13:11:35.891320 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.891278 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/beda0b0d4b0cc2f07c0f706b0c22c573-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-234.ec2.internal\" (UID: \"beda0b0d4b0cc2f07c0f706b0c22c573\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-234.ec2.internal"
Apr 16 13:11:35.891320 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.891302 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/01d54e665ffc42a9ee67fe77b01ddd10-config\") pod \"kube-apiserver-proxy-ip-10-0-141-234.ec2.internal\" (UID: \"01d54e665ffc42a9ee67fe77b01ddd10\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-234.ec2.internal"
Apr 16 13:11:35.971338 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:35.971320 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-234.ec2.internal\" not found"
Apr 16 13:11:35.991730 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.991714 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName:
\"kubernetes.io/host-path/beda0b0d4b0cc2f07c0f706b0c22c573-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-234.ec2.internal\" (UID: \"beda0b0d4b0cc2f07c0f706b0c22c573\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-234.ec2.internal" Apr 16 13:11:35.991776 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.991738 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/beda0b0d4b0cc2f07c0f706b0c22c573-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-234.ec2.internal\" (UID: \"beda0b0d4b0cc2f07c0f706b0c22c573\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-234.ec2.internal" Apr 16 13:11:35.991776 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.991753 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/01d54e665ffc42a9ee67fe77b01ddd10-config\") pod \"kube-apiserver-proxy-ip-10-0-141-234.ec2.internal\" (UID: \"01d54e665ffc42a9ee67fe77b01ddd10\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-234.ec2.internal" Apr 16 13:11:35.991835 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.991795 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/beda0b0d4b0cc2f07c0f706b0c22c573-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-141-234.ec2.internal\" (UID: \"beda0b0d4b0cc2f07c0f706b0c22c573\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-234.ec2.internal" Apr 16 13:11:35.991835 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.991811 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/beda0b0d4b0cc2f07c0f706b0c22c573-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-141-234.ec2.internal\" (UID: \"beda0b0d4b0cc2f07c0f706b0c22c573\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-234.ec2.internal" Apr 16 13:11:35.991906 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:35.991795 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/01d54e665ffc42a9ee67fe77b01ddd10-config\") pod \"kube-apiserver-proxy-ip-10-0-141-234.ec2.internal\" (UID: \"01d54e665ffc42a9ee67fe77b01ddd10\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-141-234.ec2.internal" Apr 16 13:11:36.072156 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:36.072106 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-234.ec2.internal\" not found" Apr 16 13:11:36.151581 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:36.151556 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-234.ec2.internal" Apr 16 13:11:36.155253 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:36.155214 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-234.ec2.internal" Apr 16 13:11:36.173050 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:36.173027 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-234.ec2.internal\" not found" Apr 16 13:11:36.273548 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:36.273519 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-234.ec2.internal\" not found" Apr 16 13:11:36.374095 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:36.374039 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-234.ec2.internal\" not found" Apr 16 13:11:36.474638 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:36.474613 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-234.ec2.internal\" not found" Apr 16 13:11:36.494001 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:36.493986 2571 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 13:11:36.494419 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:36.494104 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 13:11:36.494419 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:36.494133 2571 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 13:11:36.575675 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:36.575647 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-234.ec2.internal\" not found" Apr 16 13:11:36.580808 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:36.580777 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 13:06:35 +0000 UTC" deadline="2027-11-29 11:24:56.763631882 +0000 UTC" Apr 16 13:11:36.580808 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:36.580805 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14206h13m20.182830631s" Apr 16 13:11:36.588960 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:36.588940 2571 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 13:11:36.605926 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:36.605904 2571 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 13:11:36.628130 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:36.628087 2571 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-m79gq" Apr 16 13:11:36.633430 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:36.633411 2571 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-m79gq" Apr 16 13:11:36.647167 ip-10-0-141-234 kubenswrapper[2571]: W0416 
13:11:36.647135 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbeda0b0d4b0cc2f07c0f706b0c22c573.slice/crio-a4dfa088c435fbf2f561a1e97dc6a9a83e6c2160a7dd945b292e335c5653e5dc WatchSource:0}: Error finding container a4dfa088c435fbf2f561a1e97dc6a9a83e6c2160a7dd945b292e335c5653e5dc: Status 404 returned error can't find the container with id a4dfa088c435fbf2f561a1e97dc6a9a83e6c2160a7dd945b292e335c5653e5dc Apr 16 13:11:36.649977 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:36.649959 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01d54e665ffc42a9ee67fe77b01ddd10.slice/crio-9c990d79390544c94274ada5b2fdada5e654ccbe937c43e38e3936c024d9732e WatchSource:0}: Error finding container 9c990d79390544c94274ada5b2fdada5e654ccbe937c43e38e3936c024d9732e: Status 404 returned error can't find the container with id 9c990d79390544c94274ada5b2fdada5e654ccbe937c43e38e3936c024d9732e Apr 16 13:11:36.651654 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:36.651641 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 13:11:36.676042 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:36.676021 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-234.ec2.internal\" not found" Apr 16 13:11:36.722124 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:36.722086 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-234.ec2.internal" event={"ID":"01d54e665ffc42a9ee67fe77b01ddd10","Type":"ContainerStarted","Data":"9c990d79390544c94274ada5b2fdada5e654ccbe937c43e38e3936c024d9732e"} Apr 16 13:11:36.723022 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:36.723003 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-234.ec2.internal" event={"ID":"beda0b0d4b0cc2f07c0f706b0c22c573","Type":"ContainerStarted","Data":"a4dfa088c435fbf2f561a1e97dc6a9a83e6c2160a7dd945b292e335c5653e5dc"} Apr 16 13:11:36.776108 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:36.776087 2571 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-141-234.ec2.internal\" not found" Apr 16 13:11:36.870998 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:36.870979 2571 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:11:36.889125 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:36.889072 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-234.ec2.internal" Apr 16 13:11:36.899471 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:36.899452 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 13:11:36.901420 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:36.901408 2571 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-234.ec2.internal" Apr 16 13:11:36.909078 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:36.909063 2571 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 
13:11:37.133337 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.133307 2571 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:11:37.565075 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.565046 2571 apiserver.go:52] "Watching apiserver" Apr 16 13:11:37.572688 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.572667 2571 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 13:11:37.573096 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.573073 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-nrklc","openshift-network-operator/iptables-alerter-q87wh","openshift-ovn-kubernetes/ovnkube-node-hdhfq","openshift-cluster-node-tuning-operator/tuned-qcgw5","openshift-dns/node-resolver-sprnh","openshift-multus/multus-sxgjd","openshift-multus/network-metrics-daemon-h8fnx","openshift-network-diagnostics/network-check-target-wm9fg","kube-system/konnectivity-agent-tpklv","kube-system/kube-apiserver-proxy-ip-10-0-141-234.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk","openshift-image-registry/node-ca-njx5f","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-234.ec2.internal"] Apr 16 13:11:37.575100 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.575079 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8fnx" Apr 16 13:11:37.575169 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:37.575148 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8fnx" podUID="f57b15af-9441-4822-9c41-048d94ab4c1a" Apr 16 13:11:37.576343 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.576325 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-q87wh" Apr 16 13:11:37.577587 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.577570 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.578608 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.578577 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 13:11:37.578740 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.578716 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 13:11:37.578740 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.578730 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:11:37.578740 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.578737 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-72fwf\"" Apr 16 13:11:37.579996 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.579971 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 13:11:37.579996 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.579978 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 13:11:37.580202 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.580034 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 13:11:37.580202 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.580036 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 13:11:37.580305 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.580206 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.580494 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.580356 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sprnh" Apr 16 13:11:37.581083 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.581065 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 13:11:37.581178 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.581144 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-mmsmj\"" Apr 16 13:11:37.581178 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.581161 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 13:11:37.581857 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.581829 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.582386 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.582367 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 13:11:37.582981 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.582961 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-pn8p4\"" Apr 16 13:11:37.583079 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.582996 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-nfhhc\"" Apr 16 13:11:37.583079 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.582963 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:11:37.583079 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.583073 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 13:11:37.583252 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.583083 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 13:11:37.583396 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.583370 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nrklc" Apr 16 13:11:37.584135 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.583832 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 13:11:37.584135 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.583834 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 13:11:37.584135 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.583917 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 13:11:37.584364 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.584187 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 13:11:37.584364 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.584191 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-d598s\"" Apr 16 13:11:37.585655 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.585638 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 13:11:37.585738 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.585701 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 13:11:37.586328 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.586289 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wm9fg" Apr 16 13:11:37.586430 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:37.586347 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wm9fg" podUID="2db539fc-020c-4e5c-8585-e7acc930a358" Apr 16 13:11:37.586430 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.586366 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-f5965\"" Apr 16 13:11:37.586532 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.586454 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-tpklv" Apr 16 13:11:37.589504 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.588906 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 13:11:37.589504 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.589158 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 13:11:37.589504 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.589388 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-9d5jt\"" Apr 16 13:11:37.592098 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.592078 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-njx5f" Apr 16 13:11:37.592236 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.592218 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" Apr 16 13:11:37.594118 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.594096 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 13:11:37.594375 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.594325 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 13:11:37.594454 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.594430 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-xq9cv\"" Apr 16 13:11:37.594894 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.594638 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-p8gp2\"" Apr 16 13:11:37.594894 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.594652 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 13:11:37.594894 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.594702 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 13:11:37.594894 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.594770 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 13:11:37.594894 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.594834 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 13:11:37.600922 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.600903 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-host-run-netns\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.601014 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.600934 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs\") pod \"network-metrics-daemon-h8fnx\" (UID: \"f57b15af-9441-4822-9c41-048d94ab4c1a\") " pod="openshift-multus/network-metrics-daemon-h8fnx" Apr 16 13:11:37.601014 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.600993 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-log-socket\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.601119 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601027 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-run\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 
13:11:37.601119 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601054 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-multus-conf-dir\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.601119 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601078 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-etc-kubernetes\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.601119 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601100 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8921cd74-6824-4c24-896a-5c649cefc5da-cni-binary-copy\") pod \"multus-additional-cni-plugins-nrklc\" (UID: \"8921cd74-6824-4c24-896a-5c649cefc5da\") " pod="openshift-multus/multus-additional-cni-plugins-nrklc" Apr 16 13:11:37.601350 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601125 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8921cd74-6824-4c24-896a-5c649cefc5da-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-nrklc\" (UID: \"8921cd74-6824-4c24-896a-5c649cefc5da\") " pod="openshift-multus/multus-additional-cni-plugins-nrklc" Apr 16 13:11:37.601350 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601150 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5c95c4a9-7823-46cc-a24b-b1acc154ea64-sys-fs\") pod \"aws-ebs-csi-driver-node-922jk\" (UID: \"5c95c4a9-7823-46cc-a24b-b1acc154ea64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" Apr 16 13:11:37.601350 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601174 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-run-systemd\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.601350 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601198 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-etc-sysconfig\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.601350 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601223 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-host\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.601350 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601246 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-cnibin\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.601350 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601270 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krdx5\" (UniqueName: \"kubernetes.io/projected/5c95c4a9-7823-46cc-a24b-b1acc154ea64-kube-api-access-krdx5\") pod \"aws-ebs-csi-driver-node-922jk\" (UID: \"5c95c4a9-7823-46cc-a24b-b1acc154ea64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" Apr 16 13:11:37.601350 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601296 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c031fad3-56b2-4030-9fb7-11cd3421145d-ovnkube-script-lib\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.601350 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601320 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-var-lib-kubelet\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.601350 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601350 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-system-cni-dir\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.601800 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601375 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-host-var-lib-cni-multus\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.601800 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601398 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-host-run-multus-certs\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.601800 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601441 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8921cd74-6824-4c24-896a-5c649cefc5da-os-release\") pod \"multus-additional-cni-plugins-nrklc\" (UID: \"8921cd74-6824-4c24-896a-5c649cefc5da\") " pod="openshift-multus/multus-additional-cni-plugins-nrklc" Apr 16 13:11:37.601800 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601498 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/8921cd74-6824-4c24-896a-5c649cefc5da-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nrklc\" (UID: \"8921cd74-6824-4c24-896a-5c649cefc5da\") " pod="openshift-multus/multus-additional-cni-plugins-nrklc" Apr 16 13:11:37.601800 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601553 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5c95c4a9-7823-46cc-a24b-b1acc154ea64-etc-selinux\") pod \"aws-ebs-csi-driver-node-922jk\" (UID: \"5c95c4a9-7823-46cc-a24b-b1acc154ea64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" Apr 16 13:11:37.601800 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601606 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c031fad3-56b2-4030-9fb7-11cd3421145d-ovnkube-config\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.601800 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601646 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-etc-kubernetes\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.601800 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601673 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-sys\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.601800 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601700 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhxf8\" (UniqueName: \"kubernetes.io/projected/a7298d51-6776-4683-9b5e-23167bbd1794-kube-api-access-bhxf8\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.601800 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601725 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4h5x\" (UniqueName: \"kubernetes.io/projected/38405315-7b9b-4c43-82bd-042c8486a193-kube-api-access-l4h5x\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.601800 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601767 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8921cd74-6824-4c24-896a-5c649cefc5da-system-cni-dir\") pod \"multus-additional-cni-plugins-nrklc\" (UID: \"8921cd74-6824-4c24-896a-5c649cefc5da\") " pod="openshift-multus/multus-additional-cni-plugins-nrklc" Apr 16 13:11:37.601800 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601791 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1b2f0cf7-6280-412b-b2df-63712a2a8772-konnectivity-ca\") pod 
\"konnectivity-agent-tpklv\" (UID: \"1b2f0cf7-6280-412b-b2df-63712a2a8772\") " pod="kube-system/konnectivity-agent-tpklv" Apr 16 13:11:37.602408 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601813 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5c95c4a9-7823-46cc-a24b-b1acc154ea64-registration-dir\") pod \"aws-ebs-csi-driver-node-922jk\" (UID: \"5c95c4a9-7823-46cc-a24b-b1acc154ea64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" Apr 16 13:11:37.602408 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601836 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9bst\" (UniqueName: \"kubernetes.io/projected/8921cd74-6824-4c24-896a-5c649cefc5da-kube-api-access-p9bst\") pod \"multus-additional-cni-plugins-nrklc\" (UID: \"8921cd74-6824-4c24-896a-5c649cefc5da\") " pod="openshift-multus/multus-additional-cni-plugins-nrklc" Apr 16 13:11:37.602408 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601886 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c031fad3-56b2-4030-9fb7-11cd3421145d-env-overrides\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.602408 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601908 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a7298d51-6776-4683-9b5e-23167bbd1794-etc-tuned\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.602408 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601942 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r847q\" (UniqueName: \"kubernetes.io/projected/f57b15af-9441-4822-9c41-048d94ab4c1a-kube-api-access-r847q\") pod \"network-metrics-daemon-h8fnx\" (UID: \"f57b15af-9441-4822-9c41-048d94ab4c1a\") " pod="openshift-multus/network-metrics-daemon-h8fnx" Apr 16 13:11:37.602408 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.601969 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/446804f0-271f-479a-89e6-b8b25ec2e701-tmp-dir\") pod \"node-resolver-sprnh\" (UID: \"446804f0-271f-479a-89e6-b8b25ec2e701\") " pod="openshift-dns/node-resolver-sprnh" Apr 16 13:11:37.602408 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602009 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-etc-sysctl-conf\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.602408 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602051 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-multus-socket-dir-parent\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " 
pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.602408 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602081 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5c95c4a9-7823-46cc-a24b-b1acc154ea64-socket-dir\") pod \"aws-ebs-csi-driver-node-922jk\" (UID: \"5c95c4a9-7823-46cc-a24b-b1acc154ea64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" Apr 16 13:11:37.602408 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602104 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-host-run-netns\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.602408 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602128 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-run-ovn\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.602408 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602149 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-host-cni-bin\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.602408 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602166 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-host-cni-netd\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.602408 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602187 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-978zg\" (UniqueName: \"kubernetes.io/projected/5094bf87-3c60-48ea-8c6b-c241aeb55c29-kube-api-access-978zg\") pod \"iptables-alerter-q87wh\" (UID: \"5094bf87-3c60-48ea-8c6b-c241aeb55c29\") " pod="openshift-network-operator/iptables-alerter-q87wh" Apr 16 13:11:37.602408 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602209 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/38405315-7b9b-4c43-82bd-042c8486a193-cni-binary-copy\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.602408 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602231 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-host-run-k8s-cni-cncf-io\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.603042 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602257 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-run-openvswitch\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.603042 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602279 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-etc-modprobe-d\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.603042 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602301 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/38405315-7b9b-4c43-82bd-042c8486a193-multus-daemon-config\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.603042 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602323 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-os-release\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.603042 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602368 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-host-slash\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.603042 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602418 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-node-log\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.603042 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602452 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/446804f0-271f-479a-89e6-b8b25ec2e701-hosts-file\") pod \"node-resolver-sprnh\" (UID: \"446804f0-271f-479a-89e6-b8b25ec2e701\") " pod="openshift-dns/node-resolver-sprnh" Apr 16 13:11:37.603042 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602480 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2888\" (UniqueName: \"kubernetes.io/projected/2db539fc-020c-4e5c-8585-e7acc930a358-kube-api-access-x2888\") pod \"network-check-target-wm9fg\" (UID: \"2db539fc-020c-4e5c-8585-e7acc930a358\") " pod="openshift-network-diagnostics/network-check-target-wm9fg" Apr 16 13:11:37.603042 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602524 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5094bf87-3c60-48ea-8c6b-c241aeb55c29-iptables-alerter-script\") pod 
\"iptables-alerter-q87wh\" (UID: \"5094bf87-3c60-48ea-8c6b-c241aeb55c29\") " pod="openshift-network-operator/iptables-alerter-q87wh" Apr 16 13:11:37.603042 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602542 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-hostroot\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.603042 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602556 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-systemd-units\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.603042 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602570 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c031fad3-56b2-4030-9fb7-11cd3421145d-ovn-node-metrics-cert\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.603042 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602584 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6rfz\" (UniqueName: \"kubernetes.io/projected/c031fad3-56b2-4030-9fb7-11cd3421145d-kube-api-access-b6rfz\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.603042 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602600 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8921cd74-6824-4c24-896a-5c649cefc5da-cnibin\") pod \"multus-additional-cni-plugins-nrklc\" (UID: \"8921cd74-6824-4c24-896a-5c649cefc5da\") " pod="openshift-multus/multus-additional-cni-plugins-nrklc" Apr 16 13:11:37.603042 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602624 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-var-lib-openvswitch\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.603042 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602646 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-etc-systemd\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.603602 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602672 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-lib-modules\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.603602 ip-10-0-141-234 
kubenswrapper[2571]: I0416 13:11:37.602696 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a7298d51-6776-4683-9b5e-23167bbd1794-tmp\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.603602 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602711 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1b2f0cf7-6280-412b-b2df-63712a2a8772-agent-certs\") pod \"konnectivity-agent-tpklv\" (UID: \"1b2f0cf7-6280-412b-b2df-63712a2a8772\") " pod="kube-system/konnectivity-agent-tpklv" Apr 16 13:11:37.603602 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602725 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c95c4a9-7823-46cc-a24b-b1acc154ea64-kubelet-dir\") pod \"aws-ebs-csi-driver-node-922jk\" (UID: \"5c95c4a9-7823-46cc-a24b-b1acc154ea64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" Apr 16 13:11:37.603602 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602756 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-etc-openvswitch\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.603602 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602799 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5094bf87-3c60-48ea-8c6b-c241aeb55c29-host-slash\") pod \"iptables-alerter-q87wh\" (UID: \"5094bf87-3c60-48ea-8c6b-c241aeb55c29\") " pod="openshift-network-operator/iptables-alerter-q87wh" Apr 16 13:11:37.603602 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602823 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8921cd74-6824-4c24-896a-5c649cefc5da-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nrklc\" (UID: \"8921cd74-6824-4c24-896a-5c649cefc5da\") " pod="openshift-multus/multus-additional-cni-plugins-nrklc" Apr 16 13:11:37.603602 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602843 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-host-kubelet\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.603602 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602883 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-host-var-lib-cni-bin\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.603602 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602909 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-host-var-lib-kubelet\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.603602 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602933 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5c95c4a9-7823-46cc-a24b-b1acc154ea64-device-dir\") pod \"aws-ebs-csi-driver-node-922jk\" (UID: \"5c95c4a9-7823-46cc-a24b-b1acc154ea64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" Apr 16 13:11:37.603602 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602955 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-host-run-ovn-kubernetes\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.603602 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.602979 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.603602 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.603004 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lnn4\" (UniqueName: \"kubernetes.io/projected/446804f0-271f-479a-89e6-b8b25ec2e701-kube-api-access-7lnn4\") pod \"node-resolver-sprnh\" (UID: \"446804f0-271f-479a-89e6-b8b25ec2e701\") " pod="openshift-dns/node-resolver-sprnh" Apr 16 13:11:37.603602 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.603027 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-etc-sysctl-d\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.603602 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.603048 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-multus-cni-dir\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.634306 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.634277 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:06:36 +0000 UTC" deadline="2027-10-10 07:31:12.691034328 +0000 UTC" Apr 16 13:11:37.634409 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.634306 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13002h19m35.056731781s" Apr 16 13:11:37.659735 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.659702 2571 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:11:37.690137 
ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.690119 2571 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 13:11:37.704147 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704119 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-systemd-units\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.704255 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704155 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c031fad3-56b2-4030-9fb7-11cd3421145d-ovn-node-metrics-cert\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.704255 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704181 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6rfz\" (UniqueName: \"kubernetes.io/projected/c031fad3-56b2-4030-9fb7-11cd3421145d-kube-api-access-b6rfz\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.704255 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704204 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8921cd74-6824-4c24-896a-5c649cefc5da-cnibin\") pod \"multus-additional-cni-plugins-nrklc\" (UID: \"8921cd74-6824-4c24-896a-5c649cefc5da\") " pod="openshift-multus/multus-additional-cni-plugins-nrklc" Apr 16 13:11:37.704255 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704228 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-var-lib-openvswitch\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.704255 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704251 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-etc-systemd\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.704508 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704254 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-systemd-units\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.704508 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704274 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-lib-modules\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.704508 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704298 2571 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a7298d51-6776-4683-9b5e-23167bbd1794-tmp\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.704508 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704307 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-var-lib-openvswitch\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.704508 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704326 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/03556f6f-6803-432c-8012-c40eb6e388ad-serviceca\") pod \"node-ca-njx5f\" (UID: \"03556f6f-6803-432c-8012-c40eb6e388ad\") " pod="openshift-image-registry/node-ca-njx5f" Apr 16 13:11:37.704508 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704334 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8921cd74-6824-4c24-896a-5c649cefc5da-cnibin\") pod \"multus-additional-cni-plugins-nrklc\" (UID: \"8921cd74-6824-4c24-896a-5c649cefc5da\") " pod="openshift-multus/multus-additional-cni-plugins-nrklc" Apr 16 13:11:37.704508 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704352 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1b2f0cf7-6280-412b-b2df-63712a2a8772-agent-certs\") pod \"konnectivity-agent-tpklv\" (UID: \"1b2f0cf7-6280-412b-b2df-63712a2a8772\") " pod="kube-system/konnectivity-agent-tpklv" Apr 16 13:11:37.704508 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704376 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c95c4a9-7823-46cc-a24b-b1acc154ea64-kubelet-dir\") pod \"aws-ebs-csi-driver-node-922jk\" (UID: \"5c95c4a9-7823-46cc-a24b-b1acc154ea64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" Apr 16 13:11:37.704508 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704402 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-etc-openvswitch\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.704508 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704424 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5094bf87-3c60-48ea-8c6b-c241aeb55c29-host-slash\") pod \"iptables-alerter-q87wh\" (UID: \"5094bf87-3c60-48ea-8c6b-c241aeb55c29\") " pod="openshift-network-operator/iptables-alerter-q87wh" Apr 16 13:11:37.704508 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704437 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-lib-modules\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.704508 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704449 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8921cd74-6824-4c24-896a-5c649cefc5da-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nrklc\" (UID: \"8921cd74-6824-4c24-896a-5c649cefc5da\") " pod="openshift-multus/multus-additional-cni-plugins-nrklc" Apr 16 13:11:37.704508 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704474 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-host-kubelet\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.704508 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704473 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-etc-systemd\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.704508 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704500 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-host-var-lib-cni-bin\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.705233 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704529 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-host-var-lib-kubelet\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.705233 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704542 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5c95c4a9-7823-46cc-a24b-b1acc154ea64-kubelet-dir\") pod \"aws-ebs-csi-driver-node-922jk\" (UID: \"5c95c4a9-7823-46cc-a24b-b1acc154ea64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" Apr 16 13:11:37.705233 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704556 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5c95c4a9-7823-46cc-a24b-b1acc154ea64-device-dir\") pod \"aws-ebs-csi-driver-node-922jk\" (UID: \"5c95c4a9-7823-46cc-a24b-b1acc154ea64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" Apr 16 13:11:37.705233 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704599 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-host-run-ovn-kubernetes\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.705233 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704600 2571 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 13:11:37.705233 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704633 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.705233 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704659 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-host-var-lib-kubelet\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.705233 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704661 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lnn4\" (UniqueName: \"kubernetes.io/projected/446804f0-271f-479a-89e6-b8b25ec2e701-kube-api-access-7lnn4\") pod \"node-resolver-sprnh\" (UID: \"446804f0-271f-479a-89e6-b8b25ec2e701\") " pod="openshift-dns/node-resolver-sprnh" Apr 16 13:11:37.705233 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704704 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-etc-sysctl-d\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.705233 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704731 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-multus-cni-dir\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.705233 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704742 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5c95c4a9-7823-46cc-a24b-b1acc154ea64-device-dir\") pod \"aws-ebs-csi-driver-node-922jk\" (UID: \"5c95c4a9-7823-46cc-a24b-b1acc154ea64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" Apr 16 13:11:37.705233 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704755 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-host-run-netns\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.705233 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704780 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs\") pod \"network-metrics-daemon-h8fnx\" (UID: \"f57b15af-9441-4822-9c41-048d94ab4c1a\") " pod="openshift-multus/network-metrics-daemon-h8fnx" Apr 16 13:11:37.705233 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704806 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-etc-openvswitch\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.705233 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704823 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-log-socket\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.705233 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704848 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5094bf87-3c60-48ea-8c6b-c241aeb55c29-host-slash\") pod \"iptables-alerter-q87wh\" (UID: \"5094bf87-3c60-48ea-8c6b-c241aeb55c29\") " pod="openshift-network-operator/iptables-alerter-q87wh" Apr 16 13:11:37.705233 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704850 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-run\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.705233 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704911 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-multus-conf-dir\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.706100 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704942 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-etc-kubernetes\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.706100 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704957 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-host-run-netns\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.706100 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704914 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-run\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.706100 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705012 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-etc-kubernetes\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.706100 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705019 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-log-socket\") pod 
\"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.706100 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705053 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-multus-cni-dir\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.706100 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705061 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-host-run-ovn-kubernetes\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.706100 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705099 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-host-kubelet\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.706100 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705134 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-multus-conf-dir\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.706100 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:37.705137 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:37.706100 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705172 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.706100 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704969 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8921cd74-6824-4c24-896a-5c649cefc5da-cni-binary-copy\") pod \"multus-additional-cni-plugins-nrklc\" (UID: \"8921cd74-6824-4c24-896a-5c649cefc5da\") " pod="openshift-multus/multus-additional-cni-plugins-nrklc" Apr 16 13:11:37.706100 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:37.705233 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs podName:f57b15af-9441-4822-9c41-048d94ab4c1a nodeName:}" failed. No retries permitted until 2026-04-16 13:11:38.205205219 +0000 UTC m=+3.087561288 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs") pod "network-metrics-daemon-h8fnx" (UID: "f57b15af-9441-4822-9c41-048d94ab4c1a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:37.706100 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705252 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8921cd74-6824-4c24-896a-5c649cefc5da-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-nrklc\" (UID: \"8921cd74-6824-4c24-896a-5c649cefc5da\") " pod="openshift-multus/multus-additional-cni-plugins-nrklc" Apr 16 13:11:37.706100 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705278 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5c95c4a9-7823-46cc-a24b-b1acc154ea64-sys-fs\") pod \"aws-ebs-csi-driver-node-922jk\" (UID: \"5c95c4a9-7823-46cc-a24b-b1acc154ea64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" Apr 16 13:11:37.706100 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705297 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-run-systemd\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.706100 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705316 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-etc-sysconfig\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.706945 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705335 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-host\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.706945 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705353 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-cnibin\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.706945 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705374 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-krdx5\" (UniqueName: \"kubernetes.io/projected/5c95c4a9-7823-46cc-a24b-b1acc154ea64-kube-api-access-krdx5\") pod \"aws-ebs-csi-driver-node-922jk\" (UID: \"5c95c4a9-7823-46cc-a24b-b1acc154ea64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" Apr 16 13:11:37.706945 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705392 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5c95c4a9-7823-46cc-a24b-b1acc154ea64-sys-fs\") pod \"aws-ebs-csi-driver-node-922jk\" (UID: \"5c95c4a9-7823-46cc-a24b-b1acc154ea64\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" Apr 16 13:11:37.706945 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705399 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c031fad3-56b2-4030-9fb7-11cd3421145d-ovnkube-script-lib\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.706945 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705424 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-var-lib-kubelet\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.706945 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705427 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-etc-sysctl-d\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.706945 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705449 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-system-cni-dir\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.706945 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705473 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-host-var-lib-cni-multus\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.706945 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705475 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8921cd74-6824-4c24-896a-5c649cefc5da-cni-binary-copy\") pod \"multus-additional-cni-plugins-nrklc\" (UID: \"8921cd74-6824-4c24-896a-5c649cefc5da\") " pod="openshift-multus/multus-additional-cni-plugins-nrklc" Apr 16 13:11:37.706945 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705495 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-host-run-multus-certs\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.706945 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705523 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8921cd74-6824-4c24-896a-5c649cefc5da-os-release\") pod \"multus-additional-cni-plugins-nrklc\" (UID: \"8921cd74-6824-4c24-896a-5c649cefc5da\") " pod="openshift-multus/multus-additional-cni-plugins-nrklc" Apr 16 13:11:37.706945 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705530 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-run-systemd\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.706945 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705546 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8921cd74-6824-4c24-896a-5c649cefc5da-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nrklc\" (UID: \"8921cd74-6824-4c24-896a-5c649cefc5da\") " pod="openshift-multus/multus-additional-cni-plugins-nrklc" Apr 16 13:11:37.706945 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705569 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5c95c4a9-7823-46cc-a24b-b1acc154ea64-etc-selinux\") pod \"aws-ebs-csi-driver-node-922jk\" (UID: \"5c95c4a9-7823-46cc-a24b-b1acc154ea64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" Apr 16 13:11:37.706945 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705577 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-host\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.706945 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705585 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-system-cni-dir\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.707844 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705592 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c031fad3-56b2-4030-9fb7-11cd3421145d-ovnkube-config\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.707844 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705617 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-etc-kubernetes\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.707844 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705630 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-etc-sysconfig\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.707844 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705637 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-sys\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.707844 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705657 2571 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-bhxf8\" (UniqueName: \"kubernetes.io/projected/a7298d51-6776-4683-9b5e-23167bbd1794-kube-api-access-bhxf8\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.707844 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705678 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4h5x\" (UniqueName: \"kubernetes.io/projected/38405315-7b9b-4c43-82bd-042c8486a193-kube-api-access-l4h5x\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.707844 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705702 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8921cd74-6824-4c24-896a-5c649cefc5da-system-cni-dir\") pod \"multus-additional-cni-plugins-nrklc\" (UID: \"8921cd74-6824-4c24-896a-5c649cefc5da\") " pod="openshift-multus/multus-additional-cni-plugins-nrklc" Apr 16 13:11:37.707844 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705725 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1b2f0cf7-6280-412b-b2df-63712a2a8772-konnectivity-ca\") pod \"konnectivity-agent-tpklv\" (UID: \"1b2f0cf7-6280-412b-b2df-63712a2a8772\") " pod="kube-system/konnectivity-agent-tpklv" Apr 16 13:11:37.707844 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705744 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5c95c4a9-7823-46cc-a24b-b1acc154ea64-registration-dir\") pod \"aws-ebs-csi-driver-node-922jk\" (UID: \"5c95c4a9-7823-46cc-a24b-b1acc154ea64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" Apr 16 13:11:37.707844 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705767 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p9bst\" (UniqueName: \"kubernetes.io/projected/8921cd74-6824-4c24-896a-5c649cefc5da-kube-api-access-p9bst\") pod \"multus-additional-cni-plugins-nrklc\" (UID: \"8921cd74-6824-4c24-896a-5c649cefc5da\") " pod="openshift-multus/multus-additional-cni-plugins-nrklc" Apr 16 13:11:37.707844 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705791 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c031fad3-56b2-4030-9fb7-11cd3421145d-env-overrides\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.707844 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705814 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a7298d51-6776-4683-9b5e-23167bbd1794-etc-tuned\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.707844 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705818 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-etc-kubernetes\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " 
pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.707844 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705841 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjphm\" (UniqueName: \"kubernetes.io/projected/03556f6f-6803-432c-8012-c40eb6e388ad-kube-api-access-hjphm\") pod \"node-ca-njx5f\" (UID: \"03556f6f-6803-432c-8012-c40eb6e388ad\") " pod="openshift-image-registry/node-ca-njx5f" Apr 16 13:11:37.707844 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705846 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/8921cd74-6824-4c24-896a-5c649cefc5da-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-nrklc\" (UID: \"8921cd74-6824-4c24-896a-5c649cefc5da\") " pod="openshift-multus/multus-additional-cni-plugins-nrklc" Apr 16 13:11:37.707844 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705887 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r847q\" (UniqueName: \"kubernetes.io/projected/f57b15af-9441-4822-9c41-048d94ab4c1a-kube-api-access-r847q\") pod \"network-metrics-daemon-h8fnx\" (UID: \"f57b15af-9441-4822-9c41-048d94ab4c1a\") " pod="openshift-multus/network-metrics-daemon-h8fnx" Apr 16 13:11:37.707844 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705927 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/446804f0-271f-479a-89e6-b8b25ec2e701-tmp-dir\") pod \"node-resolver-sprnh\" (UID: \"446804f0-271f-479a-89e6-b8b25ec2e701\") " pod="openshift-dns/node-resolver-sprnh" Apr 16 13:11:37.708664 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705942 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-host-run-multus-certs\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.708664 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705951 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-etc-sysctl-conf\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.708664 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705978 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-multus-socket-dir-parent\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.708664 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706004 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5c95c4a9-7823-46cc-a24b-b1acc154ea64-socket-dir\") pod \"aws-ebs-csi-driver-node-922jk\" (UID: \"5c95c4a9-7823-46cc-a24b-b1acc154ea64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" Apr 16 13:11:37.708664 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706026 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/5c95c4a9-7823-46cc-a24b-b1acc154ea64-etc-selinux\") pod \"aws-ebs-csi-driver-node-922jk\" (UID: \"5c95c4a9-7823-46cc-a24b-b1acc154ea64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" Apr 16 13:11:37.708664 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706032 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-host-run-netns\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.708664 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705496 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-cnibin\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.708664 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706055 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-run-ovn\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.708664 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706078 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-host-cni-bin\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.708664 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706111 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-host-cni-netd\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.708664 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706140 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-978zg\" (UniqueName: \"kubernetes.io/projected/5094bf87-3c60-48ea-8c6b-c241aeb55c29-kube-api-access-978zg\") pod \"iptables-alerter-q87wh\" (UID: \"5094bf87-3c60-48ea-8c6b-c241aeb55c29\") " pod="openshift-network-operator/iptables-alerter-q87wh" Apr 16 13:11:37.708664 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706167 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/38405315-7b9b-4c43-82bd-042c8486a193-cni-binary-copy\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.708664 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706190 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-sys\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.708664 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706194 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-host-run-k8s-cni-cncf-io\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.708664 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706231 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c031fad3-56b2-4030-9fb7-11cd3421145d-ovnkube-script-lib\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.708664 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706240 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-host-run-k8s-cni-cncf-io\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.708664 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.705887 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-host-var-lib-cni-multus\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.708664 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706272 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-run-openvswitch\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.709394 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706300 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-run-ovn\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.709394 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706313 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-etc-modprobe-d\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.709394 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706325 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8921cd74-6824-4c24-896a-5c649cefc5da-os-release\") pod \"multus-additional-cni-plugins-nrklc\" (UID: \"8921cd74-6824-4c24-896a-5c649cefc5da\") " pod="openshift-multus/multus-additional-cni-plugins-nrklc" Apr 16 13:11:37.709394 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706340 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/38405315-7b9b-4c43-82bd-042c8486a193-multus-daemon-config\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.709394 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706343 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-host-cni-bin\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.709394 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706365 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c031fad3-56b2-4030-9fb7-11cd3421145d-ovnkube-config\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.709394 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706370 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-os-release\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.709394 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706423 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-host-cni-netd\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.709394 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706452 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-etc-sysctl-conf\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.709394 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706478 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8921cd74-6824-4c24-896a-5c649cefc5da-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nrklc\" (UID: \"8921cd74-6824-4c24-896a-5c649cefc5da\") " pod="openshift-multus/multus-additional-cni-plugins-nrklc" Apr 16 13:11:37.709394 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.704617 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-host-var-lib-cni-bin\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.709394 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706141 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-var-lib-kubelet\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.709394 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706544 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/03556f6f-6803-432c-8012-c40eb6e388ad-host\") pod \"node-ca-njx5f\" (UID: \"03556f6f-6803-432c-8012-c40eb6e388ad\") " pod="openshift-image-registry/node-ca-njx5f" Apr 16 13:11:37.709394 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706618 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-run-openvswitch\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.709394 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706618 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-multus-socket-dir-parent\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.709394 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706690 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-os-release\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.709394 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706704 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5c95c4a9-7823-46cc-a24b-b1acc154ea64-socket-dir\") pod \"aws-ebs-csi-driver-node-922jk\" (UID: \"5c95c4a9-7823-46cc-a24b-b1acc154ea64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" Apr 16 13:11:37.709394 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706725 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8921cd74-6824-4c24-896a-5c649cefc5da-system-cni-dir\") pod \"multus-additional-cni-plugins-nrklc\" (UID: \"8921cd74-6824-4c24-896a-5c649cefc5da\") " pod="openshift-multus/multus-additional-cni-plugins-nrklc" Apr 16 13:11:37.709904 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706754 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-host-run-netns\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.709904 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.706918 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/38405315-7b9b-4c43-82bd-042c8486a193-multus-daemon-config\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.709904 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.707151 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/1b2f0cf7-6280-412b-b2df-63712a2a8772-konnectivity-ca\") pod \"konnectivity-agent-tpklv\" (UID: \"1b2f0cf7-6280-412b-b2df-63712a2a8772\") " pod="kube-system/konnectivity-agent-tpklv" Apr 16 13:11:37.709904 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.707171 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c031fad3-56b2-4030-9fb7-11cd3421145d-env-overrides\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.709904 ip-10-0-141-234 kubenswrapper[2571]: I0416 
13:11:37.707177 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8921cd74-6824-4c24-896a-5c649cefc5da-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nrklc\" (UID: \"8921cd74-6824-4c24-896a-5c649cefc5da\") " pod="openshift-multus/multus-additional-cni-plugins-nrklc" Apr 16 13:11:37.709904 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.707205 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5c95c4a9-7823-46cc-a24b-b1acc154ea64-registration-dir\") pod \"aws-ebs-csi-driver-node-922jk\" (UID: \"5c95c4a9-7823-46cc-a24b-b1acc154ea64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" Apr 16 13:11:37.709904 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.707229 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-host-slash\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.709904 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.707256 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-node-log\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.709904 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.707264 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a7298d51-6776-4683-9b5e-23167bbd1794-etc-modprobe-d\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.709904 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.707281 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/446804f0-271f-479a-89e6-b8b25ec2e701-hosts-file\") pod \"node-resolver-sprnh\" (UID: \"446804f0-271f-479a-89e6-b8b25ec2e701\") " pod="openshift-dns/node-resolver-sprnh" Apr 16 13:11:37.709904 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.707329 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2888\" (UniqueName: \"kubernetes.io/projected/2db539fc-020c-4e5c-8585-e7acc930a358-kube-api-access-x2888\") pod \"network-check-target-wm9fg\" (UID: \"2db539fc-020c-4e5c-8585-e7acc930a358\") " pod="openshift-network-diagnostics/network-check-target-wm9fg" Apr 16 13:11:37.709904 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.707348 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-host-slash\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.709904 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.707357 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c031fad3-56b2-4030-9fb7-11cd3421145d-node-log\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.709904 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.707332 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/446804f0-271f-479a-89e6-b8b25ec2e701-hosts-file\") pod \"node-resolver-sprnh\" (UID: \"446804f0-271f-479a-89e6-b8b25ec2e701\") " pod="openshift-dns/node-resolver-sprnh" Apr 16 13:11:37.709904 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.707359 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5094bf87-3c60-48ea-8c6b-c241aeb55c29-iptables-alerter-script\") pod \"iptables-alerter-q87wh\" (UID: \"5094bf87-3c60-48ea-8c6b-c241aeb55c29\") " pod="openshift-network-operator/iptables-alerter-q87wh" Apr 16 13:11:37.709904 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.707402 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-hostroot\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.709904 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.707546 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/38405315-7b9b-4c43-82bd-042c8486a193-hostroot\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.709904 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.707648 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/446804f0-271f-479a-89e6-b8b25ec2e701-tmp-dir\") pod \"node-resolver-sprnh\" (UID: \"446804f0-271f-479a-89e6-b8b25ec2e701\") " pod="openshift-dns/node-resolver-sprnh" Apr 16 13:11:37.710409 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.707835 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/38405315-7b9b-4c43-82bd-042c8486a193-cni-binary-copy\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.710409 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.707888 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5094bf87-3c60-48ea-8c6b-c241aeb55c29-iptables-alerter-script\") pod \"iptables-alerter-q87wh\" (UID: \"5094bf87-3c60-48ea-8c6b-c241aeb55c29\") " pod="openshift-network-operator/iptables-alerter-q87wh" Apr 16 13:11:37.710409 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.708146 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a7298d51-6776-4683-9b5e-23167bbd1794-tmp\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.710409 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.708193 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c031fad3-56b2-4030-9fb7-11cd3421145d-ovn-node-metrics-cert\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 
13:11:37.710409 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.708786 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a7298d51-6776-4683-9b5e-23167bbd1794-etc-tuned\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.710409 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.709272 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/1b2f0cf7-6280-412b-b2df-63712a2a8772-agent-certs\") pod \"konnectivity-agent-tpklv\" (UID: \"1b2f0cf7-6280-412b-b2df-63712a2a8772\") " pod="kube-system/konnectivity-agent-tpklv" Apr 16 13:11:37.714380 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.714357 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-krdx5\" (UniqueName: \"kubernetes.io/projected/5c95c4a9-7823-46cc-a24b-b1acc154ea64-kube-api-access-krdx5\") pod \"aws-ebs-csi-driver-node-922jk\" (UID: \"5c95c4a9-7823-46cc-a24b-b1acc154ea64\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" Apr 16 13:11:37.714992 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.714959 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6rfz\" (UniqueName: \"kubernetes.io/projected/c031fad3-56b2-4030-9fb7-11cd3421145d-kube-api-access-b6rfz\") pod \"ovnkube-node-hdhfq\" (UID: \"c031fad3-56b2-4030-9fb7-11cd3421145d\") " pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.714992 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:37.714973 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:11:37.714992 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:37.714991 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:11:37.715196 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:37.715003 2571 projected.go:194] Error preparing data for projected volume kube-api-access-x2888 for pod openshift-network-diagnostics/network-check-target-wm9fg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:37.715196 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:37.715055 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2db539fc-020c-4e5c-8585-e7acc930a358-kube-api-access-x2888 podName:2db539fc-020c-4e5c-8585-e7acc930a358 nodeName:}" failed. No retries permitted until 2026-04-16 13:11:38.215038515 +0000 UTC m=+3.097394584 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-x2888" (UniqueName: "kubernetes.io/projected/2db539fc-020c-4e5c-8585-e7acc930a358-kube-api-access-x2888") pod "network-check-target-wm9fg" (UID: "2db539fc-020c-4e5c-8585-e7acc930a358") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:37.715628 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.715422 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lnn4\" (UniqueName: \"kubernetes.io/projected/446804f0-271f-479a-89e6-b8b25ec2e701-kube-api-access-7lnn4\") pod \"node-resolver-sprnh\" (UID: \"446804f0-271f-479a-89e6-b8b25ec2e701\") " pod="openshift-dns/node-resolver-sprnh" Apr 16 13:11:37.717585 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.717545 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9bst\" (UniqueName: \"kubernetes.io/projected/8921cd74-6824-4c24-896a-5c649cefc5da-kube-api-access-p9bst\") pod \"multus-additional-cni-plugins-nrklc\" (UID: \"8921cd74-6824-4c24-896a-5c649cefc5da\") " pod="openshift-multus/multus-additional-cni-plugins-nrklc" Apr 16 13:11:37.717666 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.717611 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4h5x\" (UniqueName: \"kubernetes.io/projected/38405315-7b9b-4c43-82bd-042c8486a193-kube-api-access-l4h5x\") pod \"multus-sxgjd\" (UID: \"38405315-7b9b-4c43-82bd-042c8486a193\") " pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.717815 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.717797 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r847q\" (UniqueName: \"kubernetes.io/projected/f57b15af-9441-4822-9c41-048d94ab4c1a-kube-api-access-r847q\") pod \"network-metrics-daemon-h8fnx\" (UID: \"f57b15af-9441-4822-9c41-048d94ab4c1a\") " pod="openshift-multus/network-metrics-daemon-h8fnx" Apr 16 13:11:37.718049 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.718033 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-978zg\" (UniqueName: \"kubernetes.io/projected/5094bf87-3c60-48ea-8c6b-c241aeb55c29-kube-api-access-978zg\") pod \"iptables-alerter-q87wh\" (UID: \"5094bf87-3c60-48ea-8c6b-c241aeb55c29\") " pod="openshift-network-operator/iptables-alerter-q87wh" Apr 16 13:11:37.718139 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.718101 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhxf8\" (UniqueName: \"kubernetes.io/projected/a7298d51-6776-4683-9b5e-23167bbd1794-kube-api-access-bhxf8\") pod \"tuned-qcgw5\" (UID: \"a7298d51-6776-4683-9b5e-23167bbd1794\") " pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.808564 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.808532 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/03556f6f-6803-432c-8012-c40eb6e388ad-serviceca\") pod \"node-ca-njx5f\" (UID: \"03556f6f-6803-432c-8012-c40eb6e388ad\") " pod="openshift-image-registry/node-ca-njx5f" Apr 16 13:11:37.808728 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.808591 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hjphm\" (UniqueName: \"kubernetes.io/projected/03556f6f-6803-432c-8012-c40eb6e388ad-kube-api-access-hjphm\") pod 
\"node-ca-njx5f\" (UID: \"03556f6f-6803-432c-8012-c40eb6e388ad\") " pod="openshift-image-registry/node-ca-njx5f" Apr 16 13:11:37.808728 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.808622 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/03556f6f-6803-432c-8012-c40eb6e388ad-host\") pod \"node-ca-njx5f\" (UID: \"03556f6f-6803-432c-8012-c40eb6e388ad\") " pod="openshift-image-registry/node-ca-njx5f" Apr 16 13:11:37.808728 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.808678 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/03556f6f-6803-432c-8012-c40eb6e388ad-host\") pod \"node-ca-njx5f\" (UID: \"03556f6f-6803-432c-8012-c40eb6e388ad\") " pod="openshift-image-registry/node-ca-njx5f" Apr 16 13:11:37.809013 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.808986 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/03556f6f-6803-432c-8012-c40eb6e388ad-serviceca\") pod \"node-ca-njx5f\" (UID: \"03556f6f-6803-432c-8012-c40eb6e388ad\") " pod="openshift-image-registry/node-ca-njx5f" Apr 16 13:11:37.816179 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.816130 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjphm\" (UniqueName: \"kubernetes.io/projected/03556f6f-6803-432c-8012-c40eb6e388ad-kube-api-access-hjphm\") pod \"node-ca-njx5f\" (UID: \"03556f6f-6803-432c-8012-c40eb6e388ad\") " pod="openshift-image-registry/node-ca-njx5f" Apr 16 13:11:37.891191 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.891160 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-q87wh" Apr 16 13:11:37.896889 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.896858 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-sprnh" Apr 16 13:11:37.906446 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.906426 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:11:37.911185 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.911170 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" Apr 16 13:11:37.915072 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.915055 2571 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:11:37.916984 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.916967 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-sxgjd" Apr 16 13:11:37.922497 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.922483 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nrklc" Apr 16 13:11:37.929024 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.929007 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-tpklv" Apr 16 13:11:37.935540 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.935525 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-njx5f" Apr 16 13:11:37.938610 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:37.938594 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" Apr 16 13:11:38.063179 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:38.063140 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-md8rb"] Apr 16 13:11:38.064999 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:38.064974 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-md8rb" Apr 16 13:11:38.065129 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:38.065048 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-md8rb" podUID="a9da276b-7d39-4d22-8b85-9d91a9a39f32" Apr 16 13:11:38.111318 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:38.111256 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a9da276b-7d39-4d22-8b85-9d91a9a39f32-kubelet-config\") pod \"global-pull-secret-syncer-md8rb\" (UID: \"a9da276b-7d39-4d22-8b85-9d91a9a39f32\") " pod="kube-system/global-pull-secret-syncer-md8rb" Apr 16 13:11:38.111318 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:38.111286 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a9da276b-7d39-4d22-8b85-9d91a9a39f32-dbus\") pod \"global-pull-secret-syncer-md8rb\" (UID: \"a9da276b-7d39-4d22-8b85-9d91a9a39f32\") " pod="kube-system/global-pull-secret-syncer-md8rb" Apr 16 13:11:38.111472 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:38.111324 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a9da276b-7d39-4d22-8b85-9d91a9a39f32-original-pull-secret\") pod \"global-pull-secret-syncer-md8rb\" (UID: \"a9da276b-7d39-4d22-8b85-9d91a9a39f32\") " pod="kube-system/global-pull-secret-syncer-md8rb" Apr 16 13:11:38.212580 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:38.212552 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs\") pod \"network-metrics-daemon-h8fnx\" (UID: \"f57b15af-9441-4822-9c41-048d94ab4c1a\") " pod="openshift-multus/network-metrics-daemon-h8fnx" Apr 16 13:11:38.212739 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:38.212590 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a9da276b-7d39-4d22-8b85-9d91a9a39f32-kubelet-config\") pod \"global-pull-secret-syncer-md8rb\" (UID: \"a9da276b-7d39-4d22-8b85-9d91a9a39f32\") " pod="kube-system/global-pull-secret-syncer-md8rb" Apr 16 13:11:38.212739 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:38.212608 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a9da276b-7d39-4d22-8b85-9d91a9a39f32-dbus\") pod 
\"global-pull-secret-syncer-md8rb\" (UID: \"a9da276b-7d39-4d22-8b85-9d91a9a39f32\") " pod="kube-system/global-pull-secret-syncer-md8rb" Apr 16 13:11:38.212739 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:38.212636 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a9da276b-7d39-4d22-8b85-9d91a9a39f32-original-pull-secret\") pod \"global-pull-secret-syncer-md8rb\" (UID: \"a9da276b-7d39-4d22-8b85-9d91a9a39f32\") " pod="kube-system/global-pull-secret-syncer-md8rb" Apr 16 13:11:38.212739 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:38.212705 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:38.212739 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:38.212731 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:11:38.213064 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:38.212773 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs podName:f57b15af-9441-4822-9c41-048d94ab4c1a nodeName:}" failed. No retries permitted until 2026-04-16 13:11:39.21275175 +0000 UTC m=+4.095107820 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs") pod "network-metrics-daemon-h8fnx" (UID: "f57b15af-9441-4822-9c41-048d94ab4c1a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:38.213064 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:38.212794 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9da276b-7d39-4d22-8b85-9d91a9a39f32-original-pull-secret podName:a9da276b-7d39-4d22-8b85-9d91a9a39f32 nodeName:}" failed. No retries permitted until 2026-04-16 13:11:38.712784158 +0000 UTC m=+3.595140233 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a9da276b-7d39-4d22-8b85-9d91a9a39f32-original-pull-secret") pod "global-pull-secret-syncer-md8rb" (UID: "a9da276b-7d39-4d22-8b85-9d91a9a39f32") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:11:38.213064 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:38.212798 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/a9da276b-7d39-4d22-8b85-9d91a9a39f32-kubelet-config\") pod \"global-pull-secret-syncer-md8rb\" (UID: \"a9da276b-7d39-4d22-8b85-9d91a9a39f32\") " pod="kube-system/global-pull-secret-syncer-md8rb" Apr 16 13:11:38.213064 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:38.212810 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/a9da276b-7d39-4d22-8b85-9d91a9a39f32-dbus\") pod \"global-pull-secret-syncer-md8rb\" (UID: \"a9da276b-7d39-4d22-8b85-9d91a9a39f32\") " pod="kube-system/global-pull-secret-syncer-md8rb" Apr 16 13:11:38.231711 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:38.231688 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8921cd74_6824_4c24_896a_5c649cefc5da.slice/crio-7828ddaecab302810429f6ed4675eba2dad4268b80e87a4d421ff5eccea0f977 WatchSource:0}: Error finding container 7828ddaecab302810429f6ed4675eba2dad4268b80e87a4d421ff5eccea0f977: Status 404 returned error can't find the container with id 7828ddaecab302810429f6ed4675eba2dad4268b80e87a4d421ff5eccea0f977 Apr 16 13:11:38.232731 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:38.232708 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5094bf87_3c60_48ea_8c6b_c241aeb55c29.slice/crio-c0cb9044025982be81687fafb8138c466e7c8192d09c89454a63df600d7f3d5b WatchSource:0}: Error finding container c0cb9044025982be81687fafb8138c466e7c8192d09c89454a63df600d7f3d5b: Status 404 returned error can't find the container with id c0cb9044025982be81687fafb8138c466e7c8192d09c89454a63df600d7f3d5b Apr 16 13:11:38.233642 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:38.233520 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod03556f6f_6803_432c_8012_c40eb6e388ad.slice/crio-9c4b8edaa3e0024ac1d5632c7d9cc2be9ec2af6de5aa18af2dd93afae8867e21 WatchSource:0}: Error finding container 9c4b8edaa3e0024ac1d5632c7d9cc2be9ec2af6de5aa18af2dd93afae8867e21: Status 404 returned error can't find the container with id 9c4b8edaa3e0024ac1d5632c7d9cc2be9ec2af6de5aa18af2dd93afae8867e21 Apr 16 13:11:38.235661 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:38.234623 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc031fad3_56b2_4030_9fb7_11cd3421145d.slice/crio-f5c88aabec9f84455bef8f158e39170bd318db399cc42e5beb9560b9b3d43103 WatchSource:0}: Error finding container f5c88aabec9f84455bef8f158e39170bd318db399cc42e5beb9560b9b3d43103: Status 404 returned error can't find the container with id f5c88aabec9f84455bef8f158e39170bd318db399cc42e5beb9560b9b3d43103 Apr 16 13:11:38.238317 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:38.238296 2571 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7298d51_6776_4683_9b5e_23167bbd1794.slice/crio-812c06a05cbe20f98b4cd09c8ff8a865c438aed3226b8fc677dc3fecf349deea WatchSource:0}: Error finding container 812c06a05cbe20f98b4cd09c8ff8a865c438aed3226b8fc677dc3fecf349deea: Status 404 returned error can't find the container with id 812c06a05cbe20f98b4cd09c8ff8a865c438aed3226b8fc677dc3fecf349deea Apr 16 13:11:38.239656 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:38.239638 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c95c4a9_7823_46cc_a24b_b1acc154ea64.slice/crio-c0e70f72dd3766ff4bf51ec4aab3d429a3e187ee888be79a10db447136221c6a WatchSource:0}: Error finding container c0e70f72dd3766ff4bf51ec4aab3d429a3e187ee888be79a10db447136221c6a: Status 404 returned error can't find the container with id c0e70f72dd3766ff4bf51ec4aab3d429a3e187ee888be79a10db447136221c6a Apr 16 13:11:38.240493 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:38.240473 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod446804f0_271f_479a_89e6_b8b25ec2e701.slice/crio-eda0fbd33f89b10b52e67fa3214d1243226fa394383f22510c347a9e9ac5376d WatchSource:0}: Error finding container eda0fbd33f89b10b52e67fa3214d1243226fa394383f22510c347a9e9ac5376d: Status 404 returned error can't find the container with id eda0fbd33f89b10b52e67fa3214d1243226fa394383f22510c347a9e9ac5376d Apr 16 13:11:38.241203 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:38.241085 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b2f0cf7_6280_412b_b2df_63712a2a8772.slice/crio-91ced51ab5b9368ef8068bde5b3160791fd035a1ecc132a20218abdde44a6769 WatchSource:0}: Error finding container 91ced51ab5b9368ef8068bde5b3160791fd035a1ecc132a20218abdde44a6769: Status 404 returned error can't find the container with id 91ced51ab5b9368ef8068bde5b3160791fd035a1ecc132a20218abdde44a6769 Apr 16 13:11:38.243569 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:11:38.242681 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38405315_7b9b_4c43_82bd_042c8486a193.slice/crio-5fd7c5900028cd4a2249f8d894235413039906954d546e20a3d47b750685e868 WatchSource:0}: Error finding container 5fd7c5900028cd4a2249f8d894235413039906954d546e20a3d47b750685e868: Status 404 returned error can't find the container with id 5fd7c5900028cd4a2249f8d894235413039906954d546e20a3d47b750685e868 Apr 16 13:11:38.313716 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:38.313540 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2888\" (UniqueName: \"kubernetes.io/projected/2db539fc-020c-4e5c-8585-e7acc930a358-kube-api-access-x2888\") pod \"network-check-target-wm9fg\" (UID: \"2db539fc-020c-4e5c-8585-e7acc930a358\") " pod="openshift-network-diagnostics/network-check-target-wm9fg" Apr 16 13:11:38.313716 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:38.313693 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:11:38.313716 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:38.313714 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:11:38.313881 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:38.313726 2571 projected.go:194] Error preparing data for projected volume kube-api-access-x2888 for pod openshift-network-diagnostics/network-check-target-wm9fg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:38.313881 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:38.313779 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2db539fc-020c-4e5c-8585-e7acc930a358-kube-api-access-x2888 podName:2db539fc-020c-4e5c-8585-e7acc930a358 nodeName:}" failed. No retries permitted until 2026-04-16 13:11:39.313760228 +0000 UTC m=+4.196116298 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-x2888" (UniqueName: "kubernetes.io/projected/2db539fc-020c-4e5c-8585-e7acc930a358-kube-api-access-x2888") pod "network-check-target-wm9fg" (UID: "2db539fc-020c-4e5c-8585-e7acc930a358") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:38.634782 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:38.634736 2571 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:06:36 +0000 UTC" deadline="2027-11-01 14:43:34.636407751 +0000 UTC" Apr 16 13:11:38.634782 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:38.634779 2571 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13537h31m56.001637899s" Apr 16 13:11:38.716619 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:38.716013 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a9da276b-7d39-4d22-8b85-9d91a9a39f32-original-pull-secret\") pod \"global-pull-secret-syncer-md8rb\" (UID: \"a9da276b-7d39-4d22-8b85-9d91a9a39f32\") " pod="kube-system/global-pull-secret-syncer-md8rb" Apr 16 13:11:38.716619 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:38.716202 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:11:38.716619 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:38.716264 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9da276b-7d39-4d22-8b85-9d91a9a39f32-original-pull-secret podName:a9da276b-7d39-4d22-8b85-9d91a9a39f32 nodeName:}" failed. No retries permitted until 2026-04-16 13:11:39.716245807 +0000 UTC m=+4.598601880 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a9da276b-7d39-4d22-8b85-9d91a9a39f32-original-pull-secret") pod "global-pull-secret-syncer-md8rb" (UID: "a9da276b-7d39-4d22-8b85-9d91a9a39f32") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:11:38.720611 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:38.720215 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8fnx" Apr 16 13:11:38.720611 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:38.720412 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8fnx" podUID="f57b15af-9441-4822-9c41-048d94ab4c1a" Apr 16 13:11:38.720611 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:38.720503 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wm9fg" Apr 16 13:11:38.720611 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:38.720574 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wm9fg" podUID="2db539fc-020c-4e5c-8585-e7acc930a358" Apr 16 13:11:38.741343 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:38.741289 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sxgjd" event={"ID":"38405315-7b9b-4c43-82bd-042c8486a193","Type":"ContainerStarted","Data":"5fd7c5900028cd4a2249f8d894235413039906954d546e20a3d47b750685e868"} Apr 16 13:11:38.767749 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:38.767710 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tpklv" event={"ID":"1b2f0cf7-6280-412b-b2df-63712a2a8772","Type":"ContainerStarted","Data":"91ced51ab5b9368ef8068bde5b3160791fd035a1ecc132a20218abdde44a6769"} Apr 16 13:11:38.774534 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:38.774484 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" event={"ID":"a7298d51-6776-4683-9b5e-23167bbd1794","Type":"ContainerStarted","Data":"812c06a05cbe20f98b4cd09c8ff8a865c438aed3226b8fc677dc3fecf349deea"} Apr 16 13:11:38.784831 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:38.784779 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" event={"ID":"c031fad3-56b2-4030-9fb7-11cd3421145d","Type":"ContainerStarted","Data":"f5c88aabec9f84455bef8f158e39170bd318db399cc42e5beb9560b9b3d43103"} Apr 16 13:11:38.796005 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:38.795954 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-njx5f" event={"ID":"03556f6f-6803-432c-8012-c40eb6e388ad","Type":"ContainerStarted","Data":"9c4b8edaa3e0024ac1d5632c7d9cc2be9ec2af6de5aa18af2dd93afae8867e21"} Apr 16 13:11:38.801139 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:38.801086 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" event={"ID":"5c95c4a9-7823-46cc-a24b-b1acc154ea64","Type":"ContainerStarted","Data":"c0e70f72dd3766ff4bf51ec4aab3d429a3e187ee888be79a10db447136221c6a"} Apr 16 13:11:38.802989 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:38.802939 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sprnh" 
event={"ID":"446804f0-271f-479a-89e6-b8b25ec2e701","Type":"ContainerStarted","Data":"eda0fbd33f89b10b52e67fa3214d1243226fa394383f22510c347a9e9ac5376d"} Apr 16 13:11:38.807849 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:38.807812 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-q87wh" event={"ID":"5094bf87-3c60-48ea-8c6b-c241aeb55c29","Type":"ContainerStarted","Data":"c0cb9044025982be81687fafb8138c466e7c8192d09c89454a63df600d7f3d5b"} Apr 16 13:11:38.818043 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:38.818019 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrklc" event={"ID":"8921cd74-6824-4c24-896a-5c649cefc5da","Type":"ContainerStarted","Data":"7828ddaecab302810429f6ed4675eba2dad4268b80e87a4d421ff5eccea0f977"} Apr 16 13:11:38.825809 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:38.825787 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-234.ec2.internal" event={"ID":"01d54e665ffc42a9ee67fe77b01ddd10","Type":"ContainerStarted","Data":"cac2868819ec906454766493ff513f8274ce5ef1bf6bc454a31ea5f8f9de8e22"} Apr 16 13:11:39.222236 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:39.221526 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs\") pod \"network-metrics-daemon-h8fnx\" (UID: \"f57b15af-9441-4822-9c41-048d94ab4c1a\") " pod="openshift-multus/network-metrics-daemon-h8fnx" Apr 16 13:11:39.222236 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:39.221675 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:39.222236 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:39.221736 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs podName:f57b15af-9441-4822-9c41-048d94ab4c1a nodeName:}" failed. No retries permitted until 2026-04-16 13:11:41.221716796 +0000 UTC m=+6.104072872 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs") pod "network-metrics-daemon-h8fnx" (UID: "f57b15af-9441-4822-9c41-048d94ab4c1a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:39.322396 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:39.322324 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2888\" (UniqueName: \"kubernetes.io/projected/2db539fc-020c-4e5c-8585-e7acc930a358-kube-api-access-x2888\") pod \"network-check-target-wm9fg\" (UID: \"2db539fc-020c-4e5c-8585-e7acc930a358\") " pod="openshift-network-diagnostics/network-check-target-wm9fg" Apr 16 13:11:39.322542 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:39.322476 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:11:39.322542 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:39.322496 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:11:39.322542 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:39.322509 2571 projected.go:194] Error preparing data for projected volume kube-api-access-x2888 for pod openshift-network-diagnostics/network-check-target-wm9fg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:39.322696 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:39.322566 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2db539fc-020c-4e5c-8585-e7acc930a358-kube-api-access-x2888 podName:2db539fc-020c-4e5c-8585-e7acc930a358 nodeName:}" failed. No retries permitted until 2026-04-16 13:11:41.322548025 +0000 UTC m=+6.204904098 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-x2888" (UniqueName: "kubernetes.io/projected/2db539fc-020c-4e5c-8585-e7acc930a358-kube-api-access-x2888") pod "network-check-target-wm9fg" (UID: "2db539fc-020c-4e5c-8585-e7acc930a358") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:39.722822 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:39.722742 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-md8rb" Apr 16 13:11:39.723252 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:39.722892 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-md8rb" podUID="a9da276b-7d39-4d22-8b85-9d91a9a39f32" Apr 16 13:11:39.725161 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:39.725126 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a9da276b-7d39-4d22-8b85-9d91a9a39f32-original-pull-secret\") pod \"global-pull-secret-syncer-md8rb\" (UID: \"a9da276b-7d39-4d22-8b85-9d91a9a39f32\") " pod="kube-system/global-pull-secret-syncer-md8rb" Apr 16 13:11:39.725289 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:39.725261 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:11:39.725346 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:39.725314 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9da276b-7d39-4d22-8b85-9d91a9a39f32-original-pull-secret podName:a9da276b-7d39-4d22-8b85-9d91a9a39f32 nodeName:}" failed. No retries permitted until 2026-04-16 13:11:41.725296775 +0000 UTC m=+6.607652844 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a9da276b-7d39-4d22-8b85-9d91a9a39f32-original-pull-secret") pod "global-pull-secret-syncer-md8rb" (UID: "a9da276b-7d39-4d22-8b85-9d91a9a39f32") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:11:39.856384 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:39.856348 2571 generic.go:358] "Generic (PLEG): container finished" podID="beda0b0d4b0cc2f07c0f706b0c22c573" containerID="6947d5cda9020559bdd3adf421d8510f91bc90e8436567b939fb2b97181d760f" exitCode=0 Apr 16 13:11:39.857305 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:39.857276 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-234.ec2.internal" event={"ID":"beda0b0d4b0cc2f07c0f706b0c22c573","Type":"ContainerDied","Data":"6947d5cda9020559bdd3adf421d8510f91bc90e8436567b939fb2b97181d760f"} Apr 16 13:11:39.873594 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:39.873534 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-141-234.ec2.internal" podStartSLOduration=3.873517128 podStartE2EDuration="3.873517128s" podCreationTimestamp="2026-04-16 13:11:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:11:38.838986208 +0000 UTC m=+3.721342298" watchObservedRunningTime="2026-04-16 13:11:39.873517128 +0000 UTC m=+4.755873219" Apr 16 13:11:40.720601 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:40.719989 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wm9fg" Apr 16 13:11:40.720601 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:40.720005 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8fnx" Apr 16 13:11:40.720601 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:40.720113 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-wm9fg" podUID="2db539fc-020c-4e5c-8585-e7acc930a358" Apr 16 13:11:40.720601 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:40.720559 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8fnx" podUID="f57b15af-9441-4822-9c41-048d94ab4c1a" Apr 16 13:11:40.868956 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:40.868683 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-234.ec2.internal" event={"ID":"beda0b0d4b0cc2f07c0f706b0c22c573","Type":"ContainerStarted","Data":"8b78de4b2fc4651ceb14422dbdea817d7643db56a202f43053ceb4b01b3d60a3"} Apr 16 13:11:40.882303 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:40.882167 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-141-234.ec2.internal" podStartSLOduration=4.882148973 podStartE2EDuration="4.882148973s" podCreationTimestamp="2026-04-16 13:11:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:11:40.880988341 +0000 UTC m=+5.763344431" watchObservedRunningTime="2026-04-16 13:11:40.882148973 +0000 UTC m=+5.764505064" Apr 16 13:11:41.240325 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:41.240294 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs\") pod \"network-metrics-daemon-h8fnx\" (UID: \"f57b15af-9441-4822-9c41-048d94ab4c1a\") " pod="openshift-multus/network-metrics-daemon-h8fnx" Apr 16 13:11:41.240488 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:41.240421 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:41.240488 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:41.240485 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs podName:f57b15af-9441-4822-9c41-048d94ab4c1a nodeName:}" failed. No retries permitted until 2026-04-16 13:11:45.240465779 +0000 UTC m=+10.122821869 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs") pod "network-metrics-daemon-h8fnx" (UID: "f57b15af-9441-4822-9c41-048d94ab4c1a") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:41.341086 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:41.341029 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2888\" (UniqueName: \"kubernetes.io/projected/2db539fc-020c-4e5c-8585-e7acc930a358-kube-api-access-x2888\") pod \"network-check-target-wm9fg\" (UID: \"2db539fc-020c-4e5c-8585-e7acc930a358\") " pod="openshift-network-diagnostics/network-check-target-wm9fg" Apr 16 13:11:41.344902 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:41.342458 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:11:41.344902 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:41.342510 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:11:41.344902 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:41.342530 2571 projected.go:194] Error preparing data for projected volume kube-api-access-x2888 for pod openshift-network-diagnostics/network-check-target-wm9fg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:41.344902 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:41.342607 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2db539fc-020c-4e5c-8585-e7acc930a358-kube-api-access-x2888 podName:2db539fc-020c-4e5c-8585-e7acc930a358 nodeName:}" failed. No retries permitted until 2026-04-16 13:11:45.342588094 +0000 UTC m=+10.224944177 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-x2888" (UniqueName: "kubernetes.io/projected/2db539fc-020c-4e5c-8585-e7acc930a358-kube-api-access-x2888") pod "network-check-target-wm9fg" (UID: "2db539fc-020c-4e5c-8585-e7acc930a358") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:11:41.720323 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:41.720238 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-md8rb" Apr 16 13:11:41.720481 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:41.720357 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-md8rb" podUID="a9da276b-7d39-4d22-8b85-9d91a9a39f32" Apr 16 13:11:41.744951 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:41.744877 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a9da276b-7d39-4d22-8b85-9d91a9a39f32-original-pull-secret\") pod \"global-pull-secret-syncer-md8rb\" (UID: \"a9da276b-7d39-4d22-8b85-9d91a9a39f32\") " pod="kube-system/global-pull-secret-syncer-md8rb" Apr 16 13:11:41.745087 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:41.745022 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:11:41.745157 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:41.745093 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9da276b-7d39-4d22-8b85-9d91a9a39f32-original-pull-secret podName:a9da276b-7d39-4d22-8b85-9d91a9a39f32 nodeName:}" failed. No retries permitted until 2026-04-16 13:11:45.745074809 +0000 UTC m=+10.627430882 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a9da276b-7d39-4d22-8b85-9d91a9a39f32-original-pull-secret") pod "global-pull-secret-syncer-md8rb" (UID: "a9da276b-7d39-4d22-8b85-9d91a9a39f32") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:11:42.719896 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:42.719718 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wm9fg" Apr 16 13:11:42.719896 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:42.719836 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wm9fg" podUID="2db539fc-020c-4e5c-8585-e7acc930a358" Apr 16 13:11:42.719896 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:42.719862 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8fnx" Apr 16 13:11:42.720432 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:42.719995 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8fnx" podUID="f57b15af-9441-4822-9c41-048d94ab4c1a" Apr 16 13:11:43.720451 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:43.720411 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-md8rb" Apr 16 13:11:43.720912 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:43.720551 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-md8rb" podUID="a9da276b-7d39-4d22-8b85-9d91a9a39f32" Apr 16 13:11:44.720428 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:44.719777 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8fnx" Apr 16 13:11:44.720428 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:44.719930 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8fnx" podUID="f57b15af-9441-4822-9c41-048d94ab4c1a" Apr 16 13:11:44.720428 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:44.720298 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wm9fg" Apr 16 13:11:44.720428 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:44.720383 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wm9fg" podUID="2db539fc-020c-4e5c-8585-e7acc930a358" Apr 16 13:11:45.273828 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:45.273783 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs\") pod \"network-metrics-daemon-h8fnx\" (UID: \"f57b15af-9441-4822-9c41-048d94ab4c1a\") " pod="openshift-multus/network-metrics-daemon-h8fnx" Apr 16 13:11:45.274028 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:45.273959 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:11:45.274028 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:45.274027 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs podName:f57b15af-9441-4822-9c41-048d94ab4c1a nodeName:}" failed. No retries permitted until 2026-04-16 13:11:53.274005618 +0000 UTC m=+18.156361708 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs") pod "network-metrics-daemon-h8fnx" (UID: "f57b15af-9441-4822-9c41-048d94ab4c1a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:11:45.374616 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:45.374580 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2888\" (UniqueName: \"kubernetes.io/projected/2db539fc-020c-4e5c-8585-e7acc930a358-kube-api-access-x2888\") pod \"network-check-target-wm9fg\" (UID: \"2db539fc-020c-4e5c-8585-e7acc930a358\") " pod="openshift-network-diagnostics/network-check-target-wm9fg"
Apr 16 13:11:45.374781 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:45.374713 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:11:45.374781 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:45.374741 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:11:45.374781 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:45.374753 2571 projected.go:194] Error preparing data for projected volume kube-api-access-x2888 for pod openshift-network-diagnostics/network-check-target-wm9fg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:11:45.374976 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:45.374824 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2db539fc-020c-4e5c-8585-e7acc930a358-kube-api-access-x2888 podName:2db539fc-020c-4e5c-8585-e7acc930a358 nodeName:}" failed. No retries permitted until 2026-04-16 13:11:53.374803733 +0000 UTC m=+18.257159820 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-x2888" (UniqueName: "kubernetes.io/projected/2db539fc-020c-4e5c-8585-e7acc930a358-kube-api-access-x2888") pod "network-check-target-wm9fg" (UID: "2db539fc-020c-4e5c-8585-e7acc930a358") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:11:45.721242 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:45.721165 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-md8rb"
Apr 16 13:11:45.721654 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:45.721277 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-md8rb" podUID="a9da276b-7d39-4d22-8b85-9d91a9a39f32"
Apr 16 13:11:45.777619 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:45.777580 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a9da276b-7d39-4d22-8b85-9d91a9a39f32-original-pull-secret\") pod \"global-pull-secret-syncer-md8rb\" (UID: \"a9da276b-7d39-4d22-8b85-9d91a9a39f32\") " pod="kube-system/global-pull-secret-syncer-md8rb"
Apr 16 13:11:45.777788 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:45.777766 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:11:45.777856 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:45.777842 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9da276b-7d39-4d22-8b85-9d91a9a39f32-original-pull-secret podName:a9da276b-7d39-4d22-8b85-9d91a9a39f32 nodeName:}" failed. No retries permitted until 2026-04-16 13:11:53.777821842 +0000 UTC m=+18.660177915 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a9da276b-7d39-4d22-8b85-9d91a9a39f32-original-pull-secret") pod "global-pull-secret-syncer-md8rb" (UID: "a9da276b-7d39-4d22-8b85-9d91a9a39f32") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:11:46.719996 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:46.719965 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8fnx"
Apr 16 13:11:46.719996 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:46.719993 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wm9fg"
Apr 16 13:11:46.720313 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:46.720102 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8fnx" podUID="f57b15af-9441-4822-9c41-048d94ab4c1a"
Apr 16 13:11:46.720313 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:46.720208 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wm9fg" podUID="2db539fc-020c-4e5c-8585-e7acc930a358"
Apr 16 13:11:47.719846 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:47.719807 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-md8rb"
Apr 16 13:11:47.720275 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:47.719954 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-md8rb" podUID="a9da276b-7d39-4d22-8b85-9d91a9a39f32"
Apr 16 13:11:48.719660 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:48.719633 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wm9fg"
Apr 16 13:11:48.719660 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:48.719647 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8fnx"
Apr 16 13:11:48.719910 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:48.719750 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wm9fg" podUID="2db539fc-020c-4e5c-8585-e7acc930a358"
Apr 16 13:11:48.719910 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:48.719885 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8fnx" podUID="f57b15af-9441-4822-9c41-048d94ab4c1a"
Apr 16 13:11:49.719826 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:49.719793 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-md8rb"
Apr 16 13:11:49.720049 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:49.719938 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-md8rb" podUID="a9da276b-7d39-4d22-8b85-9d91a9a39f32"
Apr 16 13:11:50.720277 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:50.720243 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wm9fg"
Apr 16 13:11:50.720716 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:50.720249 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8fnx"
Apr 16 13:11:50.720716 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:50.720364 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wm9fg" podUID="2db539fc-020c-4e5c-8585-e7acc930a358"
Apr 16 13:11:50.720716 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:50.720461 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8fnx" podUID="f57b15af-9441-4822-9c41-048d94ab4c1a"
Apr 16 13:11:51.719830 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:51.719801 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-md8rb"
Apr 16 13:11:51.720015 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:51.719928 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-md8rb" podUID="a9da276b-7d39-4d22-8b85-9d91a9a39f32"
Apr 16 13:11:52.719736 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:52.719700 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wm9fg"
Apr 16 13:11:52.720273 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:52.719700 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8fnx"
Apr 16 13:11:52.720273 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:52.719811 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wm9fg" podUID="2db539fc-020c-4e5c-8585-e7acc930a358"
Apr 16 13:11:52.720273 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:52.719916 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8fnx" podUID="f57b15af-9441-4822-9c41-048d94ab4c1a"
Apr 16 13:11:53.340667 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:53.340634 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs\") pod \"network-metrics-daemon-h8fnx\" (UID: \"f57b15af-9441-4822-9c41-048d94ab4c1a\") " pod="openshift-multus/network-metrics-daemon-h8fnx"
Apr 16 13:11:53.340931 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:53.340780 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:11:53.340931 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:53.340852 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs podName:f57b15af-9441-4822-9c41-048d94ab4c1a nodeName:}" failed. No retries permitted until 2026-04-16 13:12:09.340829331 +0000 UTC m=+34.223185403 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs") pod "network-metrics-daemon-h8fnx" (UID: "f57b15af-9441-4822-9c41-048d94ab4c1a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:11:53.441643 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:53.441610 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2888\" (UniqueName: \"kubernetes.io/projected/2db539fc-020c-4e5c-8585-e7acc930a358-kube-api-access-x2888\") pod \"network-check-target-wm9fg\" (UID: \"2db539fc-020c-4e5c-8585-e7acc930a358\") " pod="openshift-network-diagnostics/network-check-target-wm9fg"
Apr 16 13:11:53.441814 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:53.441789 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:11:53.441904 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:53.441814 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:11:53.441904 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:53.441826 2571 projected.go:194] Error preparing data for projected volume kube-api-access-x2888 for pod openshift-network-diagnostics/network-check-target-wm9fg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:11:53.442010 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:53.441905 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2db539fc-020c-4e5c-8585-e7acc930a358-kube-api-access-x2888 podName:2db539fc-020c-4e5c-8585-e7acc930a358 nodeName:}" failed. No retries permitted until 2026-04-16 13:12:09.441884747 +0000 UTC m=+34.324240816 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-x2888" (UniqueName: "kubernetes.io/projected/2db539fc-020c-4e5c-8585-e7acc930a358-kube-api-access-x2888") pod "network-check-target-wm9fg" (UID: "2db539fc-020c-4e5c-8585-e7acc930a358") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:11:53.720048 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:53.719971 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-md8rb"
Apr 16 13:11:53.720467 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:53.720089 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-md8rb" podUID="a9da276b-7d39-4d22-8b85-9d91a9a39f32"
Apr 16 13:11:53.845163 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:53.845129 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a9da276b-7d39-4d22-8b85-9d91a9a39f32-original-pull-secret\") pod \"global-pull-secret-syncer-md8rb\" (UID: \"a9da276b-7d39-4d22-8b85-9d91a9a39f32\") " pod="kube-system/global-pull-secret-syncer-md8rb"
Apr 16 13:11:53.845319 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:53.845238 2571 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 13:11:53.845319 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:53.845291 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9da276b-7d39-4d22-8b85-9d91a9a39f32-original-pull-secret podName:a9da276b-7d39-4d22-8b85-9d91a9a39f32 nodeName:}" failed. No retries permitted until 2026-04-16 13:12:09.845274839 +0000 UTC m=+34.727630921 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/a9da276b-7d39-4d22-8b85-9d91a9a39f32-original-pull-secret") pod "global-pull-secret-syncer-md8rb" (UID: "a9da276b-7d39-4d22-8b85-9d91a9a39f32") : object "kube-system"/"original-pull-secret" not registered
Apr 16 13:11:54.719761 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:54.719732 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wm9fg"
Apr 16 13:11:54.719919 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:54.719734 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8fnx"
Apr 16 13:11:54.719919 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:54.719823 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wm9fg" podUID="2db539fc-020c-4e5c-8585-e7acc930a358"
Apr 16 13:11:54.719996 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:54.719938 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8fnx" podUID="f57b15af-9441-4822-9c41-048d94ab4c1a"
Apr 16 13:11:55.720613 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:55.720369 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-md8rb"
Apr 16 13:11:55.721268 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:55.720688 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-md8rb" podUID="a9da276b-7d39-4d22-8b85-9d91a9a39f32"
Apr 16 13:11:55.893287 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:55.893186 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-tpklv" event={"ID":"1b2f0cf7-6280-412b-b2df-63712a2a8772","Type":"ContainerStarted","Data":"2588b462d1ec2d2c8b90a79fba883130482db79b13856fdc4ac7e42ea5b47b17"}
Apr 16 13:11:55.894620 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:55.894590 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" event={"ID":"a7298d51-6776-4683-9b5e-23167bbd1794","Type":"ContainerStarted","Data":"43da83c8224d976db026b7dd28e517797aa1cb5b23fc82a53424a4915b84ba06"}
Apr 16 13:11:55.897462 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:55.897431 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hdhfq_c031fad3-56b2-4030-9fb7-11cd3421145d/ovn-acl-logging/0.log"
Apr 16 13:11:55.897976 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:55.897951 2571 generic.go:358] "Generic (PLEG): container finished" podID="c031fad3-56b2-4030-9fb7-11cd3421145d" containerID="5a94b0548451bd976e4a7d7e5d883e9efd036304af02307ef8c5088dd4016ee1" exitCode=1
Apr 16 13:11:55.898065 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:55.898013 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" event={"ID":"c031fad3-56b2-4030-9fb7-11cd3421145d","Type":"ContainerStarted","Data":"495ec4578e01731bdb21c59c1127a05d808867fa8287941d18628a6ceaba1a7c"}
Apr 16 13:11:55.898065 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:55.898035 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" event={"ID":"c031fad3-56b2-4030-9fb7-11cd3421145d","Type":"ContainerStarted","Data":"1872b248623db001643adf507c6521b942ab72143843514c19fb035fbcde28ba"}
Apr 16 13:11:55.898065 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:55.898048 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" event={"ID":"c031fad3-56b2-4030-9fb7-11cd3421145d","Type":"ContainerStarted","Data":"d535b659c9075e7297809a678efdeafdc697879f851731d7e7f5c349578108c3"}
Apr 16 13:11:55.898065 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:55.898060 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" event={"ID":"c031fad3-56b2-4030-9fb7-11cd3421145d","Type":"ContainerStarted","Data":"2aa6fe066657d52c4d1fcfc1c2080531a4d1196a057f9a24faa5f3ad88fa7a54"}
Apr 16 13:11:55.898248 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:55.898074 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" event={"ID":"c031fad3-56b2-4030-9fb7-11cd3421145d","Type":"ContainerDied","Data":"5a94b0548451bd976e4a7d7e5d883e9efd036304af02307ef8c5088dd4016ee1"}
Apr 16 13:11:55.898248 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:55.898088 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" event={"ID":"c031fad3-56b2-4030-9fb7-11cd3421145d","Type":"ContainerStarted","Data":"6c16237fbc8aec48390b702da4197b7ca6d0eee33483830119bd385e50a64b67"}
Apr 16 13:11:55.900171 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:55.900142 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-njx5f" event={"ID":"03556f6f-6803-432c-8012-c40eb6e388ad","Type":"ContainerStarted","Data":"5a8566abb66b57df8320c250e51c64238b3bd0424e3ccea002fc5e1d7f3b776f"}
Apr 16 13:11:55.901505 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:55.901484 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" event={"ID":"5c95c4a9-7823-46cc-a24b-b1acc154ea64","Type":"ContainerStarted","Data":"9c72406a8c4f8e1675284c70bf16a990b3f88f617652654c15ab5c8d2705f90a"}
Apr 16 13:11:55.902743 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:55.902711 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-sprnh" event={"ID":"446804f0-271f-479a-89e6-b8b25ec2e701","Type":"ContainerStarted","Data":"b2c5ae1e8f3684b3fcae71bfb1139b5aa6cb0f7cde9ecfc57cd70eb958eea88b"}
Apr 16 13:11:55.904261 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:55.904234 2571 generic.go:358] "Generic (PLEG): container finished" podID="8921cd74-6824-4c24-896a-5c649cefc5da" containerID="264dcd1e9fce797a5314f2f41bbfe62a2dbb416864a1e33e3c7b7948875688b2" exitCode=0
Apr 16 13:11:55.904342 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:55.904262 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrklc" event={"ID":"8921cd74-6824-4c24-896a-5c649cefc5da","Type":"ContainerDied","Data":"264dcd1e9fce797a5314f2f41bbfe62a2dbb416864a1e33e3c7b7948875688b2"}
Apr 16 13:11:55.906018 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:55.905996 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-sxgjd" event={"ID":"38405315-7b9b-4c43-82bd-042c8486a193","Type":"ContainerStarted","Data":"607ec196d5ed154cacc493477ab34f788b5f4a1e72a0e815a6d04f7c21c1ed22"}
Apr 16 13:11:55.924336 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:55.924294 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-tpklv" podStartSLOduration=4.116298548 podStartE2EDuration="20.924282679s" podCreationTimestamp="2026-04-16 13:11:35 +0000 UTC" firstStartedPulling="2026-04-16 13:11:38.243694306 +0000 UTC m=+3.126050380" lastFinishedPulling="2026-04-16 13:11:55.051678428 +0000 UTC m=+19.934034511" observedRunningTime="2026-04-16 13:11:55.907323845 +0000 UTC m=+20.789679936" watchObservedRunningTime="2026-04-16 13:11:55.924282679 +0000 UTC m=+20.806638765"
Apr 16 13:11:55.924912 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:55.924880 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-sxgjd" podStartSLOduration=4.107428518 podStartE2EDuration="20.924855965s" podCreationTimestamp="2026-04-16 13:11:35 +0000 UTC" firstStartedPulling="2026-04-16 13:11:38.244146651 +0000 UTC m=+3.126502725" lastFinishedPulling="2026-04-16 13:11:55.061574089 +0000 UTC m=+19.943930172" observedRunningTime="2026-04-16 13:11:55.924232412 +0000 UTC m=+20.806588503" watchObservedRunningTime="2026-04-16 13:11:55.924855965 +0000 UTC m=+20.807212074"
Apr 16 13:11:55.939950 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:55.939906 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-qcgw5" podStartSLOduration=4.126686287 podStartE2EDuration="20.939893497s" podCreationTimestamp="2026-04-16 13:11:35 +0000 UTC" firstStartedPulling="2026-04-16 13:11:38.240711621 +0000 UTC m=+3.123067690" lastFinishedPulling="2026-04-16 13:11:55.053918817 +0000 UTC m=+19.936274900" observedRunningTime="2026-04-16 13:11:55.939638527 +0000 UTC m=+20.821994618" watchObservedRunningTime="2026-04-16 13:11:55.939893497 +0000 UTC m=+20.822249586"
Apr 16 13:11:55.953697 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:55.953651 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-sprnh" podStartSLOduration=4.172154801 podStartE2EDuration="20.953635274s" podCreationTimestamp="2026-04-16 13:11:35 +0000 UTC" firstStartedPulling="2026-04-16 13:11:38.242229886 +0000 UTC m=+3.124585968" lastFinishedPulling="2026-04-16 13:11:55.023710356 +0000 UTC m=+19.906066441" observedRunningTime="2026-04-16 13:11:55.95291966 +0000 UTC m=+20.835275751" watchObservedRunningTime="2026-04-16 13:11:55.953635274 +0000 UTC m=+20.835991367"
Apr 16 13:11:55.986585 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:55.986539 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-njx5f" podStartSLOduration=12.153850653 podStartE2EDuration="20.986522581s" podCreationTimestamp="2026-04-16 13:11:35 +0000 UTC" firstStartedPulling="2026-04-16 13:11:38.237121134 +0000 UTC m=+3.119477202" lastFinishedPulling="2026-04-16 13:11:47.069793059 +0000 UTC m=+11.952149130" observedRunningTime="2026-04-16 13:11:55.986251587 +0000 UTC m=+20.868607682" watchObservedRunningTime="2026-04-16 13:11:55.986522581 +0000 UTC m=+20.868878672"
Apr 16 13:11:56.121043 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:56.121012 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-tpklv"
Apr 16 13:11:56.121635 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:56.121610 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-tpklv"
Apr 16 13:11:56.714007 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:56.713980 2571 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 16 13:11:56.720079 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:56.720061 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wm9fg"
Apr 16 13:11:56.720175 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:56.720065 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8fnx"
Apr 16 13:11:56.720175 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:56.720164 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wm9fg" podUID="2db539fc-020c-4e5c-8585-e7acc930a358"
Apr 16 13:11:56.720294 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:56.720273 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8fnx" podUID="f57b15af-9441-4822-9c41-048d94ab4c1a"
Apr 16 13:11:56.912609 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:56.912527 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" event={"ID":"5c95c4a9-7823-46cc-a24b-b1acc154ea64","Type":"ContainerStarted","Data":"9a5908901933bfa5e783c92b114e8bbfeaf648574cd88fd90207949663ff0bf2"}
Apr 16 13:11:56.914026 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:56.913996 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-q87wh" event={"ID":"5094bf87-3c60-48ea-8c6b-c241aeb55c29","Type":"ContainerStarted","Data":"fc4938c506b94a45a0867b3db3552ee54d3f61aef9b43fe7271b4ea8d744c0eb"}
Apr 16 13:11:56.914294 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:56.914274 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-tpklv"
Apr 16 13:11:56.914742 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:56.914725 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-tpklv"
Apr 16 13:11:56.930320 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:56.930275 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-q87wh" podStartSLOduration=5.142295502 podStartE2EDuration="21.930261996s" podCreationTimestamp="2026-04-16 13:11:35 +0000 UTC" firstStartedPulling="2026-04-16 13:11:38.235741776 +0000 UTC m=+3.118097858" lastFinishedPulling="2026-04-16 13:11:55.023708279 +0000 UTC m=+19.906064352" observedRunningTime="2026-04-16 13:11:56.929725487 +0000 UTC m=+21.812081574" watchObservedRunningTime="2026-04-16 13:11:56.930261996 +0000 UTC m=+21.812618088"
Apr 16 13:11:57.660800 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:57.660714 2571 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T13:11:56.714003251Z","UUID":"e3321746-6979-4f00-b502-f89ed9dab9b3","Handler":null,"Name":"","Endpoint":""}
Apr 16 13:11:57.664005 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:57.663980 2571 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 16 13:11:57.664005 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:57.664009 2571 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 16 13:11:57.720580 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:57.720551 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-md8rb"
Apr 16 13:11:57.720713 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:57.720674 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-md8rb" podUID="a9da276b-7d39-4d22-8b85-9d91a9a39f32"
Apr 16 13:11:58.720415 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:58.720382 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wm9fg"
Apr 16 13:11:58.720839 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:58.720496 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wm9fg" podUID="2db539fc-020c-4e5c-8585-e7acc930a358"
Apr 16 13:11:58.720839 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:58.720575 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8fnx"
Apr 16 13:11:58.720839 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:58.720700 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8fnx" podUID="f57b15af-9441-4822-9c41-048d94ab4c1a"
Apr 16 13:11:58.920697 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:58.920672 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hdhfq_c031fad3-56b2-4030-9fb7-11cd3421145d/ovn-acl-logging/0.log"
Apr 16 13:11:58.921133 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:58.921104 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" event={"ID":"c031fad3-56b2-4030-9fb7-11cd3421145d","Type":"ContainerStarted","Data":"47fd091125cfdafe1e0b48d318b328e3df7d0152c1b3e11363684243af40e930"}
Apr 16 13:11:58.923117 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:58.923094 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" event={"ID":"5c95c4a9-7823-46cc-a24b-b1acc154ea64","Type":"ContainerStarted","Data":"f8cef23516f7e534503548ae50fa7080cc4b34fbf7691e834bd410a7f17ba3c8"}
Apr 16 13:11:58.939576 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:58.939535 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-922jk" podStartSLOduration=4.216953513 podStartE2EDuration="23.939524s" podCreationTimestamp="2026-04-16 13:11:35 +0000 UTC" firstStartedPulling="2026-04-16 13:11:38.242244848 +0000 UTC m=+3.124600928" lastFinishedPulling="2026-04-16 13:11:57.964815335 +0000 UTC m=+22.847171415" observedRunningTime="2026-04-16 13:11:58.939272177 +0000 UTC m=+23.821628270" watchObservedRunningTime="2026-04-16 13:11:58.939524 +0000 UTC m=+23.821880087"
Apr 16 13:11:59.720671 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:11:59.720484 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-md8rb"
Apr 16 13:11:59.721114 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:11:59.720740 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-md8rb" podUID="a9da276b-7d39-4d22-8b85-9d91a9a39f32"
Apr 16 13:12:00.720518 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:00.720451 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8fnx"
Apr 16 13:12:00.720651 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:00.720451 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wm9fg"
Apr 16 13:12:00.720651 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:00.720553 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8fnx" podUID="f57b15af-9441-4822-9c41-048d94ab4c1a"
Apr 16 13:12:00.720651 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:00.720608 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wm9fg" podUID="2db539fc-020c-4e5c-8585-e7acc930a358"
Apr 16 13:12:00.928476 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:00.928449 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hdhfq_c031fad3-56b2-4030-9fb7-11cd3421145d/ovn-acl-logging/0.log"
Apr 16 13:12:00.928780 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:00.928757 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" event={"ID":"c031fad3-56b2-4030-9fb7-11cd3421145d","Type":"ContainerStarted","Data":"ae8fd07a6f6e8a9febc398d39d5d4d79a07e2fc2371861c701914e16589d0df2"}
Apr 16 13:12:00.929108 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:00.929080 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq"
Apr 16 13:12:00.929108 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:00.929113 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq"
Apr 16 13:12:00.929233 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:00.929205 2571 scope.go:117] "RemoveContainer" containerID="5a94b0548451bd976e4a7d7e5d883e9efd036304af02307ef8c5088dd4016ee1"
Apr 16 13:12:00.930914 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:00.930889 2571 generic.go:358] "Generic (PLEG): container finished" podID="8921cd74-6824-4c24-896a-5c649cefc5da" containerID="387cc939615c48991449b2d42d27a315bd8911ce0fb766b900002254c712e3f7" exitCode=0
Apr 16 13:12:00.930999 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:00.930926 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrklc" event={"ID":"8921cd74-6824-4c24-896a-5c649cefc5da","Type":"ContainerDied","Data":"387cc939615c48991449b2d42d27a315bd8911ce0fb766b900002254c712e3f7"}
Apr 16 13:12:00.944285 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:00.944109 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq"
Apr 16 13:12:01.720180 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:01.720153 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-md8rb"
Apr 16 13:12:01.720313 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:01.720245 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-md8rb" podUID="a9da276b-7d39-4d22-8b85-9d91a9a39f32"
Apr 16 13:12:01.936262 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:01.936232 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hdhfq_c031fad3-56b2-4030-9fb7-11cd3421145d/ovn-acl-logging/0.log"
Apr 16 13:12:01.936657 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:01.936564 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" event={"ID":"c031fad3-56b2-4030-9fb7-11cd3421145d","Type":"ContainerStarted","Data":"ffcc6069d9a8acf5ab256a057dd00a4af39e8dc81edeb6955a52fc62cd9049b9"}
Apr 16 13:12:01.936883 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:01.936850 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq"
Apr 16 13:12:01.950490 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:01.950472 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq"
Apr 16 13:12:01.961606 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:01.961568 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" podStartSLOduration=10.089419188 podStartE2EDuration="26.961557623s" podCreationTimestamp="2026-04-16 13:11:35 +0000 UTC" firstStartedPulling="2026-04-16 13:11:38.237742038 +0000 UTC m=+3.120098107" lastFinishedPulling="2026-04-16 13:11:55.109880468 +0000 UTC m=+19.992236542" observedRunningTime="2026-04-16 13:12:01.960121545 +0000 UTC m=+26.842477659" watchObservedRunningTime="2026-04-16 13:12:01.961557623 +0000 UTC m=+26.843913712"
Apr 16 13:12:02.273557 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:02.273532 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-md8rb"]
Apr 16 13:12:02.273665 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:02.273624 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-md8rb"
Apr 16 13:12:02.273731 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:02.273713 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-md8rb" podUID="a9da276b-7d39-4d22-8b85-9d91a9a39f32"
Apr 16 13:12:02.276882 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:02.276842 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-h8fnx"]
Apr 16 13:12:02.276996 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:02.276977 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8fnx"
Apr 16 13:12:02.277110 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:02.277088 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8fnx" podUID="f57b15af-9441-4822-9c41-048d94ab4c1a"
Apr 16 13:12:02.277545 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:02.277526 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-wm9fg"]
Apr 16 13:12:02.277636 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:02.277622 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wm9fg"
Apr 16 13:12:02.277715 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:02.277696 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wm9fg" podUID="2db539fc-020c-4e5c-8585-e7acc930a358"
Apr 16 13:12:02.940488 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:02.940454 2571 generic.go:358] "Generic (PLEG): container finished" podID="8921cd74-6824-4c24-896a-5c649cefc5da" containerID="b738cc1ca144a2401e62815624cc0cd1e8f052b66822b87eaefdb85ad534fc7f" exitCode=0
Apr 16 13:12:02.940898 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:02.940535 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrklc" event={"ID":"8921cd74-6824-4c24-896a-5c649cefc5da","Type":"ContainerDied","Data":"b738cc1ca144a2401e62815624cc0cd1e8f052b66822b87eaefdb85ad534fc7f"}
Apr 16 13:12:03.719842 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:03.719814 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-md8rb"
Apr 16 13:12:03.719994 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:03.719814 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wm9fg"
Apr 16 13:12:03.719994 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:03.719932 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-md8rb" podUID="a9da276b-7d39-4d22-8b85-9d91a9a39f32"
Apr 16 13:12:03.720058 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:03.719814 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8fnx"
Apr 16 13:12:03.720058 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:03.720007 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wm9fg" podUID="2db539fc-020c-4e5c-8585-e7acc930a358"
Apr 16 13:12:03.720128 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:03.720076 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8fnx" podUID="f57b15af-9441-4822-9c41-048d94ab4c1a"
Apr 16 13:12:04.946570 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:04.946536 2571 generic.go:358] "Generic (PLEG): container finished" podID="8921cd74-6824-4c24-896a-5c649cefc5da" containerID="e62e17b3f959295c6663f44648f576724bbaf904d34f96769f44e624ae169965" exitCode=0
Apr 16 13:12:04.947017 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:04.946605 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrklc" event={"ID":"8921cd74-6824-4c24-896a-5c649cefc5da","Type":"ContainerDied","Data":"e62e17b3f959295c6663f44648f576724bbaf904d34f96769f44e624ae169965"}
Apr 16 13:12:05.721324 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:05.721143 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wm9fg"
Apr 16 13:12:05.721490 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:05.721200 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-md8rb"
Apr 16 13:12:05.721490 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:05.721419 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wm9fg" podUID="2db539fc-020c-4e5c-8585-e7acc930a358"
Apr 16 13:12:05.721611 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:05.721506 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-md8rb" podUID="a9da276b-7d39-4d22-8b85-9d91a9a39f32"
Apr 16 13:12:05.721611 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:05.721221 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8fnx"
Apr 16 13:12:05.721706 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:05.721628 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8fnx" podUID="f57b15af-9441-4822-9c41-048d94ab4c1a"
Apr 16 13:12:07.719607 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:07.719573 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wm9fg"
Apr 16 13:12:07.720070 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:07.719573 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8fnx"
Apr 16 13:12:07.720070 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:07.719701 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-wm9fg" podUID="2db539fc-020c-4e5c-8585-e7acc930a358"
Apr 16 13:12:07.720070 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:07.719724 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-md8rb"
Apr 16 13:12:07.720070 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:07.719821 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-h8fnx" podUID="f57b15af-9441-4822-9c41-048d94ab4c1a"
Apr 16 13:12:07.720070 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:07.719920 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-md8rb" podUID="a9da276b-7d39-4d22-8b85-9d91a9a39f32"
Apr 16 13:12:07.896358 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:07.896331 2571 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-141-234.ec2.internal" event="NodeReady"
Apr 16 13:12:07.896498 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:07.896462 2571 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 16 13:12:07.936013 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:07.935989 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-pcp5l"]
Apr 16 13:12:07.960465 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:07.960440 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-rxwhc"]
Apr 16 13:12:07.960642 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:07.960616 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pcp5l"
Apr 16 13:12:07.962914 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:07.962893 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xkkx5\""
Apr 16 13:12:07.963036 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:07.962922 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 16 13:12:07.963036 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:07.962942 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 16 13:12:07.976577 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:07.976516 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pcp5l"]
Apr 16 13:12:07.976577 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:07.976545 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rxwhc"]
Apr 16 13:12:07.976713 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:07.976629 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rxwhc"
Apr 16 13:12:07.978726 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:07.978706 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-vrfd8\""
Apr 16 13:12:07.978833 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:07.978792 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 16 13:12:07.978933 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:07.978914 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 16 13:12:07.979000 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:07.978935 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 16 13:12:08.058396 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:08.058365 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-tmp-dir\") pod \"dns-default-pcp5l\" (UID: \"ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc\") " pod="openshift-dns/dns-default-pcp5l"
Apr 16 13:12:08.058396 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:08.058401 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw4bb\" (UniqueName: \"kubernetes.io/projected/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-kube-api-access-vw4bb\") pod \"dns-default-pcp5l\" (UID: \"ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc\") " pod="openshift-dns/dns-default-pcp5l"
Apr 16 13:12:08.058602 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:08.058419 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-metrics-tls\") pod \"dns-default-pcp5l\" (UID: \"ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc\") " pod="openshift-dns/dns-default-pcp5l"
Apr 16 13:12:08.058602 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:08.058449 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-config-volume\") pod \"dns-default-pcp5l\" (UID: \"ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc\") " pod="openshift-dns/dns-default-pcp5l"
Apr 16 13:12:08.159079 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:08.159047 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vt8kq\" (UniqueName: \"kubernetes.io/projected/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-kube-api-access-vt8kq\") pod \"ingress-canary-rxwhc\" (UID: \"e3e13a4d-74db-43e1-b7d8-cddf502adb4c\") " pod="openshift-ingress-canary/ingress-canary-rxwhc"
Apr 16 13:12:08.159238 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:08.159153 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-tmp-dir\") pod \"dns-default-pcp5l\" (UID: \"ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc\") " pod="openshift-dns/dns-default-pcp5l"
Apr 16 13:12:08.159238 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:08.159181 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vw4bb\" (UniqueName: \"kubernetes.io/projected/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-kube-api-access-vw4bb\") pod \"dns-default-pcp5l\" (UID: \"ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc\") " pod="openshift-dns/dns-default-pcp5l"
Apr 16 13:12:08.159238 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:08.159208 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-cert\") pod \"ingress-canary-rxwhc\" (UID: \"e3e13a4d-74db-43e1-b7d8-cddf502adb4c\") " pod="openshift-ingress-canary/ingress-canary-rxwhc"
Apr 16 13:12:08.159238 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:08.159231 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-metrics-tls\") pod \"dns-default-pcp5l\" (UID: \"ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc\") " pod="openshift-dns/dns-default-pcp5l"
Apr 16 13:12:08.159440 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:08.159339 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:12:08.159440 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:08.159349 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-config-volume\") pod \"dns-default-pcp5l\" (UID: \"ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc\") " pod="openshift-dns/dns-default-pcp5l"
Apr 16 13:12:08.159440 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:08.159396 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-metrics-tls podName:ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc nodeName:}" failed. No retries permitted until 2026-04-16 13:12:08.659378013 +0000 UTC m=+33.541734096 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-metrics-tls") pod "dns-default-pcp5l" (UID: "ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc") : secret "dns-default-metrics-tls" not found
Apr 16 13:12:08.171482 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:08.171444 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-tmp-dir\") pod \"dns-default-pcp5l\" (UID: \"ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc\") " pod="openshift-dns/dns-default-pcp5l"
Apr 16 13:12:08.171671 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:08.171648 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-config-volume\") pod \"dns-default-pcp5l\" (UID: \"ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc\") " pod="openshift-dns/dns-default-pcp5l"
Apr 16 13:12:08.171854 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:08.171826 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw4bb\" (UniqueName: \"kubernetes.io/projected/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-kube-api-access-vw4bb\") pod \"dns-default-pcp5l\" (UID: \"ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc\") " pod="openshift-dns/dns-default-pcp5l"
Apr 16 13:12:08.260092 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:08.260028 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-cert\") pod \"ingress-canary-rxwhc\" (UID: \"e3e13a4d-74db-43e1-b7d8-cddf502adb4c\") " pod="openshift-ingress-canary/ingress-canary-rxwhc"
Apr 16 13:12:08.260092 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:08.260076 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vt8kq\" (UniqueName: \"kubernetes.io/projected/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-kube-api-access-vt8kq\") pod \"ingress-canary-rxwhc\" (UID: \"e3e13a4d-74db-43e1-b7d8-cddf502adb4c\") " pod="openshift-ingress-canary/ingress-canary-rxwhc"
Apr 16 13:12:08.260227 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:08.260165 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:12:08.260227 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:08.260218 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-cert podName:e3e13a4d-74db-43e1-b7d8-cddf502adb4c nodeName:}" failed. No retries permitted until 2026-04-16 13:12:08.760203935 +0000 UTC m=+33.642560003 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-cert") pod "ingress-canary-rxwhc" (UID: "e3e13a4d-74db-43e1-b7d8-cddf502adb4c") : secret "canary-serving-cert" not found
Apr 16 13:12:08.268415 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:08.268389 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vt8kq\" (UniqueName: \"kubernetes.io/projected/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-kube-api-access-vt8kq\") pod \"ingress-canary-rxwhc\" (UID: \"e3e13a4d-74db-43e1-b7d8-cddf502adb4c\") " pod="openshift-ingress-canary/ingress-canary-rxwhc"
Apr 16 13:12:08.662924 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:08.662843 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-metrics-tls\") pod \"dns-default-pcp5l\" (UID: \"ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc\") " pod="openshift-dns/dns-default-pcp5l"
Apr 16 13:12:08.663071 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:08.663002 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 13:12:08.663115 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:08.663070 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-metrics-tls podName:ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc nodeName:}" failed. No retries permitted until 2026-04-16 13:12:09.66305102 +0000 UTC m=+34.545407113 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-metrics-tls") pod "dns-default-pcp5l" (UID: "ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc") : secret "dns-default-metrics-tls" not found
Apr 16 13:12:08.763355 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:08.763314 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-cert\") pod \"ingress-canary-rxwhc\" (UID: \"e3e13a4d-74db-43e1-b7d8-cddf502adb4c\") " pod="openshift-ingress-canary/ingress-canary-rxwhc"
Apr 16 13:12:08.764009 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:08.763484 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 13:12:08.764009 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:08.763556 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-cert podName:e3e13a4d-74db-43e1-b7d8-cddf502adb4c nodeName:}" failed. No retries permitted until 2026-04-16 13:12:09.763539177 +0000 UTC m=+34.645895246 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-cert") pod "ingress-canary-rxwhc" (UID: "e3e13a4d-74db-43e1-b7d8-cddf502adb4c") : secret "canary-serving-cert" not found
Apr 16 13:12:09.368144 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:09.368107 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs\") pod \"network-metrics-daemon-h8fnx\" (UID: \"f57b15af-9441-4822-9c41-048d94ab4c1a\") " pod="openshift-multus/network-metrics-daemon-h8fnx"
Apr 16 13:12:09.368327 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:09.368238 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:12:09.368327 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:09.368300 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs podName:f57b15af-9441-4822-9c41-048d94ab4c1a nodeName:}" failed. No retries permitted until 2026-04-16 13:12:41.368281372 +0000 UTC m=+66.250637462 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs") pod "network-metrics-daemon-h8fnx" (UID: "f57b15af-9441-4822-9c41-048d94ab4c1a") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 13:12:09.469092 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:09.469036 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2888\" (UniqueName: \"kubernetes.io/projected/2db539fc-020c-4e5c-8585-e7acc930a358-kube-api-access-x2888\") pod \"network-check-target-wm9fg\" (UID: \"2db539fc-020c-4e5c-8585-e7acc930a358\") " pod="openshift-network-diagnostics/network-check-target-wm9fg"
Apr 16 13:12:09.469293 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:09.469208 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 13:12:09.469293 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:09.469235 2571 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 13:12:09.469293 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:09.469248 2571 projected.go:194] Error preparing data for projected volume kube-api-access-x2888 for pod openshift-network-diagnostics/network-check-target-wm9fg: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 13:12:09.469444 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:09.469312 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2db539fc-020c-4e5c-8585-e7acc930a358-kube-api-access-x2888 podName:2db539fc-020c-4e5c-8585-e7acc930a358 nodeName:}" failed. No retries permitted until 2026-04-16 13:12:41.469292508 +0000 UTC m=+66.351648579 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "kube-api-access-x2888" (UniqueName: "kubernetes.io/projected/2db539fc-020c-4e5c-8585-e7acc930a358-kube-api-access-x2888") pod "network-check-target-wm9fg" (UID: "2db539fc-020c-4e5c-8585-e7acc930a358") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:12:09.671157 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:09.671075 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-metrics-tls\") pod \"dns-default-pcp5l\" (UID: \"ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc\") " pod="openshift-dns/dns-default-pcp5l" Apr 16 13:12:09.671329 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:09.671217 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:12:09.671329 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:09.671279 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-metrics-tls podName:ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc nodeName:}" failed. No retries permitted until 2026-04-16 13:12:11.67126043 +0000 UTC m=+36.553616498 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-metrics-tls") pod "dns-default-pcp5l" (UID: "ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc") : secret "dns-default-metrics-tls" not found Apr 16 13:12:09.719915 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:09.719880 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wm9fg" Apr 16 13:12:09.720086 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:09.719883 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-md8rb" Apr 16 13:12:09.720086 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:09.719883 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8fnx" Apr 16 13:12:09.723054 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:09.723027 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 13:12:09.724389 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:09.724365 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 13:12:09.725271 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:09.725249 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 13:12:09.725271 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:09.725262 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-vhs75\"" Apr 16 13:12:09.725426 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:09.725410 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 13:12:09.725555 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:09.725541 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hkf74\"" Apr 16 13:12:09.771556 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:09.771529 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-cert\") pod \"ingress-canary-rxwhc\" (UID: \"e3e13a4d-74db-43e1-b7d8-cddf502adb4c\") " pod="openshift-ingress-canary/ingress-canary-rxwhc" Apr 16 13:12:09.771894 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:09.771666 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:12:09.771894 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:09.771739 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-cert podName:e3e13a4d-74db-43e1-b7d8-cddf502adb4c nodeName:}" failed. No retries permitted until 2026-04-16 13:12:11.771719975 +0000 UTC m=+36.654076059 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-cert") pod "ingress-canary-rxwhc" (UID: "e3e13a4d-74db-43e1-b7d8-cddf502adb4c") : secret "canary-serving-cert" not found Apr 16 13:12:09.872730 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:09.872696 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a9da276b-7d39-4d22-8b85-9d91a9a39f32-original-pull-secret\") pod \"global-pull-secret-syncer-md8rb\" (UID: \"a9da276b-7d39-4d22-8b85-9d91a9a39f32\") " pod="kube-system/global-pull-secret-syncer-md8rb" Apr 16 13:12:09.875139 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:09.875118 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/a9da276b-7d39-4d22-8b85-9d91a9a39f32-original-pull-secret\") pod \"global-pull-secret-syncer-md8rb\" (UID: \"a9da276b-7d39-4d22-8b85-9d91a9a39f32\") " pod="kube-system/global-pull-secret-syncer-md8rb" Apr 16 13:12:10.043117 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:10.043087 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-md8rb" Apr 16 13:12:10.633293 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:10.633241 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-md8rb"] Apr 16 13:12:10.799304 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:12:10.799094 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9da276b_7d39_4d22_8b85_9d91a9a39f32.slice/crio-f131047d0abdc3c2a2a943879a2f255fc9ec836348083ac3f0f8d2cf4ef4c64c WatchSource:0}: Error finding container f131047d0abdc3c2a2a943879a2f255fc9ec836348083ac3f0f8d2cf4ef4c64c: Status 404 returned error can't find the container with id f131047d0abdc3c2a2a943879a2f255fc9ec836348083ac3f0f8d2cf4ef4c64c Apr 16 13:12:10.959202 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:10.959160 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-md8rb" event={"ID":"a9da276b-7d39-4d22-8b85-9d91a9a39f32","Type":"ContainerStarted","Data":"f131047d0abdc3c2a2a943879a2f255fc9ec836348083ac3f0f8d2cf4ef4c64c"} Apr 16 13:12:11.685909 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:11.685844 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-metrics-tls\") pod \"dns-default-pcp5l\" (UID: \"ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc\") " pod="openshift-dns/dns-default-pcp5l" Apr 16 13:12:11.686087 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:11.686004 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:12:11.686145 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:11.686094 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-metrics-tls podName:ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc nodeName:}" failed. No retries permitted until 2026-04-16 13:12:15.686072717 +0000 UTC m=+40.568428786 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-metrics-tls") pod "dns-default-pcp5l" (UID: "ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc") : secret "dns-default-metrics-tls" not found Apr 16 13:12:11.786608 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:11.786575 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-cert\") pod \"ingress-canary-rxwhc\" (UID: \"e3e13a4d-74db-43e1-b7d8-cddf502adb4c\") " pod="openshift-ingress-canary/ingress-canary-rxwhc" Apr 16 13:12:11.786762 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:11.786747 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:12:11.786823 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:11.786813 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-cert podName:e3e13a4d-74db-43e1-b7d8-cddf502adb4c nodeName:}" failed. No retries permitted until 2026-04-16 13:12:15.786795538 +0000 UTC m=+40.669151606 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-cert") pod "ingress-canary-rxwhc" (UID: "e3e13a4d-74db-43e1-b7d8-cddf502adb4c") : secret "canary-serving-cert" not found Apr 16 13:12:11.964675 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:11.964592 2571 generic.go:358] "Generic (PLEG): container finished" podID="8921cd74-6824-4c24-896a-5c649cefc5da" containerID="9d65b571668828777c17b87215ac00fb76ec9cc7a5a8f61514cca407a8c812e6" exitCode=0 Apr 16 13:12:11.964675 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:11.964645 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrklc" event={"ID":"8921cd74-6824-4c24-896a-5c649cefc5da","Type":"ContainerDied","Data":"9d65b571668828777c17b87215ac00fb76ec9cc7a5a8f61514cca407a8c812e6"} Apr 16 13:12:12.969611 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:12.969573 2571 generic.go:358] "Generic (PLEG): container finished" podID="8921cd74-6824-4c24-896a-5c649cefc5da" containerID="92cfb3343e624ad05b3bbc7de56d75496d24cbd7e3ed729c2d7b68d1922d9023" exitCode=0 Apr 16 13:12:12.970061 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:12.969623 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrklc" event={"ID":"8921cd74-6824-4c24-896a-5c649cefc5da","Type":"ContainerDied","Data":"92cfb3343e624ad05b3bbc7de56d75496d24cbd7e3ed729c2d7b68d1922d9023"} Apr 16 13:12:14.976285 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:14.976252 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nrklc" event={"ID":"8921cd74-6824-4c24-896a-5c649cefc5da","Type":"ContainerStarted","Data":"85278252a6bafcc45d585712af3e4703914d2a053d6326f6d5e775fcaed6ce7a"} Apr 16 13:12:14.977475 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:14.977449 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-md8rb" event={"ID":"a9da276b-7d39-4d22-8b85-9d91a9a39f32","Type":"ContainerStarted","Data":"0b095b941c1bece7d83ab0947c8725326fc5b239d503c136514eef66c55093a3"} Apr 16 13:12:14.997697 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:14.997660 2571 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/multus-additional-cni-plugins-nrklc" podStartSLOduration=7.39032014 podStartE2EDuration="39.997647387s" podCreationTimestamp="2026-04-16 13:11:35 +0000 UTC" firstStartedPulling="2026-04-16 13:11:38.23675212 +0000 UTC m=+3.119108201" lastFinishedPulling="2026-04-16 13:12:10.84407938 +0000 UTC m=+35.726435448" observedRunningTime="2026-04-16 13:12:14.996390157 +0000 UTC m=+39.878746247" watchObservedRunningTime="2026-04-16 13:12:14.997647387 +0000 UTC m=+39.880003476" Apr 16 13:12:15.717813 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:15.717784 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-metrics-tls\") pod \"dns-default-pcp5l\" (UID: \"ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc\") " pod="openshift-dns/dns-default-pcp5l" Apr 16 13:12:15.717977 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:15.717898 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:12:15.717977 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:15.717947 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-metrics-tls podName:ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc nodeName:}" failed. No retries permitted until 2026-04-16 13:12:23.717931762 +0000 UTC m=+48.600287830 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-metrics-tls") pod "dns-default-pcp5l" (UID: "ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc") : secret "dns-default-metrics-tls" not found Apr 16 13:12:15.818445 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:15.818412 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-cert\") pod \"ingress-canary-rxwhc\" (UID: \"e3e13a4d-74db-43e1-b7d8-cddf502adb4c\") " pod="openshift-ingress-canary/ingress-canary-rxwhc" Apr 16 13:12:15.818566 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:15.818541 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:12:15.818604 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:15.818592 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-cert podName:e3e13a4d-74db-43e1-b7d8-cddf502adb4c nodeName:}" failed. No retries permitted until 2026-04-16 13:12:23.818576213 +0000 UTC m=+48.700932283 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-cert") pod "ingress-canary-rxwhc" (UID: "e3e13a4d-74db-43e1-b7d8-cddf502adb4c") : secret "canary-serving-cert" not found Apr 16 13:12:23.765017 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:23.764983 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-metrics-tls\") pod \"dns-default-pcp5l\" (UID: \"ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc\") " pod="openshift-dns/dns-default-pcp5l" Apr 16 13:12:23.765398 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:23.765077 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:12:23.765398 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:23.765138 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-metrics-tls podName:ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc nodeName:}" failed. No retries permitted until 2026-04-16 13:12:39.765123254 +0000 UTC m=+64.647479322 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-metrics-tls") pod "dns-default-pcp5l" (UID: "ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc") : secret "dns-default-metrics-tls" not found Apr 16 13:12:23.865859 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:23.865829 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-cert\") pod \"ingress-canary-rxwhc\" (UID: \"e3e13a4d-74db-43e1-b7d8-cddf502adb4c\") " pod="openshift-ingress-canary/ingress-canary-rxwhc" Apr 16 13:12:23.865992 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:23.865973 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:12:23.866051 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:23.866040 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-cert podName:e3e13a4d-74db-43e1-b7d8-cddf502adb4c nodeName:}" failed. No retries permitted until 2026-04-16 13:12:39.866025325 +0000 UTC m=+64.748381397 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-cert") pod "ingress-canary-rxwhc" (UID: "e3e13a4d-74db-43e1-b7d8-cddf502adb4c") : secret "canary-serving-cert" not found Apr 16 13:12:33.959214 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:33.959184 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-hdhfq" Apr 16 13:12:33.984534 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:33.984485 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-md8rb" podStartSLOduration=52.511756075 podStartE2EDuration="55.984472712s" podCreationTimestamp="2026-04-16 13:11:38 +0000 UTC" firstStartedPulling="2026-04-16 13:12:10.821701232 +0000 UTC m=+35.704057299" lastFinishedPulling="2026-04-16 13:12:14.294417854 +0000 UTC m=+39.176773936" observedRunningTime="2026-04-16 13:12:15.009123937 +0000 UTC m=+39.891480027" watchObservedRunningTime="2026-04-16 13:12:33.984472712 +0000 UTC m=+58.866828793" Apr 16 13:12:39.862061 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:39.862023 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-metrics-tls\") pod \"dns-default-pcp5l\" (UID: \"ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc\") " pod="openshift-dns/dns-default-pcp5l" Apr 16 13:12:39.862458 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:39.862140 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:12:39.862458 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:39.862213 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-metrics-tls podName:ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc nodeName:}" failed. No retries permitted until 2026-04-16 13:13:11.862198324 +0000 UTC m=+96.744554393 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-metrics-tls") pod "dns-default-pcp5l" (UID: "ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc") : secret "dns-default-metrics-tls" not found Apr 16 13:12:39.962520 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:39.962494 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-cert\") pod \"ingress-canary-rxwhc\" (UID: \"e3e13a4d-74db-43e1-b7d8-cddf502adb4c\") " pod="openshift-ingress-canary/ingress-canary-rxwhc" Apr 16 13:12:39.962601 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:39.962588 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:12:39.962660 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:39.962651 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-cert podName:e3e13a4d-74db-43e1-b7d8-cddf502adb4c nodeName:}" failed. No retries permitted until 2026-04-16 13:13:11.962638407 +0000 UTC m=+96.844994475 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-cert") pod "ingress-canary-rxwhc" (UID: "e3e13a4d-74db-43e1-b7d8-cddf502adb4c") : secret "canary-serving-cert" not found Apr 16 13:12:41.372399 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:41.372357 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs\") pod \"network-metrics-daemon-h8fnx\" (UID: \"f57b15af-9441-4822-9c41-048d94ab4c1a\") " pod="openshift-multus/network-metrics-daemon-h8fnx" Apr 16 13:12:41.375368 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:41.375348 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 13:12:41.382443 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:41.382425 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 13:12:41.382504 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:12:41.382492 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs podName:f57b15af-9441-4822-9c41-048d94ab4c1a nodeName:}" failed. No retries permitted until 2026-04-16 13:13:45.382473433 +0000 UTC m=+130.264829500 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs") pod "network-metrics-daemon-h8fnx" (UID: "f57b15af-9441-4822-9c41-048d94ab4c1a") : secret "metrics-daemon-secret" not found Apr 16 13:12:41.473395 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:41.473371 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2888\" (UniqueName: \"kubernetes.io/projected/2db539fc-020c-4e5c-8585-e7acc930a358-kube-api-access-x2888\") pod \"network-check-target-wm9fg\" (UID: \"2db539fc-020c-4e5c-8585-e7acc930a358\") " pod="openshift-network-diagnostics/network-check-target-wm9fg" Apr 16 13:12:41.476188 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:41.476172 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 13:12:41.486245 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:41.486227 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 13:12:41.497580 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:41.497555 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2888\" (UniqueName: \"kubernetes.io/projected/2db539fc-020c-4e5c-8585-e7acc930a358-kube-api-access-x2888\") pod \"network-check-target-wm9fg\" (UID: \"2db539fc-020c-4e5c-8585-e7acc930a358\") " pod="openshift-network-diagnostics/network-check-target-wm9fg" Apr 16 13:12:41.536932 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:41.536905 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-vhs75\"" Apr 16 13:12:41.544830 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:41.544812 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-wm9fg" Apr 16 13:12:41.666453 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:41.666412 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-wm9fg"] Apr 16 13:12:41.670252 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:12:41.670226 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2db539fc_020c_4e5c_8585_e7acc930a358.slice/crio-49c5165c53ef2bc39fdce29a5a47bad5ae304ba71663818a383abc52a4445517 WatchSource:0}: Error finding container 49c5165c53ef2bc39fdce29a5a47bad5ae304ba71663818a383abc52a4445517: Status 404 returned error can't find the container with id 49c5165c53ef2bc39fdce29a5a47bad5ae304ba71663818a383abc52a4445517 Apr 16 13:12:42.026239 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:42.026211 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-wm9fg" event={"ID":"2db539fc-020c-4e5c-8585-e7acc930a358","Type":"ContainerStarted","Data":"49c5165c53ef2bc39fdce29a5a47bad5ae304ba71663818a383abc52a4445517"} Apr 16 13:12:45.034025 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:45.033913 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-wm9fg" event={"ID":"2db539fc-020c-4e5c-8585-e7acc930a358","Type":"ContainerStarted","Data":"95a5dab27593667b771369ce1433db8e2c320f6260c338a549fc5d36dea4810b"} Apr 16 13:12:45.034401 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:45.034068 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-wm9fg" Apr 16 13:12:45.048361 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:12:45.048258 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-wm9fg" podStartSLOduration=67.416472093 podStartE2EDuration="1m10.04824264s" podCreationTimestamp="2026-04-16 13:11:35 +0000 UTC" firstStartedPulling="2026-04-16 13:12:41.672024858 +0000 UTC m=+66.554380927" lastFinishedPulling="2026-04-16 13:12:44.303795407 +0000 UTC m=+69.186151474" observedRunningTime="2026-04-16 13:12:45.047885552 +0000 UTC m=+69.930241646" watchObservedRunningTime="2026-04-16 13:12:45.04824264 +0000 UTC m=+69.930598731" Apr 16 13:13:11.872822 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:11.872790 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-metrics-tls\") pod \"dns-default-pcp5l\" (UID: \"ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc\") " pod="openshift-dns/dns-default-pcp5l" Apr 16 13:13:11.873193 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:11.872926 2571 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:13:11.873193 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:11.872988 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-metrics-tls podName:ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc nodeName:}" failed. No retries permitted until 2026-04-16 13:14:15.872974101 +0000 UTC m=+160.755330169 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-metrics-tls") pod "dns-default-pcp5l" (UID: "ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc") : secret "dns-default-metrics-tls" not found Apr 16 13:13:11.973396 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:11.973368 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-cert\") pod \"ingress-canary-rxwhc\" (UID: \"e3e13a4d-74db-43e1-b7d8-cddf502adb4c\") " pod="openshift-ingress-canary/ingress-canary-rxwhc" Apr 16 13:13:11.973529 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:11.973485 2571 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:13:11.973572 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:11.973539 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-cert podName:e3e13a4d-74db-43e1-b7d8-cddf502adb4c nodeName:}" failed. No retries permitted until 2026-04-16 13:14:15.973525933 +0000 UTC m=+160.855882001 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-cert") pod "ingress-canary-rxwhc" (UID: "e3e13a4d-74db-43e1-b7d8-cddf502adb4c") : secret "canary-serving-cert" not found Apr 16 13:13:16.038378 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:16.038343 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-wm9fg" Apr 16 13:13:45.386547 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:45.386497 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs\") pod \"network-metrics-daemon-h8fnx\" (UID: \"f57b15af-9441-4822-9c41-048d94ab4c1a\") " pod="openshift-multus/network-metrics-daemon-h8fnx" Apr 16 13:13:45.387070 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:45.386651 2571 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 13:13:45.387070 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:45.386767 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs podName:f57b15af-9441-4822-9c41-048d94ab4c1a nodeName:}" failed. No retries permitted until 2026-04-16 13:15:47.386748694 +0000 UTC m=+252.269104766 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs") pod "network-metrics-daemon-h8fnx" (UID: "f57b15af-9441-4822-9c41-048d94ab4c1a") : secret "metrics-daemon-secret" not found Apr 16 13:13:46.617802 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.617771 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-vjtz5"] Apr 16 13:13:46.619603 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.619588 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-vjtz5" Apr 16 13:13:46.621443 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.621422 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-68f85f86cb-pvhn6"] Apr 16 13:13:46.623145 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.623131 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:13:46.623347 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.623327 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-krrf8\"" Apr 16 13:13:46.626157 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.626136 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 16 13:13:46.626269 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.626138 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-xtc7p\"" Apr 16 13:13:46.626269 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.626163 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 16 13:13:46.626269 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.626171 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 13:13:46.626776 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.626573 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 16 13:13:46.626776 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.626657 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 13:13:46.626978 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.626962 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 16 13:13:46.632855 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.632836 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-vjtz5"] Apr 16 13:13:46.645905 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.645887 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-68f85f86cb-pvhn6"] Apr 16 13:13:46.696078 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.696050 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e8211e71-ff44-4db8-b401-fc031e0edd43-stats-auth\") pod \"router-default-68f85f86cb-pvhn6\" (UID: \"e8211e71-ff44-4db8-b401-fc031e0edd43\") " pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:13:46.696078 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.696077 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8211e71-ff44-4db8-b401-fc031e0edd43-metrics-certs\") pod \"router-default-68f85f86cb-pvhn6\" (UID: \"e8211e71-ff44-4db8-b401-fc031e0edd43\") " pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:13:46.696213 ip-10-0-141-234 kubenswrapper[2571]: I0416 
13:13:46.696116 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e8211e71-ff44-4db8-b401-fc031e0edd43-default-certificate\") pod \"router-default-68f85f86cb-pvhn6\" (UID: \"e8211e71-ff44-4db8-b401-fc031e0edd43\") " pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:13:46.696213 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.696169 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8211e71-ff44-4db8-b401-fc031e0edd43-service-ca-bundle\") pod \"router-default-68f85f86cb-pvhn6\" (UID: \"e8211e71-ff44-4db8-b401-fc031e0edd43\") " pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:13:46.696213 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.696185 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7js7r\" (UniqueName: \"kubernetes.io/projected/e8211e71-ff44-4db8-b401-fc031e0edd43-kube-api-access-7js7r\") pod \"router-default-68f85f86cb-pvhn6\" (UID: \"e8211e71-ff44-4db8-b401-fc031e0edd43\") " pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:13:46.696213 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.696209 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw7tg\" (UniqueName: \"kubernetes.io/projected/ecd3ccde-366f-4e98-b3d9-51342f39e2ba-kube-api-access-zw7tg\") pod \"network-check-source-7b678d77c7-vjtz5\" (UID: \"ecd3ccde-366f-4e98-b3d9-51342f39e2ba\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-vjtz5" Apr 16 13:13:46.722236 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.722210 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-7qknj"] Apr 16 13:13:46.724150 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.724133 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-dqx46"] Apr 16 13:13:46.724280 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.724264 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7qknj" Apr 16 13:13:46.725769 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.725752 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-xdltb"] Apr 16 13:13:46.725904 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.725887 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-dqx46" Apr 16 13:13:46.727014 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.726996 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 13:13:46.728029 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.728013 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-xdltb" Apr 16 13:13:46.728898 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.728877 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:13:46.728992 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.728898 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 16 13:13:46.729050 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.728998 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 16 13:13:46.729158 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.729143 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 13:13:46.730737 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.730321 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-7j27v\"" Apr 16 13:13:46.730737 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.730328 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-chjqn\"" Apr 16 13:13:46.730737 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.730425 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 13:13:46.730737 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.730551 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 13:13:46.731461 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.731442 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 13:13:46.731723 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.731707 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 13:13:46.731794 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.731707 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-2sxxw\"" Apr 16 13:13:46.731794 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.731711 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 13:13:46.732155 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.732141 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 13:13:46.737895 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.737844 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-dqx46"] Apr 16 13:13:46.738100 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.738084 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 13:13:46.738900 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.738880 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-insights/insights-operator-5785d4fcdd-xdltb"] Apr 16 13:13:46.740986 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.740967 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-7qknj"] Apr 16 13:13:46.796516 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.796494 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/55e8eb8e-280c-46c9-bcfc-5d796a915163-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-7qknj\" (UID: \"55e8eb8e-280c-46c9-bcfc-5d796a915163\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7qknj" Apr 16 13:13:46.796639 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.796522 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4634cc1d-d3ff-44bd-9edd-28902d1bbd65-serving-cert\") pod \"insights-operator-5785d4fcdd-xdltb\" (UID: \"4634cc1d-d3ff-44bd-9edd-28902d1bbd65\") " pod="openshift-insights/insights-operator-5785d4fcdd-xdltb" Apr 16 13:13:46.796639 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.796550 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e8211e71-ff44-4db8-b401-fc031e0edd43-stats-auth\") pod \"router-default-68f85f86cb-pvhn6\" (UID: \"e8211e71-ff44-4db8-b401-fc031e0edd43\") " pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:13:46.796639 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.796592 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8211e71-ff44-4db8-b401-fc031e0edd43-metrics-certs\") pod \"router-default-68f85f86cb-pvhn6\" (UID: \"e8211e71-ff44-4db8-b401-fc031e0edd43\") " pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:13:46.796639 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.796620 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/21154974-4853-4774-bc79-f3089c1c3161-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-dqx46\" (UID: \"21154974-4853-4774-bc79-f3089c1c3161\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-dqx46" Apr 16 13:13:46.796820 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.796646 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4634cc1d-d3ff-44bd-9edd-28902d1bbd65-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-xdltb\" (UID: \"4634cc1d-d3ff-44bd-9edd-28902d1bbd65\") " pod="openshift-insights/insights-operator-5785d4fcdd-xdltb" Apr 16 13:13:46.796820 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.796693 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz9mp\" (UniqueName: \"kubernetes.io/projected/55e8eb8e-280c-46c9-bcfc-5d796a915163-kube-api-access-wz9mp\") pod \"cluster-monitoring-operator-6667474d89-7qknj\" (UID: \"55e8eb8e-280c-46c9-bcfc-5d796a915163\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7qknj" Apr 16 13:13:46.796820 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.796737 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/55e8eb8e-280c-46c9-bcfc-5d796a915163-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-7qknj\" (UID: \"55e8eb8e-280c-46c9-bcfc-5d796a915163\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7qknj" Apr 16 13:13:46.796820 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.796762 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4634cc1d-d3ff-44bd-9edd-28902d1bbd65-snapshots\") pod \"insights-operator-5785d4fcdd-xdltb\" (UID: \"4634cc1d-d3ff-44bd-9edd-28902d1bbd65\") " pod="openshift-insights/insights-operator-5785d4fcdd-xdltb" Apr 16 13:13:46.796820 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.796791 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e8211e71-ff44-4db8-b401-fc031e0edd43-default-certificate\") pod \"router-default-68f85f86cb-pvhn6\" (UID: \"e8211e71-ff44-4db8-b401-fc031e0edd43\") " pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:13:46.797068 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.796819 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4634cc1d-d3ff-44bd-9edd-28902d1bbd65-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-xdltb\" (UID: \"4634cc1d-d3ff-44bd-9edd-28902d1bbd65\") " pod="openshift-insights/insights-operator-5785d4fcdd-xdltb" Apr 16 13:13:46.797068 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:46.796943 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 13:13:46.797068 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:46.797006 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8211e71-ff44-4db8-b401-fc031e0edd43-metrics-certs podName:e8211e71-ff44-4db8-b401-fc031e0edd43 nodeName:}" failed. No retries permitted until 2026-04-16 13:13:47.296989249 +0000 UTC m=+132.179345325 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8211e71-ff44-4db8-b401-fc031e0edd43-metrics-certs") pod "router-default-68f85f86cb-pvhn6" (UID: "e8211e71-ff44-4db8-b401-fc031e0edd43") : secret "router-metrics-certs-default" not found Apr 16 13:13:46.797068 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.797003 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8211e71-ff44-4db8-b401-fc031e0edd43-service-ca-bundle\") pod \"router-default-68f85f86cb-pvhn6\" (UID: \"e8211e71-ff44-4db8-b401-fc031e0edd43\") " pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:13:46.797068 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.797030 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7js7r\" (UniqueName: \"kubernetes.io/projected/e8211e71-ff44-4db8-b401-fc031e0edd43-kube-api-access-7js7r\") pod \"router-default-68f85f86cb-pvhn6\" (UID: \"e8211e71-ff44-4db8-b401-fc031e0edd43\") " pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:13:46.797068 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.797049 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdj59\" (UniqueName: \"kubernetes.io/projected/21154974-4853-4774-bc79-f3089c1c3161-kube-api-access-mdj59\") pod \"cluster-samples-operator-667775844f-dqx46\" (UID: \"21154974-4853-4774-bc79-f3089c1c3161\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-dqx46" Apr 16 13:13:46.797352 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.797072 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4634cc1d-d3ff-44bd-9edd-28902d1bbd65-tmp\") pod \"insights-operator-5785d4fcdd-xdltb\" (UID: \"4634cc1d-d3ff-44bd-9edd-28902d1bbd65\") " pod="openshift-insights/insights-operator-5785d4fcdd-xdltb" Apr 16 13:13:46.797352 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:46.797113 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e8211e71-ff44-4db8-b401-fc031e0edd43-service-ca-bundle podName:e8211e71-ff44-4db8-b401-fc031e0edd43 nodeName:}" failed. No retries permitted until 2026-04-16 13:13:47.297096123 +0000 UTC m=+132.179452229 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e8211e71-ff44-4db8-b401-fc031e0edd43-service-ca-bundle") pod "router-default-68f85f86cb-pvhn6" (UID: "e8211e71-ff44-4db8-b401-fc031e0edd43") : configmap references non-existent config key: service-ca.crt Apr 16 13:13:46.797352 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.797141 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zw7tg\" (UniqueName: \"kubernetes.io/projected/ecd3ccde-366f-4e98-b3d9-51342f39e2ba-kube-api-access-zw7tg\") pod \"network-check-source-7b678d77c7-vjtz5\" (UID: \"ecd3ccde-366f-4e98-b3d9-51342f39e2ba\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-vjtz5" Apr 16 13:13:46.797352 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.797177 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5c9s\" (UniqueName: \"kubernetes.io/projected/4634cc1d-d3ff-44bd-9edd-28902d1bbd65-kube-api-access-m5c9s\") pod \"insights-operator-5785d4fcdd-xdltb\" (UID: \"4634cc1d-d3ff-44bd-9edd-28902d1bbd65\") " pod="openshift-insights/insights-operator-5785d4fcdd-xdltb" Apr 16 13:13:46.799117 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.799097 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/e8211e71-ff44-4db8-b401-fc031e0edd43-stats-auth\") pod \"router-default-68f85f86cb-pvhn6\" (UID: \"e8211e71-ff44-4db8-b401-fc031e0edd43\") " pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:13:46.799200 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.799119 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/e8211e71-ff44-4db8-b401-fc031e0edd43-default-certificate\") pod \"router-default-68f85f86cb-pvhn6\" (UID: \"e8211e71-ff44-4db8-b401-fc031e0edd43\") " pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:13:46.807119 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.807099 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7js7r\" (UniqueName: \"kubernetes.io/projected/e8211e71-ff44-4db8-b401-fc031e0edd43-kube-api-access-7js7r\") pod \"router-default-68f85f86cb-pvhn6\" (UID: \"e8211e71-ff44-4db8-b401-fc031e0edd43\") " pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:13:46.807519 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.807500 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw7tg\" (UniqueName: \"kubernetes.io/projected/ecd3ccde-366f-4e98-b3d9-51342f39e2ba-kube-api-access-zw7tg\") pod \"network-check-source-7b678d77c7-vjtz5\" (UID: \"ecd3ccde-366f-4e98-b3d9-51342f39e2ba\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-vjtz5" Apr 16 13:13:46.897937 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.897887 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5c9s\" (UniqueName: \"kubernetes.io/projected/4634cc1d-d3ff-44bd-9edd-28902d1bbd65-kube-api-access-m5c9s\") pod \"insights-operator-5785d4fcdd-xdltb\" (UID: \"4634cc1d-d3ff-44bd-9edd-28902d1bbd65\") " pod="openshift-insights/insights-operator-5785d4fcdd-xdltb" Apr 16 13:13:46.897937 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.897916 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/55e8eb8e-280c-46c9-bcfc-5d796a915163-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-7qknj\" (UID: \"55e8eb8e-280c-46c9-bcfc-5d796a915163\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7qknj" Apr 16 13:13:46.897937 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.897933 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4634cc1d-d3ff-44bd-9edd-28902d1bbd65-serving-cert\") pod \"insights-operator-5785d4fcdd-xdltb\" (UID: \"4634cc1d-d3ff-44bd-9edd-28902d1bbd65\") " pod="openshift-insights/insights-operator-5785d4fcdd-xdltb" Apr 16 13:13:46.898084 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.897969 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/21154974-4853-4774-bc79-f3089c1c3161-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-dqx46\" (UID: \"21154974-4853-4774-bc79-f3089c1c3161\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-dqx46" Apr 16 13:13:46.898084 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.897986 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4634cc1d-d3ff-44bd-9edd-28902d1bbd65-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-xdltb\" (UID: \"4634cc1d-d3ff-44bd-9edd-28902d1bbd65\") " pod="openshift-insights/insights-operator-5785d4fcdd-xdltb" Apr 16 13:13:46.898084 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.898014 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wz9mp\" (UniqueName: \"kubernetes.io/projected/55e8eb8e-280c-46c9-bcfc-5d796a915163-kube-api-access-wz9mp\") pod \"cluster-monitoring-operator-6667474d89-7qknj\" (UID: \"55e8eb8e-280c-46c9-bcfc-5d796a915163\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7qknj" Apr 16 13:13:46.898084 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.898056 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/55e8eb8e-280c-46c9-bcfc-5d796a915163-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-7qknj\" (UID: \"55e8eb8e-280c-46c9-bcfc-5d796a915163\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7qknj" Apr 16 13:13:46.898084 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.898082 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4634cc1d-d3ff-44bd-9edd-28902d1bbd65-snapshots\") pod \"insights-operator-5785d4fcdd-xdltb\" (UID: \"4634cc1d-d3ff-44bd-9edd-28902d1bbd65\") " pod="openshift-insights/insights-operator-5785d4fcdd-xdltb" Apr 16 13:13:46.898298 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:46.898102 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 13:13:46.898298 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:46.898163 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21154974-4853-4774-bc79-f3089c1c3161-samples-operator-tls podName:21154974-4853-4774-bc79-f3089c1c3161 nodeName:}" failed. 
No retries permitted until 2026-04-16 13:13:47.398145843 +0000 UTC m=+132.280501930 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/21154974-4853-4774-bc79-f3089c1c3161-samples-operator-tls") pod "cluster-samples-operator-667775844f-dqx46" (UID: "21154974-4853-4774-bc79-f3089c1c3161") : secret "samples-operator-tls" not found Apr 16 13:13:46.898441 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:46.898417 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 13:13:46.898556 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.898459 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4634cc1d-d3ff-44bd-9edd-28902d1bbd65-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-xdltb\" (UID: \"4634cc1d-d3ff-44bd-9edd-28902d1bbd65\") " pod="openshift-insights/insights-operator-5785d4fcdd-xdltb" Apr 16 13:13:46.898556 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:46.898473 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55e8eb8e-280c-46c9-bcfc-5d796a915163-cluster-monitoring-operator-tls podName:55e8eb8e-280c-46c9-bcfc-5d796a915163 nodeName:}" failed. No retries permitted until 2026-04-16 13:13:47.398456296 +0000 UTC m=+132.280812365 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/55e8eb8e-280c-46c9-bcfc-5d796a915163-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-7qknj" (UID: "55e8eb8e-280c-46c9-bcfc-5d796a915163") : secret "cluster-monitoring-operator-tls" not found Apr 16 13:13:46.898670 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.898602 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mdj59\" (UniqueName: \"kubernetes.io/projected/21154974-4853-4774-bc79-f3089c1c3161-kube-api-access-mdj59\") pod \"cluster-samples-operator-667775844f-dqx46\" (UID: \"21154974-4853-4774-bc79-f3089c1c3161\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-dqx46" Apr 16 13:13:46.898670 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.898632 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4634cc1d-d3ff-44bd-9edd-28902d1bbd65-tmp\") pod \"insights-operator-5785d4fcdd-xdltb\" (UID: \"4634cc1d-d3ff-44bd-9edd-28902d1bbd65\") " pod="openshift-insights/insights-operator-5785d4fcdd-xdltb" Apr 16 13:13:46.898776 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.898679 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/55e8eb8e-280c-46c9-bcfc-5d796a915163-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-7qknj\" (UID: \"55e8eb8e-280c-46c9-bcfc-5d796a915163\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7qknj" Apr 16 13:13:46.898776 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.898700 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/4634cc1d-d3ff-44bd-9edd-28902d1bbd65-snapshots\") pod \"insights-operator-5785d4fcdd-xdltb\" (UID: \"4634cc1d-d3ff-44bd-9edd-28902d1bbd65\") " 
pod="openshift-insights/insights-operator-5785d4fcdd-xdltb" Apr 16 13:13:46.898977 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.898957 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4634cc1d-d3ff-44bd-9edd-28902d1bbd65-tmp\") pod \"insights-operator-5785d4fcdd-xdltb\" (UID: \"4634cc1d-d3ff-44bd-9edd-28902d1bbd65\") " pod="openshift-insights/insights-operator-5785d4fcdd-xdltb" Apr 16 13:13:46.899033 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.899010 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4634cc1d-d3ff-44bd-9edd-28902d1bbd65-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-xdltb\" (UID: \"4634cc1d-d3ff-44bd-9edd-28902d1bbd65\") " pod="openshift-insights/insights-operator-5785d4fcdd-xdltb" Apr 16 13:13:46.899033 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.899018 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4634cc1d-d3ff-44bd-9edd-28902d1bbd65-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-xdltb\" (UID: \"4634cc1d-d3ff-44bd-9edd-28902d1bbd65\") " pod="openshift-insights/insights-operator-5785d4fcdd-xdltb" Apr 16 13:13:46.900461 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.900436 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4634cc1d-d3ff-44bd-9edd-28902d1bbd65-serving-cert\") pod \"insights-operator-5785d4fcdd-xdltb\" (UID: \"4634cc1d-d3ff-44bd-9edd-28902d1bbd65\") " pod="openshift-insights/insights-operator-5785d4fcdd-xdltb" Apr 16 13:13:46.906538 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.906520 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5c9s\" (UniqueName: \"kubernetes.io/projected/4634cc1d-d3ff-44bd-9edd-28902d1bbd65-kube-api-access-m5c9s\") pod \"insights-operator-5785d4fcdd-xdltb\" (UID: \"4634cc1d-d3ff-44bd-9edd-28902d1bbd65\") " pod="openshift-insights/insights-operator-5785d4fcdd-xdltb" Apr 16 13:13:46.906851 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.906830 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdj59\" (UniqueName: \"kubernetes.io/projected/21154974-4853-4774-bc79-f3089c1c3161-kube-api-access-mdj59\") pod \"cluster-samples-operator-667775844f-dqx46\" (UID: \"21154974-4853-4774-bc79-f3089c1c3161\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-dqx46" Apr 16 13:13:46.906915 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.906898 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz9mp\" (UniqueName: \"kubernetes.io/projected/55e8eb8e-280c-46c9-bcfc-5d796a915163-kube-api-access-wz9mp\") pod \"cluster-monitoring-operator-6667474d89-7qknj\" (UID: \"55e8eb8e-280c-46c9-bcfc-5d796a915163\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7qknj" Apr 16 13:13:46.929614 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:46.929597 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-vjtz5" Apr 16 13:13:47.039440 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:47.039410 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-vjtz5"] Apr 16 13:13:47.042458 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:13:47.042434 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podecd3ccde_366f_4e98_b3d9_51342f39e2ba.slice/crio-d2d15804d1b8efbb6160a3cc9c21fc18006cf34bffae30759d4cd669a5572ed8 WatchSource:0}: Error finding container d2d15804d1b8efbb6160a3cc9c21fc18006cf34bffae30759d4cd669a5572ed8: Status 404 returned error can't find the container with id d2d15804d1b8efbb6160a3cc9c21fc18006cf34bffae30759d4cd669a5572ed8 Apr 16 13:13:47.048211 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:47.048195 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-xdltb" Apr 16 13:13:47.151042 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:47.150963 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-vjtz5" event={"ID":"ecd3ccde-366f-4e98-b3d9-51342f39e2ba","Type":"ContainerStarted","Data":"b37eb0532896cfeb8444951cc9179abe7d1731902a3812f3f039a5a250e67c0c"} Apr 16 13:13:47.151042 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:47.151000 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-vjtz5" event={"ID":"ecd3ccde-366f-4e98-b3d9-51342f39e2ba","Type":"ContainerStarted","Data":"d2d15804d1b8efbb6160a3cc9c21fc18006cf34bffae30759d4cd669a5572ed8"} Apr 16 13:13:47.179963 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:47.179919 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-vjtz5" podStartSLOduration=1.17990435 podStartE2EDuration="1.17990435s" podCreationTimestamp="2026-04-16 13:13:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:13:47.179318037 +0000 UTC m=+132.061674139" watchObservedRunningTime="2026-04-16 13:13:47.17990435 +0000 UTC m=+132.062260439" Apr 16 13:13:47.181281 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:47.181185 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-xdltb"] Apr 16 13:13:47.183586 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:13:47.183560 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4634cc1d_d3ff_44bd_9edd_28902d1bbd65.slice/crio-dc4693f95c5aa88181da360fd1f2134ba84f111ee820e195393f62fd9568ca54 WatchSource:0}: Error finding container dc4693f95c5aa88181da360fd1f2134ba84f111ee820e195393f62fd9568ca54: Status 404 returned error can't find the container with id dc4693f95c5aa88181da360fd1f2134ba84f111ee820e195393f62fd9568ca54 Apr 16 13:13:47.301710 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:47.301689 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8211e71-ff44-4db8-b401-fc031e0edd43-service-ca-bundle\") pod \"router-default-68f85f86cb-pvhn6\" (UID: \"e8211e71-ff44-4db8-b401-fc031e0edd43\") " 
pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:13:47.301809 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:47.301738 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8211e71-ff44-4db8-b401-fc031e0edd43-metrics-certs\") pod \"router-default-68f85f86cb-pvhn6\" (UID: \"e8211e71-ff44-4db8-b401-fc031e0edd43\") " pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:13:47.301890 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:47.301819 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 13:13:47.301890 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:47.301856 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8211e71-ff44-4db8-b401-fc031e0edd43-metrics-certs podName:e8211e71-ff44-4db8-b401-fc031e0edd43 nodeName:}" failed. No retries permitted until 2026-04-16 13:13:48.301845135 +0000 UTC m=+133.184201204 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8211e71-ff44-4db8-b401-fc031e0edd43-metrics-certs") pod "router-default-68f85f86cb-pvhn6" (UID: "e8211e71-ff44-4db8-b401-fc031e0edd43") : secret "router-metrics-certs-default" not found Apr 16 13:13:47.302017 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:47.301936 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e8211e71-ff44-4db8-b401-fc031e0edd43-service-ca-bundle podName:e8211e71-ff44-4db8-b401-fc031e0edd43 nodeName:}" failed. No retries permitted until 2026-04-16 13:13:48.301920239 +0000 UTC m=+133.184276311 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e8211e71-ff44-4db8-b401-fc031e0edd43-service-ca-bundle") pod "router-default-68f85f86cb-pvhn6" (UID: "e8211e71-ff44-4db8-b401-fc031e0edd43") : configmap references non-existent config key: service-ca.crt Apr 16 13:13:47.403100 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:47.403055 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/55e8eb8e-280c-46c9-bcfc-5d796a915163-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-7qknj\" (UID: \"55e8eb8e-280c-46c9-bcfc-5d796a915163\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7qknj" Apr 16 13:13:47.403196 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:47.403121 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/21154974-4853-4774-bc79-f3089c1c3161-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-dqx46\" (UID: \"21154974-4853-4774-bc79-f3089c1c3161\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-dqx46" Apr 16 13:13:47.403250 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:47.403204 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 13:13:47.403250 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:47.403210 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 13:13:47.403250 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:47.403242 2571 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21154974-4853-4774-bc79-f3089c1c3161-samples-operator-tls podName:21154974-4853-4774-bc79-f3089c1c3161 nodeName:}" failed. No retries permitted until 2026-04-16 13:13:48.403231836 +0000 UTC m=+133.285587904 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/21154974-4853-4774-bc79-f3089c1c3161-samples-operator-tls") pod "cluster-samples-operator-667775844f-dqx46" (UID: "21154974-4853-4774-bc79-f3089c1c3161") : secret "samples-operator-tls" not found Apr 16 13:13:47.403366 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:47.403263 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55e8eb8e-280c-46c9-bcfc-5d796a915163-cluster-monitoring-operator-tls podName:55e8eb8e-280c-46c9-bcfc-5d796a915163 nodeName:}" failed. No retries permitted until 2026-04-16 13:13:48.403247405 +0000 UTC m=+133.285603474 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/55e8eb8e-280c-46c9-bcfc-5d796a915163-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-7qknj" (UID: "55e8eb8e-280c-46c9-bcfc-5d796a915163") : secret "cluster-monitoring-operator-tls" not found Apr 16 13:13:48.154556 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:48.154521 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-xdltb" event={"ID":"4634cc1d-d3ff-44bd-9edd-28902d1bbd65","Type":"ContainerStarted","Data":"dc4693f95c5aa88181da360fd1f2134ba84f111ee820e195393f62fd9568ca54"} Apr 16 13:13:48.310504 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:48.310467 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8211e71-ff44-4db8-b401-fc031e0edd43-service-ca-bundle\") pod \"router-default-68f85f86cb-pvhn6\" (UID: \"e8211e71-ff44-4db8-b401-fc031e0edd43\") " pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:13:48.310676 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:48.310520 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8211e71-ff44-4db8-b401-fc031e0edd43-metrics-certs\") pod \"router-default-68f85f86cb-pvhn6\" (UID: \"e8211e71-ff44-4db8-b401-fc031e0edd43\") " pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:13:48.310676 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:48.310648 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e8211e71-ff44-4db8-b401-fc031e0edd43-service-ca-bundle podName:e8211e71-ff44-4db8-b401-fc031e0edd43 nodeName:}" failed. No retries permitted until 2026-04-16 13:13:50.310626053 +0000 UTC m=+135.192982138 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e8211e71-ff44-4db8-b401-fc031e0edd43-service-ca-bundle") pod "router-default-68f85f86cb-pvhn6" (UID: "e8211e71-ff44-4db8-b401-fc031e0edd43") : configmap references non-existent config key: service-ca.crt Apr 16 13:13:48.310676 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:48.310670 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 13:13:48.310853 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:48.310721 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8211e71-ff44-4db8-b401-fc031e0edd43-metrics-certs podName:e8211e71-ff44-4db8-b401-fc031e0edd43 nodeName:}" failed. No retries permitted until 2026-04-16 13:13:50.310707606 +0000 UTC m=+135.193063674 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8211e71-ff44-4db8-b401-fc031e0edd43-metrics-certs") pod "router-default-68f85f86cb-pvhn6" (UID: "e8211e71-ff44-4db8-b401-fc031e0edd43") : secret "router-metrics-certs-default" not found Apr 16 13:13:48.411359 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:48.411286 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/55e8eb8e-280c-46c9-bcfc-5d796a915163-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-7qknj\" (UID: \"55e8eb8e-280c-46c9-bcfc-5d796a915163\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7qknj" Apr 16 13:13:48.411485 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:48.411406 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/21154974-4853-4774-bc79-f3089c1c3161-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-dqx46\" (UID: \"21154974-4853-4774-bc79-f3089c1c3161\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-dqx46" Apr 16 13:13:48.411485 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:48.411414 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 13:13:48.411485 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:48.411462 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55e8eb8e-280c-46c9-bcfc-5d796a915163-cluster-monitoring-operator-tls podName:55e8eb8e-280c-46c9-bcfc-5d796a915163 nodeName:}" failed. No retries permitted until 2026-04-16 13:13:50.411448022 +0000 UTC m=+135.293804089 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/55e8eb8e-280c-46c9-bcfc-5d796a915163-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-7qknj" (UID: "55e8eb8e-280c-46c9-bcfc-5d796a915163") : secret "cluster-monitoring-operator-tls" not found Apr 16 13:13:48.411598 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:48.411522 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 13:13:48.411598 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:48.411574 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21154974-4853-4774-bc79-f3089c1c3161-samples-operator-tls podName:21154974-4853-4774-bc79-f3089c1c3161 nodeName:}" failed. No retries permitted until 2026-04-16 13:13:50.41155877 +0000 UTC m=+135.293914860 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/21154974-4853-4774-bc79-f3089c1c3161-samples-operator-tls") pod "cluster-samples-operator-667775844f-dqx46" (UID: "21154974-4853-4774-bc79-f3089c1c3161") : secret "samples-operator-tls" not found Apr 16 13:13:50.159486 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:50.159446 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-xdltb" event={"ID":"4634cc1d-d3ff-44bd-9edd-28902d1bbd65","Type":"ContainerStarted","Data":"f64d233a82abbc5d82ada7bfef645ba8acec8e79cc00f72c9b8c5e0cef707df4"} Apr 16 13:13:50.175100 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:50.175057 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-xdltb" podStartSLOduration=2.025196808 podStartE2EDuration="4.175044068s" podCreationTimestamp="2026-04-16 13:13:46 +0000 UTC" firstStartedPulling="2026-04-16 13:13:47.185628689 +0000 UTC m=+132.067984760" lastFinishedPulling="2026-04-16 13:13:49.335475936 +0000 UTC m=+134.217832020" observedRunningTime="2026-04-16 13:13:50.173777986 +0000 UTC m=+135.056134080" watchObservedRunningTime="2026-04-16 13:13:50.175044068 +0000 UTC m=+135.057400158" Apr 16 13:13:50.326234 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:50.326208 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8211e71-ff44-4db8-b401-fc031e0edd43-metrics-certs\") pod \"router-default-68f85f86cb-pvhn6\" (UID: \"e8211e71-ff44-4db8-b401-fc031e0edd43\") " pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:13:50.326382 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:50.326305 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8211e71-ff44-4db8-b401-fc031e0edd43-service-ca-bundle\") pod \"router-default-68f85f86cb-pvhn6\" (UID: \"e8211e71-ff44-4db8-b401-fc031e0edd43\") " pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:13:50.326382 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:50.326363 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 13:13:50.326473 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:50.326409 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e8211e71-ff44-4db8-b401-fc031e0edd43-service-ca-bundle 
podName:e8211e71-ff44-4db8-b401-fc031e0edd43 nodeName:}" failed. No retries permitted until 2026-04-16 13:13:54.326393222 +0000 UTC m=+139.208749292 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e8211e71-ff44-4db8-b401-fc031e0edd43-service-ca-bundle") pod "router-default-68f85f86cb-pvhn6" (UID: "e8211e71-ff44-4db8-b401-fc031e0edd43") : configmap references non-existent config key: service-ca.crt Apr 16 13:13:50.326473 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:50.326424 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8211e71-ff44-4db8-b401-fc031e0edd43-metrics-certs podName:e8211e71-ff44-4db8-b401-fc031e0edd43 nodeName:}" failed. No retries permitted until 2026-04-16 13:13:54.326418132 +0000 UTC m=+139.208774200 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8211e71-ff44-4db8-b401-fc031e0edd43-metrics-certs") pod "router-default-68f85f86cb-pvhn6" (UID: "e8211e71-ff44-4db8-b401-fc031e0edd43") : secret "router-metrics-certs-default" not found Apr 16 13:13:50.427291 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:50.427228 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/55e8eb8e-280c-46c9-bcfc-5d796a915163-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-7qknj\" (UID: \"55e8eb8e-280c-46c9-bcfc-5d796a915163\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7qknj" Apr 16 13:13:50.427403 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:50.427304 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/21154974-4853-4774-bc79-f3089c1c3161-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-dqx46\" (UID: \"21154974-4853-4774-bc79-f3089c1c3161\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-dqx46" Apr 16 13:13:50.427403 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:50.427313 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 13:13:50.427403 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:50.427362 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55e8eb8e-280c-46c9-bcfc-5d796a915163-cluster-monitoring-operator-tls podName:55e8eb8e-280c-46c9-bcfc-5d796a915163 nodeName:}" failed. No retries permitted until 2026-04-16 13:13:54.427347712 +0000 UTC m=+139.309703780 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/55e8eb8e-280c-46c9-bcfc-5d796a915163-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-7qknj" (UID: "55e8eb8e-280c-46c9-bcfc-5d796a915163") : secret "cluster-monitoring-operator-tls" not found Apr 16 13:13:50.427403 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:50.427393 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 13:13:50.427546 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:50.427453 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21154974-4853-4774-bc79-f3089c1c3161-samples-operator-tls podName:21154974-4853-4774-bc79-f3089c1c3161 nodeName:}" failed. No retries permitted until 2026-04-16 13:13:54.427440827 +0000 UTC m=+139.309796900 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/21154974-4853-4774-bc79-f3089c1c3161-samples-operator-tls") pod "cluster-samples-operator-667775844f-dqx46" (UID: "21154974-4853-4774-bc79-f3089c1c3161") : secret "samples-operator-tls" not found Apr 16 13:13:52.829031 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:52.829001 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-sprnh_446804f0-271f-479a-89e6-b8b25ec2e701/dns-node-resolver/0.log" Apr 16 13:13:53.629564 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:53.629541 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-njx5f_03556f6f-6803-432c-8012-c40eb6e388ad/node-ca/0.log" Apr 16 13:13:54.263955 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:54.263916 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-4tx2q"] Apr 16 13:13:54.266065 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:54.266047 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-4tx2q" Apr 16 13:13:54.268466 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:54.268439 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 16 13:13:54.268466 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:54.268448 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-pzdhs\"" Apr 16 13:13:54.269502 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:54.269485 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 16 13:13:54.269587 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:54.269542 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 16 13:13:54.269587 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:54.269542 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 16 13:13:54.274220 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:54.274193 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-4tx2q"] Apr 16 13:13:54.356108 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:54.356080 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8211e71-ff44-4db8-b401-fc031e0edd43-service-ca-bundle\") pod \"router-default-68f85f86cb-pvhn6\" (UID: \"e8211e71-ff44-4db8-b401-fc031e0edd43\") " pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:13:54.356240 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:54.356126 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8mvj\" (UniqueName: \"kubernetes.io/projected/3033bdee-eba8-4cd0-acc5-ac04892d7471-kube-api-access-b8mvj\") pod \"service-ca-bfc587fb7-4tx2q\" (UID: \"3033bdee-eba8-4cd0-acc5-ac04892d7471\") " pod="openshift-service-ca/service-ca-bfc587fb7-4tx2q" Apr 16 13:13:54.356240 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:54.356162 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8211e71-ff44-4db8-b401-fc031e0edd43-metrics-certs\") pod \"router-default-68f85f86cb-pvhn6\" (UID: \"e8211e71-ff44-4db8-b401-fc031e0edd43\") " pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:13:54.356240 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:54.356219 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3033bdee-eba8-4cd0-acc5-ac04892d7471-signing-key\") pod \"service-ca-bfc587fb7-4tx2q\" (UID: \"3033bdee-eba8-4cd0-acc5-ac04892d7471\") " pod="openshift-service-ca/service-ca-bfc587fb7-4tx2q" Apr 16 13:13:54.356361 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:54.356245 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e8211e71-ff44-4db8-b401-fc031e0edd43-service-ca-bundle podName:e8211e71-ff44-4db8-b401-fc031e0edd43 nodeName:}" failed. No retries permitted until 2026-04-16 13:14:02.356227544 +0000 UTC m=+147.238583612 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e8211e71-ff44-4db8-b401-fc031e0edd43-service-ca-bundle") pod "router-default-68f85f86cb-pvhn6" (UID: "e8211e71-ff44-4db8-b401-fc031e0edd43") : configmap references non-existent config key: service-ca.crt Apr 16 13:13:54.356361 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:54.356325 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 13:13:54.356450 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:54.356390 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8211e71-ff44-4db8-b401-fc031e0edd43-metrics-certs podName:e8211e71-ff44-4db8-b401-fc031e0edd43 nodeName:}" failed. No retries permitted until 2026-04-16 13:14:02.356374492 +0000 UTC m=+147.238730560 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8211e71-ff44-4db8-b401-fc031e0edd43-metrics-certs") pod "router-default-68f85f86cb-pvhn6" (UID: "e8211e71-ff44-4db8-b401-fc031e0edd43") : secret "router-metrics-certs-default" not found Apr 16 13:13:54.356450 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:54.356410 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3033bdee-eba8-4cd0-acc5-ac04892d7471-signing-cabundle\") pod \"service-ca-bfc587fb7-4tx2q\" (UID: \"3033bdee-eba8-4cd0-acc5-ac04892d7471\") " pod="openshift-service-ca/service-ca-bfc587fb7-4tx2q" Apr 16 13:13:54.457419 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:54.457393 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b8mvj\" (UniqueName: \"kubernetes.io/projected/3033bdee-eba8-4cd0-acc5-ac04892d7471-kube-api-access-b8mvj\") pod \"service-ca-bfc587fb7-4tx2q\" (UID: \"3033bdee-eba8-4cd0-acc5-ac04892d7471\") " pod="openshift-service-ca/service-ca-bfc587fb7-4tx2q" Apr 16 13:13:54.457542 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:54.457433 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/21154974-4853-4774-bc79-f3089c1c3161-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-dqx46\" (UID: \"21154974-4853-4774-bc79-f3089c1c3161\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-dqx46" Apr 16 13:13:54.457542 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:54.457467 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3033bdee-eba8-4cd0-acc5-ac04892d7471-signing-key\") pod \"service-ca-bfc587fb7-4tx2q\" (UID: \"3033bdee-eba8-4cd0-acc5-ac04892d7471\") " pod="openshift-service-ca/service-ca-bfc587fb7-4tx2q" Apr 16 13:13:54.457542 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:54.457484 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3033bdee-eba8-4cd0-acc5-ac04892d7471-signing-cabundle\") pod \"service-ca-bfc587fb7-4tx2q\" (UID: \"3033bdee-eba8-4cd0-acc5-ac04892d7471\") " pod="openshift-service-ca/service-ca-bfc587fb7-4tx2q" Apr 16 13:13:54.457542 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:54.457511 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/55e8eb8e-280c-46c9-bcfc-5d796a915163-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-7qknj\" (UID: \"55e8eb8e-280c-46c9-bcfc-5d796a915163\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7qknj" Apr 16 13:13:54.457808 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:54.457551 2571 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 16 13:13:54.457808 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:54.457612 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/21154974-4853-4774-bc79-f3089c1c3161-samples-operator-tls podName:21154974-4853-4774-bc79-f3089c1c3161 nodeName:}" failed. No retries permitted until 2026-04-16 13:14:02.457595452 +0000 UTC m=+147.339951520 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/21154974-4853-4774-bc79-f3089c1c3161-samples-operator-tls") pod "cluster-samples-operator-667775844f-dqx46" (UID: "21154974-4853-4774-bc79-f3089c1c3161") : secret "samples-operator-tls" not found Apr 16 13:13:54.457808 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:54.457627 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 13:13:54.457808 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:13:54.457675 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55e8eb8e-280c-46c9-bcfc-5d796a915163-cluster-monitoring-operator-tls podName:55e8eb8e-280c-46c9-bcfc-5d796a915163 nodeName:}" failed. No retries permitted until 2026-04-16 13:14:02.457661526 +0000 UTC m=+147.340017593 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/55e8eb8e-280c-46c9-bcfc-5d796a915163-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-7qknj" (UID: "55e8eb8e-280c-46c9-bcfc-5d796a915163") : secret "cluster-monitoring-operator-tls" not found Apr 16 13:13:54.458205 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:54.458187 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/3033bdee-eba8-4cd0-acc5-ac04892d7471-signing-cabundle\") pod \"service-ca-bfc587fb7-4tx2q\" (UID: \"3033bdee-eba8-4cd0-acc5-ac04892d7471\") " pod="openshift-service-ca/service-ca-bfc587fb7-4tx2q" Apr 16 13:13:54.459949 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:54.459932 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/3033bdee-eba8-4cd0-acc5-ac04892d7471-signing-key\") pod \"service-ca-bfc587fb7-4tx2q\" (UID: \"3033bdee-eba8-4cd0-acc5-ac04892d7471\") " pod="openshift-service-ca/service-ca-bfc587fb7-4tx2q" Apr 16 13:13:54.468665 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:54.468639 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8mvj\" (UniqueName: \"kubernetes.io/projected/3033bdee-eba8-4cd0-acc5-ac04892d7471-kube-api-access-b8mvj\") pod \"service-ca-bfc587fb7-4tx2q\" (UID: \"3033bdee-eba8-4cd0-acc5-ac04892d7471\") " pod="openshift-service-ca/service-ca-bfc587fb7-4tx2q" Apr 16 13:13:54.574948 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:54.574895 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-4tx2q" Apr 16 13:13:54.687422 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:54.687394 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-4tx2q"] Apr 16 13:13:54.690593 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:13:54.690560 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3033bdee_eba8_4cd0_acc5_ac04892d7471.slice/crio-3cc078ee78566793a9735efeb04eebe7af8fec0fc7a1d5a0f25cc12b4676e137 WatchSource:0}: Error finding container 3cc078ee78566793a9735efeb04eebe7af8fec0fc7a1d5a0f25cc12b4676e137: Status 404 returned error can't find the container with id 3cc078ee78566793a9735efeb04eebe7af8fec0fc7a1d5a0f25cc12b4676e137 Apr 16 13:13:55.169820 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:55.169785 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-4tx2q" event={"ID":"3033bdee-eba8-4cd0-acc5-ac04892d7471","Type":"ContainerStarted","Data":"3cc078ee78566793a9735efeb04eebe7af8fec0fc7a1d5a0f25cc12b4676e137"} Apr 16 13:13:57.175071 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:57.175003 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-4tx2q" event={"ID":"3033bdee-eba8-4cd0-acc5-ac04892d7471","Type":"ContainerStarted","Data":"ae8a8ed54147c86b3d0563b0d619a7be8748662306eb5beb4687aec5626236b8"} Apr 16 13:13:57.198415 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:13:57.198366 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-4tx2q" podStartSLOduration=1.034269756 podStartE2EDuration="3.198352801s" podCreationTimestamp="2026-04-16 13:13:54 +0000 UTC" 
firstStartedPulling="2026-04-16 13:13:54.692423495 +0000 UTC m=+139.574779563" lastFinishedPulling="2026-04-16 13:13:56.856506537 +0000 UTC m=+141.738862608" observedRunningTime="2026-04-16 13:13:57.19665137 +0000 UTC m=+142.079007460" watchObservedRunningTime="2026-04-16 13:13:57.198352801 +0000 UTC m=+142.080708891" Apr 16 13:14:02.422146 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:02.422115 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8211e71-ff44-4db8-b401-fc031e0edd43-service-ca-bundle\") pod \"router-default-68f85f86cb-pvhn6\" (UID: \"e8211e71-ff44-4db8-b401-fc031e0edd43\") " pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:14:02.422557 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:02.422164 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8211e71-ff44-4db8-b401-fc031e0edd43-metrics-certs\") pod \"router-default-68f85f86cb-pvhn6\" (UID: \"e8211e71-ff44-4db8-b401-fc031e0edd43\") " pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:14:02.422557 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:14:02.422281 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e8211e71-ff44-4db8-b401-fc031e0edd43-service-ca-bundle podName:e8211e71-ff44-4db8-b401-fc031e0edd43 nodeName:}" failed. No retries permitted until 2026-04-16 13:14:18.422261876 +0000 UTC m=+163.304617953 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/e8211e71-ff44-4db8-b401-fc031e0edd43-service-ca-bundle") pod "router-default-68f85f86cb-pvhn6" (UID: "e8211e71-ff44-4db8-b401-fc031e0edd43") : configmap references non-existent config key: service-ca.crt Apr 16 13:14:02.422557 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:14:02.422303 2571 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 13:14:02.422557 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:14:02.422337 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8211e71-ff44-4db8-b401-fc031e0edd43-metrics-certs podName:e8211e71-ff44-4db8-b401-fc031e0edd43 nodeName:}" failed. No retries permitted until 2026-04-16 13:14:18.422327029 +0000 UTC m=+163.304683112 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e8211e71-ff44-4db8-b401-fc031e0edd43-metrics-certs") pod "router-default-68f85f86cb-pvhn6" (UID: "e8211e71-ff44-4db8-b401-fc031e0edd43") : secret "router-metrics-certs-default" not found Apr 16 13:14:02.522923 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:02.522881 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/55e8eb8e-280c-46c9-bcfc-5d796a915163-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-7qknj\" (UID: \"55e8eb8e-280c-46c9-bcfc-5d796a915163\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7qknj" Apr 16 13:14:02.523069 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:02.523008 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/21154974-4853-4774-bc79-f3089c1c3161-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-dqx46\" (UID: \"21154974-4853-4774-bc79-f3089c1c3161\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-dqx46" Apr 16 13:14:02.523069 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:14:02.523023 2571 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 13:14:02.523154 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:14:02.523092 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/55e8eb8e-280c-46c9-bcfc-5d796a915163-cluster-monitoring-operator-tls podName:55e8eb8e-280c-46c9-bcfc-5d796a915163 nodeName:}" failed. No retries permitted until 2026-04-16 13:14:18.523072361 +0000 UTC m=+163.405428434 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/55e8eb8e-280c-46c9-bcfc-5d796a915163-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-7qknj" (UID: "55e8eb8e-280c-46c9-bcfc-5d796a915163") : secret "cluster-monitoring-operator-tls" not found Apr 16 13:14:02.525363 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:02.525340 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/21154974-4853-4774-bc79-f3089c1c3161-samples-operator-tls\") pod \"cluster-samples-operator-667775844f-dqx46\" (UID: \"21154974-4853-4774-bc79-f3089c1c3161\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-dqx46" Apr 16 13:14:02.642591 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:02.642565 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-dqx46" Apr 16 13:14:02.763930 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:02.763896 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-667775844f-dqx46"] Apr 16 13:14:03.190506 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:03.190476 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-dqx46" event={"ID":"21154974-4853-4774-bc79-f3089c1c3161","Type":"ContainerStarted","Data":"3468fd82eb0897b7d8ce0d1f5d2b0344974bc31809c2a1a528eccc8de0e58ee9"} Apr 16 13:14:05.196677 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:05.196644 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-dqx46" event={"ID":"21154974-4853-4774-bc79-f3089c1c3161","Type":"ContainerStarted","Data":"8a02ca6f4c56f6daaa0241ec8df25f6e4043cc1dc5c9b83875291a5ddf0220cc"} Apr 16 13:14:05.196677 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:05.196679 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-dqx46" event={"ID":"21154974-4853-4774-bc79-f3089c1c3161","Type":"ContainerStarted","Data":"5c34a2b6448a0a27c91c8fef4395e3e9d83e2a11a2a0672f3c80e17e11b6d6c2"} Apr 16 13:14:05.215880 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:05.215817 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-667775844f-dqx46" podStartSLOduration=17.50452734 podStartE2EDuration="19.215804159s" podCreationTimestamp="2026-04-16 13:13:46 +0000 UTC" firstStartedPulling="2026-04-16 13:14:02.803822668 +0000 UTC m=+147.686178751" lastFinishedPulling="2026-04-16 13:14:04.515099502 +0000 UTC m=+149.397455570" observedRunningTime="2026-04-16 13:14:05.214913344 +0000 UTC m=+150.097269436" watchObservedRunningTime="2026-04-16 13:14:05.215804159 +0000 UTC m=+150.098160240" Apr 16 13:14:10.971622 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:14:10.971584 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-pcp5l" podUID="ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc" Apr 16 13:14:10.986758 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:14:10.986719 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-rxwhc" podUID="e3e13a4d-74db-43e1-b7d8-cddf502adb4c" Apr 16 13:14:11.209548 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:11.209521 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-pcp5l" Apr 16 13:14:12.748574 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:14:12.748523 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-h8fnx" podUID="f57b15af-9441-4822-9c41-048d94ab4c1a" Apr 16 13:14:13.979737 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:13.979704 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-f7h7d"] Apr 16 13:14:13.981673 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:13.981659 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-f7h7d" Apr 16 13:14:13.983907 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:13.983884 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 13:14:13.984277 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:13.984258 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 13:14:13.985043 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:13.985026 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-mhnxk\"" Apr 16 13:14:13.992286 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:13.992265 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-f7h7d"] Apr 16 13:14:13.997350 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:13.997327 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-79ddbc568d-wlbcc"] Apr 16 13:14:13.999171 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:13.999157 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:14.002013 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.001996 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-sxsq4\"" Apr 16 13:14:14.002094 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.002044 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 13:14:14.002327 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.002313 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 13:14:14.003098 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.003082 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 13:14:14.007110 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.007094 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 13:14:14.013447 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.013423 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-79ddbc568d-wlbcc"] Apr 16 13:14:14.106935 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.106908 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ae4434f3-4c5d-49c8-b89e-d926620456ca-crio-socket\") pod \"insights-runtime-extractor-f7h7d\" (UID: \"ae4434f3-4c5d-49c8-b89e-d926620456ca\") " pod="openshift-insights/insights-runtime-extractor-f7h7d" Apr 16 13:14:14.107037 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.106960 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/71d038c0-6e92-41d6-bb20-ac23854174ca-bound-sa-token\") pod \"image-registry-79ddbc568d-wlbcc\" (UID: \"71d038c0-6e92-41d6-bb20-ac23854174ca\") " pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:14.107088 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.107067 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/71d038c0-6e92-41d6-bb20-ac23854174ca-installation-pull-secrets\") pod \"image-registry-79ddbc568d-wlbcc\" (UID: \"71d038c0-6e92-41d6-bb20-ac23854174ca\") " pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:14.107130 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.107095 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ae4434f3-4c5d-49c8-b89e-d926620456ca-data-volume\") pod \"insights-runtime-extractor-f7h7d\" (UID: \"ae4434f3-4c5d-49c8-b89e-d926620456ca\") " pod="openshift-insights/insights-runtime-extractor-f7h7d" Apr 16 13:14:14.107130 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.107117 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ae4434f3-4c5d-49c8-b89e-d926620456ca-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-f7h7d\" (UID: 
\"ae4434f3-4c5d-49c8-b89e-d926620456ca\") " pod="openshift-insights/insights-runtime-extractor-f7h7d" Apr 16 13:14:14.107210 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.107138 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/71d038c0-6e92-41d6-bb20-ac23854174ca-registry-tls\") pod \"image-registry-79ddbc568d-wlbcc\" (UID: \"71d038c0-6e92-41d6-bb20-ac23854174ca\") " pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:14.107248 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.107205 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvmrq\" (UniqueName: \"kubernetes.io/projected/71d038c0-6e92-41d6-bb20-ac23854174ca-kube-api-access-qvmrq\") pod \"image-registry-79ddbc568d-wlbcc\" (UID: \"71d038c0-6e92-41d6-bb20-ac23854174ca\") " pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:14.107248 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.107234 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr2bj\" (UniqueName: \"kubernetes.io/projected/ae4434f3-4c5d-49c8-b89e-d926620456ca-kube-api-access-sr2bj\") pod \"insights-runtime-extractor-f7h7d\" (UID: \"ae4434f3-4c5d-49c8-b89e-d926620456ca\") " pod="openshift-insights/insights-runtime-extractor-f7h7d" Apr 16 13:14:14.107327 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.107250 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ae4434f3-4c5d-49c8-b89e-d926620456ca-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-f7h7d\" (UID: \"ae4434f3-4c5d-49c8-b89e-d926620456ca\") " pod="openshift-insights/insights-runtime-extractor-f7h7d" Apr 16 13:14:14.107327 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.107266 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/71d038c0-6e92-41d6-bb20-ac23854174ca-ca-trust-extracted\") pod \"image-registry-79ddbc568d-wlbcc\" (UID: \"71d038c0-6e92-41d6-bb20-ac23854174ca\") " pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:14.107327 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.107283 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/71d038c0-6e92-41d6-bb20-ac23854174ca-registry-certificates\") pod \"image-registry-79ddbc568d-wlbcc\" (UID: \"71d038c0-6e92-41d6-bb20-ac23854174ca\") " pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:14.107327 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.107325 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/71d038c0-6e92-41d6-bb20-ac23854174ca-trusted-ca\") pod \"image-registry-79ddbc568d-wlbcc\" (UID: \"71d038c0-6e92-41d6-bb20-ac23854174ca\") " pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:14.107457 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.107358 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: 
\"kubernetes.io/secret/71d038c0-6e92-41d6-bb20-ac23854174ca-image-registry-private-configuration\") pod \"image-registry-79ddbc568d-wlbcc\" (UID: \"71d038c0-6e92-41d6-bb20-ac23854174ca\") " pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:14.207770 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.207746 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/71d038c0-6e92-41d6-bb20-ac23854174ca-installation-pull-secrets\") pod \"image-registry-79ddbc568d-wlbcc\" (UID: \"71d038c0-6e92-41d6-bb20-ac23854174ca\") " pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:14.207894 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.207774 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ae4434f3-4c5d-49c8-b89e-d926620456ca-data-volume\") pod \"insights-runtime-extractor-f7h7d\" (UID: \"ae4434f3-4c5d-49c8-b89e-d926620456ca\") " pod="openshift-insights/insights-runtime-extractor-f7h7d" Apr 16 13:14:14.207894 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.207792 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ae4434f3-4c5d-49c8-b89e-d926620456ca-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-f7h7d\" (UID: \"ae4434f3-4c5d-49c8-b89e-d926620456ca\") " pod="openshift-insights/insights-runtime-extractor-f7h7d" Apr 16 13:14:14.207894 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.207809 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/71d038c0-6e92-41d6-bb20-ac23854174ca-registry-tls\") pod \"image-registry-79ddbc568d-wlbcc\" (UID: \"71d038c0-6e92-41d6-bb20-ac23854174ca\") " pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:14.207894 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.207838 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvmrq\" (UniqueName: \"kubernetes.io/projected/71d038c0-6e92-41d6-bb20-ac23854174ca-kube-api-access-qvmrq\") pod \"image-registry-79ddbc568d-wlbcc\" (UID: \"71d038c0-6e92-41d6-bb20-ac23854174ca\") " pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:14.207894 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.207891 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sr2bj\" (UniqueName: \"kubernetes.io/projected/ae4434f3-4c5d-49c8-b89e-d926620456ca-kube-api-access-sr2bj\") pod \"insights-runtime-extractor-f7h7d\" (UID: \"ae4434f3-4c5d-49c8-b89e-d926620456ca\") " pod="openshift-insights/insights-runtime-extractor-f7h7d" Apr 16 13:14:14.208144 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.208049 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ae4434f3-4c5d-49c8-b89e-d926620456ca-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-f7h7d\" (UID: \"ae4434f3-4c5d-49c8-b89e-d926620456ca\") " pod="openshift-insights/insights-runtime-extractor-f7h7d" Apr 16 13:14:14.208144 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.208088 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/71d038c0-6e92-41d6-bb20-ac23854174ca-ca-trust-extracted\") pod \"image-registry-79ddbc568d-wlbcc\" (UID: \"71d038c0-6e92-41d6-bb20-ac23854174ca\") " pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:14.208144 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.208117 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/71d038c0-6e92-41d6-bb20-ac23854174ca-registry-certificates\") pod \"image-registry-79ddbc568d-wlbcc\" (UID: \"71d038c0-6e92-41d6-bb20-ac23854174ca\") " pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:14.208283 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.208164 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/71d038c0-6e92-41d6-bb20-ac23854174ca-trusted-ca\") pod \"image-registry-79ddbc568d-wlbcc\" (UID: \"71d038c0-6e92-41d6-bb20-ac23854174ca\") " pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:14.208283 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.208186 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ae4434f3-4c5d-49c8-b89e-d926620456ca-data-volume\") pod \"insights-runtime-extractor-f7h7d\" (UID: \"ae4434f3-4c5d-49c8-b89e-d926620456ca\") " pod="openshift-insights/insights-runtime-extractor-f7h7d" Apr 16 13:14:14.208498 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.208455 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/71d038c0-6e92-41d6-bb20-ac23854174ca-image-registry-private-configuration\") pod \"image-registry-79ddbc568d-wlbcc\" (UID: \"71d038c0-6e92-41d6-bb20-ac23854174ca\") " pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:14.208565 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.208505 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ae4434f3-4c5d-49c8-b89e-d926620456ca-crio-socket\") pod \"insights-runtime-extractor-f7h7d\" (UID: \"ae4434f3-4c5d-49c8-b89e-d926620456ca\") " pod="openshift-insights/insights-runtime-extractor-f7h7d" Apr 16 13:14:14.208565 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.208548 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/71d038c0-6e92-41d6-bb20-ac23854174ca-bound-sa-token\") pod \"image-registry-79ddbc568d-wlbcc\" (UID: \"71d038c0-6e92-41d6-bb20-ac23854174ca\") " pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:14.208565 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.208555 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/71d038c0-6e92-41d6-bb20-ac23854174ca-ca-trust-extracted\") pod \"image-registry-79ddbc568d-wlbcc\" (UID: \"71d038c0-6e92-41d6-bb20-ac23854174ca\") " pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:14.208884 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.208831 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ae4434f3-4c5d-49c8-b89e-d926620456ca-crio-socket\") pod 
\"insights-runtime-extractor-f7h7d\" (UID: \"ae4434f3-4c5d-49c8-b89e-d926620456ca\") " pod="openshift-insights/insights-runtime-extractor-f7h7d" Apr 16 13:14:14.208884 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.208852 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ae4434f3-4c5d-49c8-b89e-d926620456ca-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-f7h7d\" (UID: \"ae4434f3-4c5d-49c8-b89e-d926620456ca\") " pod="openshift-insights/insights-runtime-extractor-f7h7d" Apr 16 13:14:14.209358 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.209333 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/71d038c0-6e92-41d6-bb20-ac23854174ca-registry-certificates\") pod \"image-registry-79ddbc568d-wlbcc\" (UID: \"71d038c0-6e92-41d6-bb20-ac23854174ca\") " pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:14.209586 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.209565 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/71d038c0-6e92-41d6-bb20-ac23854174ca-trusted-ca\") pod \"image-registry-79ddbc568d-wlbcc\" (UID: \"71d038c0-6e92-41d6-bb20-ac23854174ca\") " pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:14.210505 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.210476 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ae4434f3-4c5d-49c8-b89e-d926620456ca-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-f7h7d\" (UID: \"ae4434f3-4c5d-49c8-b89e-d926620456ca\") " pod="openshift-insights/insights-runtime-extractor-f7h7d" Apr 16 13:14:14.210600 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.210484 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/71d038c0-6e92-41d6-bb20-ac23854174ca-registry-tls\") pod \"image-registry-79ddbc568d-wlbcc\" (UID: \"71d038c0-6e92-41d6-bb20-ac23854174ca\") " pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:14.210600 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.210556 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/71d038c0-6e92-41d6-bb20-ac23854174ca-installation-pull-secrets\") pod \"image-registry-79ddbc568d-wlbcc\" (UID: \"71d038c0-6e92-41d6-bb20-ac23854174ca\") " pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:14.210889 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.210857 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/71d038c0-6e92-41d6-bb20-ac23854174ca-image-registry-private-configuration\") pod \"image-registry-79ddbc568d-wlbcc\" (UID: \"71d038c0-6e92-41d6-bb20-ac23854174ca\") " pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:14.215925 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.215903 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/71d038c0-6e92-41d6-bb20-ac23854174ca-bound-sa-token\") pod \"image-registry-79ddbc568d-wlbcc\" (UID: \"71d038c0-6e92-41d6-bb20-ac23854174ca\") " 
pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:14.216209 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.216186 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr2bj\" (UniqueName: \"kubernetes.io/projected/ae4434f3-4c5d-49c8-b89e-d926620456ca-kube-api-access-sr2bj\") pod \"insights-runtime-extractor-f7h7d\" (UID: \"ae4434f3-4c5d-49c8-b89e-d926620456ca\") " pod="openshift-insights/insights-runtime-extractor-f7h7d" Apr 16 13:14:14.216380 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.216362 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvmrq\" (UniqueName: \"kubernetes.io/projected/71d038c0-6e92-41d6-bb20-ac23854174ca-kube-api-access-qvmrq\") pod \"image-registry-79ddbc568d-wlbcc\" (UID: \"71d038c0-6e92-41d6-bb20-ac23854174ca\") " pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:14.290954 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.290908 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-f7h7d" Apr 16 13:14:14.307535 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.307507 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:14.417858 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.417834 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-f7h7d"] Apr 16 13:14:14.421600 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:14:14.421573 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae4434f3_4c5d_49c8_b89e_d926620456ca.slice/crio-e92be9d412591387a236d3b08861607e11a33ebc1d1b63c8038b8a62d4497ba5 WatchSource:0}: Error finding container e92be9d412591387a236d3b08861607e11a33ebc1d1b63c8038b8a62d4497ba5: Status 404 returned error can't find the container with id e92be9d412591387a236d3b08861607e11a33ebc1d1b63c8038b8a62d4497ba5 Apr 16 13:14:14.433546 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:14.433484 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-79ddbc568d-wlbcc"] Apr 16 13:14:14.435603 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:14:14.435582 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71d038c0_6e92_41d6_bb20_ac23854174ca.slice/crio-a1585b6cee22959d340269d347aa820691e57b4f451d4de3c42aa2a1bb3c7a3d WatchSource:0}: Error finding container a1585b6cee22959d340269d347aa820691e57b4f451d4de3c42aa2a1bb3c7a3d: Status 404 returned error can't find the container with id a1585b6cee22959d340269d347aa820691e57b4f451d4de3c42aa2a1bb3c7a3d Apr 16 13:14:15.219545 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:15.219514 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" event={"ID":"71d038c0-6e92-41d6-bb20-ac23854174ca","Type":"ContainerStarted","Data":"9b9cf75402666e0961d4cc1429b871e7a7ba8f04b736697887c8b31e1e86378a"} Apr 16 13:14:15.219931 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:15.219551 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" 
event={"ID":"71d038c0-6e92-41d6-bb20-ac23854174ca","Type":"ContainerStarted","Data":"a1585b6cee22959d340269d347aa820691e57b4f451d4de3c42aa2a1bb3c7a3d"} Apr 16 13:14:15.219931 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:15.219608 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:15.220988 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:15.220968 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-f7h7d" event={"ID":"ae4434f3-4c5d-49c8-b89e-d926620456ca","Type":"ContainerStarted","Data":"7ca39ed06aa22b6dc727fbdad79ed1d9755bfdb8c2c5e6acfade3882a9ebcc7d"} Apr 16 13:14:15.221065 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:15.220994 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-f7h7d" event={"ID":"ae4434f3-4c5d-49c8-b89e-d926620456ca","Type":"ContainerStarted","Data":"81585c93de78cebef95e3ad616d4fc43b556dd70406adc85c3943d9ca31c3856"} Apr 16 13:14:15.221065 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:15.221004 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-f7h7d" event={"ID":"ae4434f3-4c5d-49c8-b89e-d926620456ca","Type":"ContainerStarted","Data":"e92be9d412591387a236d3b08861607e11a33ebc1d1b63c8038b8a62d4497ba5"} Apr 16 13:14:15.240623 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:15.240584 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" podStartSLOduration=2.240567934 podStartE2EDuration="2.240567934s" podCreationTimestamp="2026-04-16 13:14:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:14:15.238703363 +0000 UTC m=+160.121059453" watchObservedRunningTime="2026-04-16 13:14:15.240567934 +0000 UTC m=+160.122924025" Apr 16 13:14:15.921998 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:15.921962 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-metrics-tls\") pod \"dns-default-pcp5l\" (UID: \"ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc\") " pod="openshift-dns/dns-default-pcp5l" Apr 16 13:14:15.924878 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:15.924835 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc-metrics-tls\") pod \"dns-default-pcp5l\" (UID: \"ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc\") " pod="openshift-dns/dns-default-pcp5l" Apr 16 13:14:16.012530 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:16.012503 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xkkx5\"" Apr 16 13:14:16.020591 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:16.020568 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-pcp5l" Apr 16 13:14:16.022390 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:16.022371 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-cert\") pod \"ingress-canary-rxwhc\" (UID: \"e3e13a4d-74db-43e1-b7d8-cddf502adb4c\") " pod="openshift-ingress-canary/ingress-canary-rxwhc" Apr 16 13:14:16.024926 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:16.024904 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e3e13a4d-74db-43e1-b7d8-cddf502adb4c-cert\") pod \"ingress-canary-rxwhc\" (UID: \"e3e13a4d-74db-43e1-b7d8-cddf502adb4c\") " pod="openshift-ingress-canary/ingress-canary-rxwhc" Apr 16 13:14:16.146712 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:16.146677 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pcp5l"] Apr 16 13:14:16.529717 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:14:16.529682 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded02d9f0_3e2f_4d74_bdd4_e407126d8bbc.slice/crio-cd2785b62e22a5b4c49183c6a3100a5ca5d3bf17713d90a864cb2f16371cc482 WatchSource:0}: Error finding container cd2785b62e22a5b4c49183c6a3100a5ca5d3bf17713d90a864cb2f16371cc482: Status 404 returned error can't find the container with id cd2785b62e22a5b4c49183c6a3100a5ca5d3bf17713d90a864cb2f16371cc482 Apr 16 13:14:17.227798 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:17.227760 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pcp5l" event={"ID":"ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc","Type":"ContainerStarted","Data":"cd2785b62e22a5b4c49183c6a3100a5ca5d3bf17713d90a864cb2f16371cc482"} Apr 16 13:14:17.229853 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:17.229820 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-f7h7d" event={"ID":"ae4434f3-4c5d-49c8-b89e-d926620456ca","Type":"ContainerStarted","Data":"c0909012bbf6bbcdf6b8f8ceeee69174829f4a5107ecb1bd8a3fc874427da265"} Apr 16 13:14:17.250037 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:17.249994 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-f7h7d" podStartSLOduration=2.173306092 podStartE2EDuration="4.249979615s" podCreationTimestamp="2026-04-16 13:14:13 +0000 UTC" firstStartedPulling="2026-04-16 13:14:14.477589324 +0000 UTC m=+159.359945399" lastFinishedPulling="2026-04-16 13:14:16.554262837 +0000 UTC m=+161.436618922" observedRunningTime="2026-04-16 13:14:17.248951896 +0000 UTC m=+162.131308005" watchObservedRunningTime="2026-04-16 13:14:17.249979615 +0000 UTC m=+162.132335711" Apr 16 13:14:18.235172 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:18.235137 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pcp5l" event={"ID":"ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc","Type":"ContainerStarted","Data":"333fe0ed9a1abfca972882dc20b61e65121557a168977d3879d70ada78c0221e"} Apr 16 13:14:18.235172 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:18.235174 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pcp5l" event={"ID":"ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc","Type":"ContainerStarted","Data":"3e68d8d6a3d35052672d80f62c3a8ffff1f952814af5f9ee686f016ee38044d0"} Apr 16 
13:14:18.235578 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:18.235278 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-pcp5l" Apr 16 13:14:18.251084 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:18.251032 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-pcp5l" podStartSLOduration=130.02416549 podStartE2EDuration="2m11.251021123s" podCreationTimestamp="2026-04-16 13:12:07 +0000 UTC" firstStartedPulling="2026-04-16 13:14:16.531626419 +0000 UTC m=+161.413982487" lastFinishedPulling="2026-04-16 13:14:17.758482051 +0000 UTC m=+162.640838120" observedRunningTime="2026-04-16 13:14:18.2510209 +0000 UTC m=+163.133376990" watchObservedRunningTime="2026-04-16 13:14:18.251021123 +0000 UTC m=+163.133377209" Apr 16 13:14:18.454791 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:18.454766 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8211e71-ff44-4db8-b401-fc031e0edd43-metrics-certs\") pod \"router-default-68f85f86cb-pvhn6\" (UID: \"e8211e71-ff44-4db8-b401-fc031e0edd43\") " pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:14:18.454931 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:18.454807 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8211e71-ff44-4db8-b401-fc031e0edd43-service-ca-bundle\") pod \"router-default-68f85f86cb-pvhn6\" (UID: \"e8211e71-ff44-4db8-b401-fc031e0edd43\") " pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:14:18.455323 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:18.455298 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8211e71-ff44-4db8-b401-fc031e0edd43-service-ca-bundle\") pod \"router-default-68f85f86cb-pvhn6\" (UID: \"e8211e71-ff44-4db8-b401-fc031e0edd43\") " pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:14:18.456981 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:18.456958 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e8211e71-ff44-4db8-b401-fc031e0edd43-metrics-certs\") pod \"router-default-68f85f86cb-pvhn6\" (UID: \"e8211e71-ff44-4db8-b401-fc031e0edd43\") " pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:14:18.555243 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:18.555193 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/55e8eb8e-280c-46c9-bcfc-5d796a915163-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-7qknj\" (UID: \"55e8eb8e-280c-46c9-bcfc-5d796a915163\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7qknj" Apr 16 13:14:18.557435 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:18.557413 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/55e8eb8e-280c-46c9-bcfc-5d796a915163-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-7qknj\" (UID: \"55e8eb8e-280c-46c9-bcfc-5d796a915163\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7qknj" Apr 16 13:14:18.735105 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:18.735083 2571 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:14:18.834212 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:18.834140 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7qknj" Apr 16 13:14:18.847925 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:18.847902 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-68f85f86cb-pvhn6"] Apr 16 13:14:18.850805 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:14:18.850781 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8211e71_ff44_4db8_b401_fc031e0edd43.slice/crio-4d8fd1a3fa1836cb60ad185387c3a96a6032168eae5fa7d867f419c7052becda WatchSource:0}: Error finding container 4d8fd1a3fa1836cb60ad185387c3a96a6032168eae5fa7d867f419c7052becda: Status 404 returned error can't find the container with id 4d8fd1a3fa1836cb60ad185387c3a96a6032168eae5fa7d867f419c7052becda Apr 16 13:14:18.950129 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:18.950078 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-7qknj"] Apr 16 13:14:18.953378 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:14:18.953351 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55e8eb8e_280c_46c9_bcfc_5d796a915163.slice/crio-cfaa79d43a2c9b455d23c58e075eac8db50cffbf97e738c8a4c18fd91204b5da WatchSource:0}: Error finding container cfaa79d43a2c9b455d23c58e075eac8db50cffbf97e738c8a4c18fd91204b5da: Status 404 returned error can't find the container with id cfaa79d43a2c9b455d23c58e075eac8db50cffbf97e738c8a4c18fd91204b5da Apr 16 13:14:19.237934 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:19.237901 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7qknj" event={"ID":"55e8eb8e-280c-46c9-bcfc-5d796a915163","Type":"ContainerStarted","Data":"cfaa79d43a2c9b455d23c58e075eac8db50cffbf97e738c8a4c18fd91204b5da"} Apr 16 13:14:19.239134 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:19.239110 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-68f85f86cb-pvhn6" event={"ID":"e8211e71-ff44-4db8-b401-fc031e0edd43","Type":"ContainerStarted","Data":"bfb9551f07d3fc7bb9092f580f10b61d3f2ca0246f627fa515a7d78dcf7bcfa9"} Apr 16 13:14:19.239241 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:19.239141 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-68f85f86cb-pvhn6" event={"ID":"e8211e71-ff44-4db8-b401-fc031e0edd43","Type":"ContainerStarted","Data":"4d8fd1a3fa1836cb60ad185387c3a96a6032168eae5fa7d867f419c7052becda"} Apr 16 13:14:19.257491 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:19.257445 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-68f85f86cb-pvhn6" podStartSLOduration=33.257430041 podStartE2EDuration="33.257430041s" podCreationTimestamp="2026-04-16 13:13:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:14:19.25623443 +0000 UTC m=+164.138590521" watchObservedRunningTime="2026-04-16 13:14:19.257430041 +0000 UTC m=+164.139786131" Apr 16 13:14:19.736076 
ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:19.736047 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:14:19.738561 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:19.738538 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:14:20.241900 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:20.241862 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:14:20.243259 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:20.243238 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-68f85f86cb-pvhn6" Apr 16 13:14:21.251189 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:21.251152 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7qknj" event={"ID":"55e8eb8e-280c-46c9-bcfc-5d796a915163","Type":"ContainerStarted","Data":"e67f4b8cbaee7baaa94a8e7abfc44f56abd82b6a7f979b48aca8cbbb6eaf7e2a"} Apr 16 13:14:21.266079 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:21.266032 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7qknj" podStartSLOduration=33.614879826 podStartE2EDuration="35.266019435s" podCreationTimestamp="2026-04-16 13:13:46 +0000 UTC" firstStartedPulling="2026-04-16 13:14:18.95536618 +0000 UTC m=+163.837722261" lastFinishedPulling="2026-04-16 13:14:20.606505791 +0000 UTC m=+165.488861870" observedRunningTime="2026-04-16 13:14:21.265061665 +0000 UTC m=+166.147417756" watchObservedRunningTime="2026-04-16 13:14:21.266019435 +0000 UTC m=+166.148375524" Apr 16 13:14:25.140189 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:25.140152 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-29686"] Apr 16 13:14:25.143784 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:25.143767 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-29686" Apr 16 13:14:25.146440 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:25.146418 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 13:14:25.146551 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:25.146424 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 13:14:25.147574 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:25.147554 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-dth2s\"" Apr 16 13:14:25.147653 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:25.147612 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 13:14:25.149314 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:25.149286 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-29686"] Apr 16 13:14:25.197098 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:25.197072 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cncsk\" (UniqueName: \"kubernetes.io/projected/a6cd6279-f079-4e59-a738-c74e05d8556d-kube-api-access-cncsk\") pod \"prometheus-operator-78f957474d-29686\" (UID: \"a6cd6279-f079-4e59-a738-c74e05d8556d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-29686" Apr 16 13:14:25.197206 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:25.197129 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a6cd6279-f079-4e59-a738-c74e05d8556d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-29686\" (UID: \"a6cd6279-f079-4e59-a738-c74e05d8556d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-29686" Apr 16 13:14:25.197206 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:25.197191 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6cd6279-f079-4e59-a738-c74e05d8556d-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-29686\" (UID: \"a6cd6279-f079-4e59-a738-c74e05d8556d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-29686" Apr 16 13:14:25.197316 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:25.197254 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a6cd6279-f079-4e59-a738-c74e05d8556d-metrics-client-ca\") pod \"prometheus-operator-78f957474d-29686\" (UID: \"a6cd6279-f079-4e59-a738-c74e05d8556d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-29686" Apr 16 13:14:25.297817 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:25.297793 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a6cd6279-f079-4e59-a738-c74e05d8556d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-29686\" (UID: \"a6cd6279-f079-4e59-a738-c74e05d8556d\") " 
pod="openshift-monitoring/prometheus-operator-78f957474d-29686" Apr 16 13:14:25.297936 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:25.297824 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6cd6279-f079-4e59-a738-c74e05d8556d-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-29686\" (UID: \"a6cd6279-f079-4e59-a738-c74e05d8556d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-29686" Apr 16 13:14:25.297936 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:25.297851 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a6cd6279-f079-4e59-a738-c74e05d8556d-metrics-client-ca\") pod \"prometheus-operator-78f957474d-29686\" (UID: \"a6cd6279-f079-4e59-a738-c74e05d8556d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-29686" Apr 16 13:14:25.297936 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:25.297892 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cncsk\" (UniqueName: \"kubernetes.io/projected/a6cd6279-f079-4e59-a738-c74e05d8556d-kube-api-access-cncsk\") pod \"prometheus-operator-78f957474d-29686\" (UID: \"a6cd6279-f079-4e59-a738-c74e05d8556d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-29686" Apr 16 13:14:25.298056 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:14:25.297956 2571 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 16 13:14:25.298056 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:14:25.298018 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a6cd6279-f079-4e59-a738-c74e05d8556d-prometheus-operator-tls podName:a6cd6279-f079-4e59-a738-c74e05d8556d nodeName:}" failed. No retries permitted until 2026-04-16 13:14:25.79800369 +0000 UTC m=+170.680359758 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/a6cd6279-f079-4e59-a738-c74e05d8556d-prometheus-operator-tls") pod "prometheus-operator-78f957474d-29686" (UID: "a6cd6279-f079-4e59-a738-c74e05d8556d") : secret "prometheus-operator-tls" not found Apr 16 13:14:25.298477 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:25.298449 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a6cd6279-f079-4e59-a738-c74e05d8556d-metrics-client-ca\") pod \"prometheus-operator-78f957474d-29686\" (UID: \"a6cd6279-f079-4e59-a738-c74e05d8556d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-29686" Apr 16 13:14:25.300306 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:25.300284 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/a6cd6279-f079-4e59-a738-c74e05d8556d-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-29686\" (UID: \"a6cd6279-f079-4e59-a738-c74e05d8556d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-29686" Apr 16 13:14:25.306577 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:25.306559 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cncsk\" (UniqueName: \"kubernetes.io/projected/a6cd6279-f079-4e59-a738-c74e05d8556d-kube-api-access-cncsk\") pod \"prometheus-operator-78f957474d-29686\" (UID: \"a6cd6279-f079-4e59-a738-c74e05d8556d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-29686" Apr 16 13:14:25.723569 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:25.723537 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rxwhc" Apr 16 13:14:25.726651 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:25.726626 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-vrfd8\"" Apr 16 13:14:25.734157 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:25.734137 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-rxwhc" Apr 16 13:14:25.802649 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:25.802613 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6cd6279-f079-4e59-a738-c74e05d8556d-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-29686\" (UID: \"a6cd6279-f079-4e59-a738-c74e05d8556d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-29686" Apr 16 13:14:25.805663 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:25.805449 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/a6cd6279-f079-4e59-a738-c74e05d8556d-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-29686\" (UID: \"a6cd6279-f079-4e59-a738-c74e05d8556d\") " pod="openshift-monitoring/prometheus-operator-78f957474d-29686" Apr 16 13:14:25.851346 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:25.851320 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-rxwhc"] Apr 16 13:14:25.854260 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:14:25.854234 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3e13a4d_74db_43e1_b7d8_cddf502adb4c.slice/crio-d521a64979989e318b02f0ac63148b2ad8874b8b1a1f77b7591feedc8ec6ae26 WatchSource:0}: Error finding container d521a64979989e318b02f0ac63148b2ad8874b8b1a1f77b7591feedc8ec6ae26: Status 404 returned error can't find the container with id d521a64979989e318b02f0ac63148b2ad8874b8b1a1f77b7591feedc8ec6ae26 Apr 16 13:14:26.054250 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:26.054196 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-29686" Apr 16 13:14:26.164487 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:26.164453 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-29686"] Apr 16 13:14:26.167533 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:14:26.167501 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6cd6279_f079_4e59_a738_c74e05d8556d.slice/crio-160be6b835f71bf4e372a444fb60af7798c4057b85ae5eea4324ec24353419a1 WatchSource:0}: Error finding container 160be6b835f71bf4e372a444fb60af7798c4057b85ae5eea4324ec24353419a1: Status 404 returned error can't find the container with id 160be6b835f71bf4e372a444fb60af7798c4057b85ae5eea4324ec24353419a1 Apr 16 13:14:26.264123 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:26.264080 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-29686" event={"ID":"a6cd6279-f079-4e59-a738-c74e05d8556d","Type":"ContainerStarted","Data":"160be6b835f71bf4e372a444fb60af7798c4057b85ae5eea4324ec24353419a1"} Apr 16 13:14:26.265014 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:26.264990 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rxwhc" event={"ID":"e3e13a4d-74db-43e1-b7d8-cddf502adb4c","Type":"ContainerStarted","Data":"d521a64979989e318b02f0ac63148b2ad8874b8b1a1f77b7591feedc8ec6ae26"} Apr 16 13:14:27.723809 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:27.723785 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8fnx" Apr 16 13:14:28.241324 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:28.241299 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-pcp5l" Apr 16 13:14:28.272353 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:28.272316 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-29686" event={"ID":"a6cd6279-f079-4e59-a738-c74e05d8556d","Type":"ContainerStarted","Data":"d260f12ef73b3c7d5f21d6c8bf699e234c6103929435e1751abb13c94951cdf2"} Apr 16 13:14:28.272353 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:28.272352 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-29686" event={"ID":"a6cd6279-f079-4e59-a738-c74e05d8556d","Type":"ContainerStarted","Data":"606fc813b5db2d0485125470ccb26665ca2942949adb76c31f258cd906b49d5d"} Apr 16 13:14:28.273795 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:28.273765 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-rxwhc" event={"ID":"e3e13a4d-74db-43e1-b7d8-cddf502adb4c","Type":"ContainerStarted","Data":"aef212120d7a45e030b62c425c7fee3824e41fa0e272a07ef657277ccb3b3ba9"} Apr 16 13:14:28.289520 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:28.289464 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-29686" podStartSLOduration=1.683711298 podStartE2EDuration="3.289445835s" podCreationTimestamp="2026-04-16 13:14:25 +0000 UTC" firstStartedPulling="2026-04-16 13:14:26.169233027 +0000 UTC m=+171.051589095" lastFinishedPulling="2026-04-16 13:14:27.774967564 +0000 UTC m=+172.657323632" observedRunningTime="2026-04-16 13:14:28.287799101 +0000 UTC m=+173.170155191" watchObservedRunningTime="2026-04-16 13:14:28.289445835 +0000 UTC m=+173.171801927" Apr 16 13:14:28.302415 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:28.302371 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-rxwhc" podStartSLOduration=139.383460583 podStartE2EDuration="2m21.302358803s" podCreationTimestamp="2026-04-16 13:12:07 +0000 UTC" firstStartedPulling="2026-04-16 13:14:25.85610602 +0000 UTC m=+170.738462088" lastFinishedPulling="2026-04-16 13:14:27.775004235 +0000 UTC m=+172.657360308" observedRunningTime="2026-04-16 13:14:28.301175336 +0000 UTC m=+173.183531426" watchObservedRunningTime="2026-04-16 13:14:28.302358803 +0000 UTC m=+173.184714935" Apr 16 13:14:30.499217 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.499183 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-bpckk"] Apr 16 13:14:30.502959 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.502938 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-bpckk" Apr 16 13:14:30.505474 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.505446 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 13:14:30.506470 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.506453 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 13:14:30.506620 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.506599 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 13:14:30.506620 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.506611 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-sfwsd\"" Apr 16 13:14:30.512213 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.512189 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-gvcnb"] Apr 16 13:14:30.515426 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.515407 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-bpckk"] Apr 16 13:14:30.515584 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.515533 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:30.518208 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.518192 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 13:14:30.518338 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.518249 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 13:14:30.518914 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.518641 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 13:14:30.518914 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.518712 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2q4p2\"" Apr 16 13:14:30.541105 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.541074 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:30.541242 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.541223 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-metrics-client-ca\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:30.541387 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.541370 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lfblr\" (UniqueName: \"kubernetes.io/projected/1f804495-e6ac-4249-a4ec-add4bb87962d-kube-api-access-lfblr\") pod \"kube-state-metrics-7479c89684-bpckk\" (UID: \"1f804495-e6ac-4249-a4ec-add4bb87962d\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-bpckk" Apr 16 13:14:30.541602 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.541482 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/1f804495-e6ac-4249-a4ec-add4bb87962d-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-bpckk\" (UID: \"1f804495-e6ac-4249-a4ec-add4bb87962d\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-bpckk" Apr 16 13:14:30.541602 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.541515 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-node-exporter-textfile\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:30.541602 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.541556 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljwn5\" (UniqueName: \"kubernetes.io/projected/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-kube-api-access-ljwn5\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:30.541602 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.541590 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-node-exporter-tls\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:30.541830 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.541616 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-root\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:30.541830 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.541664 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1f804495-e6ac-4249-a4ec-add4bb87962d-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-bpckk\" (UID: \"1f804495-e6ac-4249-a4ec-add4bb87962d\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-bpckk" Apr 16 13:14:30.541830 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.541692 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/1f804495-e6ac-4249-a4ec-add4bb87962d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-bpckk\" (UID: \"1f804495-e6ac-4249-a4ec-add4bb87962d\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-bpckk" Apr 16 13:14:30.541830 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.541721 2571 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-node-exporter-accelerators-collector-config\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:30.541830 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.541754 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1f804495-e6ac-4249-a4ec-add4bb87962d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-bpckk\" (UID: \"1f804495-e6ac-4249-a4ec-add4bb87962d\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-bpckk" Apr 16 13:14:30.541830 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.541780 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-sys\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:30.541830 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.541807 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1f804495-e6ac-4249-a4ec-add4bb87962d-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-bpckk\" (UID: \"1f804495-e6ac-4249-a4ec-add4bb87962d\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-bpckk" Apr 16 13:14:30.542122 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.541833 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-node-exporter-wtmp\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:30.642390 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.642361 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-metrics-client-ca\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:30.642515 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.642407 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfblr\" (UniqueName: \"kubernetes.io/projected/1f804495-e6ac-4249-a4ec-add4bb87962d-kube-api-access-lfblr\") pod \"kube-state-metrics-7479c89684-bpckk\" (UID: \"1f804495-e6ac-4249-a4ec-add4bb87962d\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-bpckk" Apr 16 13:14:30.642515 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.642441 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/1f804495-e6ac-4249-a4ec-add4bb87962d-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-bpckk\" (UID: \"1f804495-e6ac-4249-a4ec-add4bb87962d\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-bpckk" Apr 16 13:14:30.642515 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.642466 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-node-exporter-textfile\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:30.642515 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.642500 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ljwn5\" (UniqueName: \"kubernetes.io/projected/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-kube-api-access-ljwn5\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:30.642699 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.642530 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-node-exporter-tls\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:30.642699 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.642555 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-root\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:30.642699 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.642604 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1f804495-e6ac-4249-a4ec-add4bb87962d-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-bpckk\" (UID: \"1f804495-e6ac-4249-a4ec-add4bb87962d\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-bpckk" Apr 16 13:14:30.642699 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.642631 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/1f804495-e6ac-4249-a4ec-add4bb87962d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-bpckk\" (UID: \"1f804495-e6ac-4249-a4ec-add4bb87962d\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-bpckk" Apr 16 13:14:30.642699 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.642659 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-node-exporter-accelerators-collector-config\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:30.642699 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.642690 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1f804495-e6ac-4249-a4ec-add4bb87962d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-bpckk\" (UID: \"1f804495-e6ac-4249-a4ec-add4bb87962d\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-bpckk" Apr 16 13:14:30.643063 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.642715 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-sys\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:30.643063 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.642741 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1f804495-e6ac-4249-a4ec-add4bb87962d-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-bpckk\" (UID: \"1f804495-e6ac-4249-a4ec-add4bb87962d\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-bpckk" Apr 16 13:14:30.643063 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.642768 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-node-exporter-wtmp\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:30.643063 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.642800 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:30.643063 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.642834 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/1f804495-e6ac-4249-a4ec-add4bb87962d-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-bpckk\" (UID: \"1f804495-e6ac-4249-a4ec-add4bb87962d\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-bpckk" Apr 16 13:14:30.643063 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.642973 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-metrics-client-ca\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:30.643370 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.643334 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-node-exporter-textfile\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:30.644074 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.643467 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-node-exporter-accelerators-collector-config\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:30.644074 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.643536 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-root\") pod \"node-exporter-gvcnb\" (UID: 
\"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:30.644074 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.643600 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-node-exporter-wtmp\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:30.644074 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.643667 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-sys\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:30.644074 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:14:30.643901 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 13:14:30.644074 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:14:30.643966 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-node-exporter-tls podName:6fe9255b-fcb4-4c6a-a54b-0f8295617b96 nodeName:}" failed. No retries permitted until 2026-04-16 13:14:31.14394621 +0000 UTC m=+176.026302278 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-node-exporter-tls") pod "node-exporter-gvcnb" (UID: "6fe9255b-fcb4-4c6a-a54b-0f8295617b96") : secret "node-exporter-tls" not found Apr 16 13:14:30.644421 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.644246 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1f804495-e6ac-4249-a4ec-add4bb87962d-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-bpckk\" (UID: \"1f804495-e6ac-4249-a4ec-add4bb87962d\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-bpckk" Apr 16 13:14:30.644421 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.644335 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/1f804495-e6ac-4249-a4ec-add4bb87962d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-bpckk\" (UID: \"1f804495-e6ac-4249-a4ec-add4bb87962d\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-bpckk" Apr 16 13:14:30.646001 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.645977 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:30.646603 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.646559 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/1f804495-e6ac-4249-a4ec-add4bb87962d-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-bpckk\" (UID: \"1f804495-e6ac-4249-a4ec-add4bb87962d\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-bpckk" Apr 16 13:14:30.646770 
ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.646749 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/1f804495-e6ac-4249-a4ec-add4bb87962d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-bpckk\" (UID: \"1f804495-e6ac-4249-a4ec-add4bb87962d\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-bpckk" Apr 16 13:14:30.666538 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.666517 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfblr\" (UniqueName: \"kubernetes.io/projected/1f804495-e6ac-4249-a4ec-add4bb87962d-kube-api-access-lfblr\") pod \"kube-state-metrics-7479c89684-bpckk\" (UID: \"1f804495-e6ac-4249-a4ec-add4bb87962d\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-bpckk" Apr 16 13:14:30.669433 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.669407 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljwn5\" (UniqueName: \"kubernetes.io/projected/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-kube-api-access-ljwn5\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:30.814715 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.814652 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-bpckk" Apr 16 13:14:30.952665 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:30.952639 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-bpckk"] Apr 16 13:14:30.955544 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:14:30.955515 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f804495_e6ac_4249_a4ec_add4bb87962d.slice/crio-3eabb7cb1fc0c47a32df3ca0af3c7aa4c83dd53dadc01b8850f9f715e80a7a5a WatchSource:0}: Error finding container 3eabb7cb1fc0c47a32df3ca0af3c7aa4c83dd53dadc01b8850f9f715e80a7a5a: Status 404 returned error can't find the container with id 3eabb7cb1fc0c47a32df3ca0af3c7aa4c83dd53dadc01b8850f9f715e80a7a5a Apr 16 13:14:31.147201 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:31.147130 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-node-exporter-tls\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:31.147319 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:14:31.147248 2571 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 13:14:31.147319 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:14:31.147309 2571 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-node-exporter-tls podName:6fe9255b-fcb4-4c6a-a54b-0f8295617b96 nodeName:}" failed. No retries permitted until 2026-04-16 13:14:32.147293233 +0000 UTC m=+177.029649300 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-node-exporter-tls") pod "node-exporter-gvcnb" (UID: "6fe9255b-fcb4-4c6a-a54b-0f8295617b96") : secret "node-exporter-tls" not found
Apr 16 13:14:31.286738 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:31.286702 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-bpckk" event={"ID":"1f804495-e6ac-4249-a4ec-add4bb87962d","Type":"ContainerStarted","Data":"3eabb7cb1fc0c47a32df3ca0af3c7aa4c83dd53dadc01b8850f9f715e80a7a5a"}
Apr 16 13:14:32.160090 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:32.160041 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-node-exporter-tls\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb"
Apr 16 13:14:32.162315 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:32.162290 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/6fe9255b-fcb4-4c6a-a54b-0f8295617b96-node-exporter-tls\") pod \"node-exporter-gvcnb\" (UID: \"6fe9255b-fcb4-4c6a-a54b-0f8295617b96\") " pod="openshift-monitoring/node-exporter-gvcnb"
Apr 16 13:14:32.290688 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:32.290664 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-bpckk" event={"ID":"1f804495-e6ac-4249-a4ec-add4bb87962d","Type":"ContainerStarted","Data":"ab3b60b3afb04bb676340d0807e88b55acb428cb3d405efd3d49a83c6fb90318"}
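The node-exporter-tls sequence above is a startup ordering race rather than a persistent fault: the pod referenced a serving-cert secret that had not been created yet, MountVolume.SetUp failed with secret "node-exporter-tls" not found, nestedpendingoperations.go scheduled retries with doubling backoff (durationBeforeRetry 500ms at 13:14:30.643, then 1s at 13:14:31.147), and the retry succeeded at 13:14:32.162 once the secret existed. To check the same condition from outside the node, a sketch using the official kubernetes Python client could look like the following (the client library and a default kubeconfig are assumptions):

```python
from kubernetes import client, config
from kubernetes.client.rest import ApiException

# Load credentials from the default kubeconfig (~/.kube/config).
config.load_kube_config()
v1 = client.CoreV1Api()

try:
    secret = v1.read_namespaced_secret("node-exporter-tls", "openshift-monitoring")
    # A serving-cert secret should carry tls.crt/tls.key once it is issued.
    print("secret exists, keys:", sorted(secret.data or {}))
except ApiException as err:
    if err.status == 404:
        print("secret not found yet; the kubelet will keep retrying the mount")
    else:
        raise
```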
Need to start a new one" pod="openshift-monitoring/node-exporter-gvcnb" Apr 16 13:14:33.295314 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:33.295231 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-bpckk" event={"ID":"1f804495-e6ac-4249-a4ec-add4bb87962d","Type":"ContainerStarted","Data":"ec4f3d0a89b8ab9296e3d7511a3a9327f913aa98030ff47faf2d4801ae96869b"} Apr 16 13:14:33.295314 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:33.295273 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-bpckk" event={"ID":"1f804495-e6ac-4249-a4ec-add4bb87962d","Type":"ContainerStarted","Data":"e4a531b62dfa23bcb84389c4baba2b5e60bff7611beb4647689208f057e545c0"} Apr 16 13:14:33.296606 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:33.296584 2571 generic.go:358] "Generic (PLEG): container finished" podID="6fe9255b-fcb4-4c6a-a54b-0f8295617b96" containerID="b1e9b7d708ad1801533483e8f57b346a74a8a708484e09090e1cacfbee0503bd" exitCode=0 Apr 16 13:14:33.296696 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:33.296654 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gvcnb" event={"ID":"6fe9255b-fcb4-4c6a-a54b-0f8295617b96","Type":"ContainerDied","Data":"b1e9b7d708ad1801533483e8f57b346a74a8a708484e09090e1cacfbee0503bd"} Apr 16 13:14:33.296696 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:33.296684 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gvcnb" event={"ID":"6fe9255b-fcb4-4c6a-a54b-0f8295617b96","Type":"ContainerStarted","Data":"a4c45c65da35373db86ed3daa54ed67ee53262ee51b116eb8fdef5d7111ad98e"} Apr 16 13:14:33.317682 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:33.317641 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-bpckk" podStartSLOduration=2.068629521 podStartE2EDuration="3.31762886s" podCreationTimestamp="2026-04-16 13:14:30 +0000 UTC" firstStartedPulling="2026-04-16 13:14:30.957725697 +0000 UTC m=+175.840081769" lastFinishedPulling="2026-04-16 13:14:32.206725028 +0000 UTC m=+177.089081108" observedRunningTime="2026-04-16 13:14:33.316217023 +0000 UTC m=+178.198573150" watchObservedRunningTime="2026-04-16 13:14:33.31762886 +0000 UTC m=+178.199984950" Apr 16 13:14:34.301705 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:34.301665 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gvcnb" event={"ID":"6fe9255b-fcb4-4c6a-a54b-0f8295617b96","Type":"ContainerStarted","Data":"3384d7a468e9bd09314948ff5058030c02f41d8b537d91f87cec781ec7d45d89"} Apr 16 13:14:34.301705 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:34.301708 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gvcnb" event={"ID":"6fe9255b-fcb4-4c6a-a54b-0f8295617b96","Type":"ContainerStarted","Data":"507c7dbe6e68f8986dbfa135bac48ae641d1cf8ec60d3d0ca8ee8d7c39f1fea2"} Apr 16 13:14:34.318283 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:34.318226 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-gvcnb" podStartSLOduration=3.6334958950000003 podStartE2EDuration="4.318207089s" podCreationTimestamp="2026-04-16 13:14:30 +0000 UTC" firstStartedPulling="2026-04-16 13:14:32.341470793 +0000 UTC m=+177.223826864" lastFinishedPulling="2026-04-16 13:14:33.026181985 +0000 UTC m=+177.908538058" 
observedRunningTime="2026-04-16 13:14:34.317949442 +0000 UTC m=+179.200305534" watchObservedRunningTime="2026-04-16 13:14:34.318207089 +0000 UTC m=+179.200563183" Apr 16 13:14:34.937637 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:34.937596 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6685444475-7kmjj"] Apr 16 13:14:34.940936 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:34.940911 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6685444475-7kmjj" Apr 16 13:14:34.943835 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:34.943811 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 16 13:14:34.943984 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:34.943971 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 16 13:14:34.945032 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:34.945014 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 16 13:14:34.945120 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:34.945048 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-9ljq9prar5psq\"" Apr 16 13:14:34.945176 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:34.945020 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 13:14:34.945176 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:34.945022 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-vt426\"" Apr 16 13:14:34.948059 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:34.948030 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6685444475-7kmjj"] Apr 16 13:14:34.990412 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:34.990385 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/135cc650-07e2-4bdb-9c50-421e8792e403-client-ca-bundle\") pod \"metrics-server-6685444475-7kmjj\" (UID: \"135cc650-07e2-4bdb-9c50-421e8792e403\") " pod="openshift-monitoring/metrics-server-6685444475-7kmjj" Apr 16 13:14:34.990575 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:34.990443 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/135cc650-07e2-4bdb-9c50-421e8792e403-secret-metrics-server-client-certs\") pod \"metrics-server-6685444475-7kmjj\" (UID: \"135cc650-07e2-4bdb-9c50-421e8792e403\") " pod="openshift-monitoring/metrics-server-6685444475-7kmjj" Apr 16 13:14:34.990575 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:34.990481 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/135cc650-07e2-4bdb-9c50-421e8792e403-secret-metrics-server-tls\") pod \"metrics-server-6685444475-7kmjj\" (UID: \"135cc650-07e2-4bdb-9c50-421e8792e403\") " pod="openshift-monitoring/metrics-server-6685444475-7kmjj" Apr 16 13:14:34.990575 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:34.990500 
2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cqtn\" (UniqueName: \"kubernetes.io/projected/135cc650-07e2-4bdb-9c50-421e8792e403-kube-api-access-9cqtn\") pod \"metrics-server-6685444475-7kmjj\" (UID: \"135cc650-07e2-4bdb-9c50-421e8792e403\") " pod="openshift-monitoring/metrics-server-6685444475-7kmjj" Apr 16 13:14:34.990575 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:34.990515 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/135cc650-07e2-4bdb-9c50-421e8792e403-audit-log\") pod \"metrics-server-6685444475-7kmjj\" (UID: \"135cc650-07e2-4bdb-9c50-421e8792e403\") " pod="openshift-monitoring/metrics-server-6685444475-7kmjj" Apr 16 13:14:34.990575 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:34.990565 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/135cc650-07e2-4bdb-9c50-421e8792e403-metrics-server-audit-profiles\") pod \"metrics-server-6685444475-7kmjj\" (UID: \"135cc650-07e2-4bdb-9c50-421e8792e403\") " pod="openshift-monitoring/metrics-server-6685444475-7kmjj" Apr 16 13:14:34.990781 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:34.990636 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/135cc650-07e2-4bdb-9c50-421e8792e403-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6685444475-7kmjj\" (UID: \"135cc650-07e2-4bdb-9c50-421e8792e403\") " pod="openshift-monitoring/metrics-server-6685444475-7kmjj" Apr 16 13:14:35.091748 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.091720 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9cqtn\" (UniqueName: \"kubernetes.io/projected/135cc650-07e2-4bdb-9c50-421e8792e403-kube-api-access-9cqtn\") pod \"metrics-server-6685444475-7kmjj\" (UID: \"135cc650-07e2-4bdb-9c50-421e8792e403\") " pod="openshift-monitoring/metrics-server-6685444475-7kmjj" Apr 16 13:14:35.091908 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.091756 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/135cc650-07e2-4bdb-9c50-421e8792e403-audit-log\") pod \"metrics-server-6685444475-7kmjj\" (UID: \"135cc650-07e2-4bdb-9c50-421e8792e403\") " pod="openshift-monitoring/metrics-server-6685444475-7kmjj" Apr 16 13:14:35.091908 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.091782 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/135cc650-07e2-4bdb-9c50-421e8792e403-metrics-server-audit-profiles\") pod \"metrics-server-6685444475-7kmjj\" (UID: \"135cc650-07e2-4bdb-9c50-421e8792e403\") " pod="openshift-monitoring/metrics-server-6685444475-7kmjj" Apr 16 13:14:35.092014 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.091990 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/135cc650-07e2-4bdb-9c50-421e8792e403-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6685444475-7kmjj\" (UID: \"135cc650-07e2-4bdb-9c50-421e8792e403\") " pod="openshift-monitoring/metrics-server-6685444475-7kmjj" Apr 16 
13:14:35.092059 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.092042 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/135cc650-07e2-4bdb-9c50-421e8792e403-client-ca-bundle\") pod \"metrics-server-6685444475-7kmjj\" (UID: \"135cc650-07e2-4bdb-9c50-421e8792e403\") " pod="openshift-monitoring/metrics-server-6685444475-7kmjj" Apr 16 13:14:35.092163 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.092145 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/135cc650-07e2-4bdb-9c50-421e8792e403-secret-metrics-server-client-certs\") pod \"metrics-server-6685444475-7kmjj\" (UID: \"135cc650-07e2-4bdb-9c50-421e8792e403\") " pod="openshift-monitoring/metrics-server-6685444475-7kmjj" Apr 16 13:14:35.092213 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.092187 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/135cc650-07e2-4bdb-9c50-421e8792e403-audit-log\") pod \"metrics-server-6685444475-7kmjj\" (UID: \"135cc650-07e2-4bdb-9c50-421e8792e403\") " pod="openshift-monitoring/metrics-server-6685444475-7kmjj" Apr 16 13:14:35.092213 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.092197 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/135cc650-07e2-4bdb-9c50-421e8792e403-secret-metrics-server-tls\") pod \"metrics-server-6685444475-7kmjj\" (UID: \"135cc650-07e2-4bdb-9c50-421e8792e403\") " pod="openshift-monitoring/metrics-server-6685444475-7kmjj" Apr 16 13:14:35.092628 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.092592 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/135cc650-07e2-4bdb-9c50-421e8792e403-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6685444475-7kmjj\" (UID: \"135cc650-07e2-4bdb-9c50-421e8792e403\") " pod="openshift-monitoring/metrics-server-6685444475-7kmjj" Apr 16 13:14:35.092812 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.092792 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/135cc650-07e2-4bdb-9c50-421e8792e403-metrics-server-audit-profiles\") pod \"metrics-server-6685444475-7kmjj\" (UID: \"135cc650-07e2-4bdb-9c50-421e8792e403\") " pod="openshift-monitoring/metrics-server-6685444475-7kmjj" Apr 16 13:14:35.094518 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.094497 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/135cc650-07e2-4bdb-9c50-421e8792e403-secret-metrics-server-tls\") pod \"metrics-server-6685444475-7kmjj\" (UID: \"135cc650-07e2-4bdb-9c50-421e8792e403\") " pod="openshift-monitoring/metrics-server-6685444475-7kmjj" Apr 16 13:14:35.094589 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.094530 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/135cc650-07e2-4bdb-9c50-421e8792e403-secret-metrics-server-client-certs\") pod \"metrics-server-6685444475-7kmjj\" (UID: \"135cc650-07e2-4bdb-9c50-421e8792e403\") " pod="openshift-monitoring/metrics-server-6685444475-7kmjj" Apr 16 13:14:35.094589 
ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.094584 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/135cc650-07e2-4bdb-9c50-421e8792e403-client-ca-bundle\") pod \"metrics-server-6685444475-7kmjj\" (UID: \"135cc650-07e2-4bdb-9c50-421e8792e403\") " pod="openshift-monitoring/metrics-server-6685444475-7kmjj"
Apr 16 13:14:35.100221 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.100200 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cqtn\" (UniqueName: \"kubernetes.io/projected/135cc650-07e2-4bdb-9c50-421e8792e403-kube-api-access-9cqtn\") pod \"metrics-server-6685444475-7kmjj\" (UID: \"135cc650-07e2-4bdb-9c50-421e8792e403\") " pod="openshift-monitoring/metrics-server-6685444475-7kmjj"
Apr 16 13:14:35.252029 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.251922 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6685444475-7kmjj"
Apr 16 13:14:35.394488 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.394450 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6685444475-7kmjj"]
Apr 16 13:14:35.398172 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:14:35.398144 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod135cc650_07e2_4bdb_9c50_421e8792e403.slice/crio-566bdb0eab8673fa97ecd67f6597304dfd5b8c9fa691104eb17d907388e33f29 WatchSource:0}: Error finding container 566bdb0eab8673fa97ecd67f6597304dfd5b8c9fa691104eb17d907388e33f29: Status 404 returned error can't find the container with id 566bdb0eab8673fa97ecd67f6597304dfd5b8c9fa691104eb17d907388e33f29
Apr 16 13:14:35.692771 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.692744 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6945cfd779-2pgcb"]
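The manager.go:1169 "Failed to process watch event ... 404" warnings (here for crio-566bdb0e..., earlier at 13:14:30.955 for crio-3eabb7cb..., and again at 13:14:36.152 below for crio-0c8e34d5...) follow a well-known benign pattern: the cgroup watcher notices the freshly created crio-<id> slice before the runtime has finished registering the container, so the lookup returns 404. The same container IDs show up moments later in "SyncLoop (PLEG)" ContainerStarted events, which is why these warnings during pod startup are generally treated as transient noise rather than errors.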
Need to start a new one" pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" Apr 16 13:14:35.699502 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.699464 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-cs2z9\"" Apr 16 13:14:35.699621 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.699540 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 13:14:35.699621 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.699547 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 13:14:35.699621 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.699591 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 13:14:35.699769 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.699630 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 13:14:35.699769 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.699673 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 13:14:35.704083 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.704057 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 13:14:35.706288 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.706269 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6945cfd779-2pgcb"] Apr 16 13:14:35.798090 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.798066 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11525182-7b22-4a16-9a14-6bcdc9289b9d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6945cfd779-2pgcb\" (UID: \"11525182-7b22-4a16-9a14-6bcdc9289b9d\") " pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" Apr 16 13:14:35.798211 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.798097 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/11525182-7b22-4a16-9a14-6bcdc9289b9d-secret-telemeter-client\") pod \"telemeter-client-6945cfd779-2pgcb\" (UID: \"11525182-7b22-4a16-9a14-6bcdc9289b9d\") " pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" Apr 16 13:14:35.798211 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.798127 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/11525182-7b22-4a16-9a14-6bcdc9289b9d-telemeter-client-tls\") pod \"telemeter-client-6945cfd779-2pgcb\" (UID: \"11525182-7b22-4a16-9a14-6bcdc9289b9d\") " pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" Apr 16 13:14:35.798211 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.798147 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/11525182-7b22-4a16-9a14-6bcdc9289b9d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6945cfd779-2pgcb\" (UID: \"11525182-7b22-4a16-9a14-6bcdc9289b9d\") " pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" Apr 16 13:14:35.798346 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.798232 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11525182-7b22-4a16-9a14-6bcdc9289b9d-metrics-client-ca\") pod \"telemeter-client-6945cfd779-2pgcb\" (UID: \"11525182-7b22-4a16-9a14-6bcdc9289b9d\") " pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" Apr 16 13:14:35.798346 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.798262 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11525182-7b22-4a16-9a14-6bcdc9289b9d-serving-certs-ca-bundle\") pod \"telemeter-client-6945cfd779-2pgcb\" (UID: \"11525182-7b22-4a16-9a14-6bcdc9289b9d\") " pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" Apr 16 13:14:35.798346 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.798287 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/11525182-7b22-4a16-9a14-6bcdc9289b9d-federate-client-tls\") pod \"telemeter-client-6945cfd779-2pgcb\" (UID: \"11525182-7b22-4a16-9a14-6bcdc9289b9d\") " pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" Apr 16 13:14:35.798439 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.798350 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5ctn\" (UniqueName: \"kubernetes.io/projected/11525182-7b22-4a16-9a14-6bcdc9289b9d-kube-api-access-l5ctn\") pod \"telemeter-client-6945cfd779-2pgcb\" (UID: \"11525182-7b22-4a16-9a14-6bcdc9289b9d\") " pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" Apr 16 13:14:35.899379 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.899337 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11525182-7b22-4a16-9a14-6bcdc9289b9d-serving-certs-ca-bundle\") pod \"telemeter-client-6945cfd779-2pgcb\" (UID: \"11525182-7b22-4a16-9a14-6bcdc9289b9d\") " pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" Apr 16 13:14:35.899379 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.899374 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/11525182-7b22-4a16-9a14-6bcdc9289b9d-federate-client-tls\") pod \"telemeter-client-6945cfd779-2pgcb\" (UID: \"11525182-7b22-4a16-9a14-6bcdc9289b9d\") " pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" Apr 16 13:14:35.899573 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.899406 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5ctn\" (UniqueName: \"kubernetes.io/projected/11525182-7b22-4a16-9a14-6bcdc9289b9d-kube-api-access-l5ctn\") pod \"telemeter-client-6945cfd779-2pgcb\" (UID: \"11525182-7b22-4a16-9a14-6bcdc9289b9d\") " pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" Apr 16 13:14:35.899712 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.899683 2571 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11525182-7b22-4a16-9a14-6bcdc9289b9d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6945cfd779-2pgcb\" (UID: \"11525182-7b22-4a16-9a14-6bcdc9289b9d\") " pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" Apr 16 13:14:35.899781 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.899737 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/11525182-7b22-4a16-9a14-6bcdc9289b9d-secret-telemeter-client\") pod \"telemeter-client-6945cfd779-2pgcb\" (UID: \"11525182-7b22-4a16-9a14-6bcdc9289b9d\") " pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" Apr 16 13:14:35.899998 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.899776 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/11525182-7b22-4a16-9a14-6bcdc9289b9d-telemeter-client-tls\") pod \"telemeter-client-6945cfd779-2pgcb\" (UID: \"11525182-7b22-4a16-9a14-6bcdc9289b9d\") " pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" Apr 16 13:14:35.899998 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.899811 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/11525182-7b22-4a16-9a14-6bcdc9289b9d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6945cfd779-2pgcb\" (UID: \"11525182-7b22-4a16-9a14-6bcdc9289b9d\") " pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" Apr 16 13:14:35.899998 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.899898 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11525182-7b22-4a16-9a14-6bcdc9289b9d-metrics-client-ca\") pod \"telemeter-client-6945cfd779-2pgcb\" (UID: \"11525182-7b22-4a16-9a14-6bcdc9289b9d\") " pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" Apr 16 13:14:35.900330 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.900304 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11525182-7b22-4a16-9a14-6bcdc9289b9d-serving-certs-ca-bundle\") pod \"telemeter-client-6945cfd779-2pgcb\" (UID: \"11525182-7b22-4a16-9a14-6bcdc9289b9d\") " pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" Apr 16 13:14:35.900576 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.900553 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11525182-7b22-4a16-9a14-6bcdc9289b9d-metrics-client-ca\") pod \"telemeter-client-6945cfd779-2pgcb\" (UID: \"11525182-7b22-4a16-9a14-6bcdc9289b9d\") " pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" Apr 16 13:14:35.900719 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.900665 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11525182-7b22-4a16-9a14-6bcdc9289b9d-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6945cfd779-2pgcb\" (UID: \"11525182-7b22-4a16-9a14-6bcdc9289b9d\") " pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" Apr 16 13:14:35.902219 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.902195 2571 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/11525182-7b22-4a16-9a14-6bcdc9289b9d-federate-client-tls\") pod \"telemeter-client-6945cfd779-2pgcb\" (UID: \"11525182-7b22-4a16-9a14-6bcdc9289b9d\") " pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" Apr 16 13:14:35.902508 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.902486 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/11525182-7b22-4a16-9a14-6bcdc9289b9d-secret-telemeter-client\") pod \"telemeter-client-6945cfd779-2pgcb\" (UID: \"11525182-7b22-4a16-9a14-6bcdc9289b9d\") " pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" Apr 16 13:14:35.902819 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.902800 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/11525182-7b22-4a16-9a14-6bcdc9289b9d-telemeter-client-tls\") pod \"telemeter-client-6945cfd779-2pgcb\" (UID: \"11525182-7b22-4a16-9a14-6bcdc9289b9d\") " pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" Apr 16 13:14:35.903309 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.903291 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/11525182-7b22-4a16-9a14-6bcdc9289b9d-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6945cfd779-2pgcb\" (UID: \"11525182-7b22-4a16-9a14-6bcdc9289b9d\") " pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" Apr 16 13:14:35.908593 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:35.908565 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5ctn\" (UniqueName: \"kubernetes.io/projected/11525182-7b22-4a16-9a14-6bcdc9289b9d-kube-api-access-l5ctn\") pod \"telemeter-client-6945cfd779-2pgcb\" (UID: \"11525182-7b22-4a16-9a14-6bcdc9289b9d\") " pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" Apr 16 13:14:36.006130 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.006064 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" Apr 16 13:14:36.149984 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.149958 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6945cfd779-2pgcb"] Apr 16 13:14:36.152654 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:14:36.152615 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11525182_7b22_4a16_9a14_6bcdc9289b9d.slice/crio-0c8e34d51aa886e40d4379f10f40530dc1e5fa700af3e1509722a20b204c2bc1 WatchSource:0}: Error finding container 0c8e34d51aa886e40d4379f10f40530dc1e5fa700af3e1509722a20b204c2bc1: Status 404 returned error can't find the container with id 0c8e34d51aa886e40d4379f10f40530dc1e5fa700af3e1509722a20b204c2bc1 Apr 16 13:14:36.228645 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.228617 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-79ddbc568d-wlbcc" Apr 16 13:14:36.309984 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.309904 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6685444475-7kmjj" event={"ID":"135cc650-07e2-4bdb-9c50-421e8792e403","Type":"ContainerStarted","Data":"566bdb0eab8673fa97ecd67f6597304dfd5b8c9fa691104eb17d907388e33f29"} Apr 16 13:14:36.311230 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.311200 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" event={"ID":"11525182-7b22-4a16-9a14-6bcdc9289b9d","Type":"ContainerStarted","Data":"0c8e34d51aa886e40d4379f10f40530dc1e5fa700af3e1509722a20b204c2bc1"} Apr 16 13:14:36.747123 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.745923 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 13:14:36.749980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.749951 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.753923 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.752572 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 13:14:36.753923 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.752630 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 13:14:36.753923 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.752795 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 13:14:36.753923 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.752854 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-49uk17uc5i1ir\"" Apr 16 13:14:36.753923 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.753021 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 13:14:36.753923 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.752574 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-qqm9p\"" Apr 16 13:14:36.753923 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.753102 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 13:14:36.753923 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.753290 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 13:14:36.753923 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.753309 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 13:14:36.753923 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.753789 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 13:14:36.755047 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.754624 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 13:14:36.755047 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.754846 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 13:14:36.760063 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.759013 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 13:14:36.765978 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.764210 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 13:14:36.767623 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.767599 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 13:14:36.807307 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.807285 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-config\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.807378 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.807353 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.807435 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.807385 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-web-config\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.807435 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.807414 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.807534 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.807447 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.807534 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.807524 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a8e59d14-0f39-46aa-9acb-073f7dc47e12-config-out\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.807630 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.807556 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.807630 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.807589 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.807630 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.807616 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: 
\"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.807799 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.807641 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.807799 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.807678 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a8e59d14-0f39-46aa-9acb-073f7dc47e12-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.807799 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.807708 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.807799 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.807751 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.807799 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.807775 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.808052 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.807817 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.808052 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.807840 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a8e59d14-0f39-46aa-9acb-073f7dc47e12-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.808052 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.807895 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.808052 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.807922 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jwfsk\" (UniqueName: \"kubernetes.io/projected/a8e59d14-0f39-46aa-9acb-073f7dc47e12-kube-api-access-jwfsk\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.908712 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.908686 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.908837 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.908724 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.908930 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.908896 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.908986 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.908944 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a8e59d14-0f39-46aa-9acb-073f7dc47e12-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.908986 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.908981 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.909091 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.909003 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jwfsk\" (UniqueName: \"kubernetes.io/projected/a8e59d14-0f39-46aa-9acb-073f7dc47e12-kube-api-access-jwfsk\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.909091 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.909052 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-config\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.909189 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.909102 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.909189 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.909126 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-web-config\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.909189 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.909156 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.909353 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.909191 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.909353 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.909254 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a8e59d14-0f39-46aa-9acb-073f7dc47e12-config-out\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.909353 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.909281 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.909353 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.909313 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.909353 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.909332 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a8e59d14-0f39-46aa-9acb-073f7dc47e12-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.909353 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.909338 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.909643 
ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.909361 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.909643 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.909398 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a8e59d14-0f39-46aa-9acb-073f7dc47e12-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.909643 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.909431 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.909643 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.909543 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.910930 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.910495 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.910930 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.910625 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.913375 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.913037 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.914492 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.913526 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-config\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.914492 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.913819 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-prometheus-k8s-rulefiles-0\") pod 
\"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.914492 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.913857 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a8e59d14-0f39-46aa-9acb-073f7dc47e12-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.914492 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.914248 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.914492 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.914445 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.914492 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.914453 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.915352 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.915323 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a8e59d14-0f39-46aa-9acb-073f7dc47e12-config-out\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.915352 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.915324 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.915574 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.915324 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.915663 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.915643 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.915730 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.915682 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"web-config\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-web-config\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.916434 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.916400 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:36.917151 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:36.917137 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jwfsk\" (UniqueName: \"kubernetes.io/projected/a8e59d14-0f39-46aa-9acb-073f7dc47e12-kube-api-access-jwfsk\") pod \"prometheus-k8s-0\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:37.066619 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:37.066530 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:37.236850 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:37.236825 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 13:14:37.238237 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:14:37.238214 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8e59d14_0f39_46aa_9acb_073f7dc47e12.slice/crio-05fff3f2f959e2b932287fd7b3b6bb1c35983ef803211ea5a871bcf2327fd33c WatchSource:0}: Error finding container 05fff3f2f959e2b932287fd7b3b6bb1c35983ef803211ea5a871bcf2327fd33c: Status 404 returned error can't find the container with id 05fff3f2f959e2b932287fd7b3b6bb1c35983ef803211ea5a871bcf2327fd33c Apr 16 13:14:37.315729 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:37.315649 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6685444475-7kmjj" event={"ID":"135cc650-07e2-4bdb-9c50-421e8792e403","Type":"ContainerStarted","Data":"94b5e587944d99d782fc2b52fb8635f0779ef01981ec16fddb4411096fe8120f"} Apr 16 13:14:37.316804 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:37.316735 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a8e59d14-0f39-46aa-9acb-073f7dc47e12","Type":"ContainerStarted","Data":"05fff3f2f959e2b932287fd7b3b6bb1c35983ef803211ea5a871bcf2327fd33c"} Apr 16 13:14:37.332075 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:37.332027 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6685444475-7kmjj" podStartSLOduration=1.933456005 podStartE2EDuration="3.332013932s" podCreationTimestamp="2026-04-16 13:14:34 +0000 UTC" firstStartedPulling="2026-04-16 13:14:35.40077017 +0000 UTC m=+180.283126238" lastFinishedPulling="2026-04-16 13:14:36.799328082 +0000 UTC m=+181.681684165" observedRunningTime="2026-04-16 13:14:37.331351102 +0000 UTC m=+182.213707194" watchObservedRunningTime="2026-04-16 13:14:37.332013932 +0000 UTC m=+182.214370021" Apr 16 13:14:38.321266 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:38.321226 2571 generic.go:358] "Generic (PLEG): container finished" podID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" 
containerID="60890d8ee1f26ae5ad7c8adde2419d91c678618190270c6ad9a2712f4467ebcb" exitCode=0 Apr 16 13:14:38.321701 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:38.321326 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a8e59d14-0f39-46aa-9acb-073f7dc47e12","Type":"ContainerDied","Data":"60890d8ee1f26ae5ad7c8adde2419d91c678618190270c6ad9a2712f4467ebcb"} Apr 16 13:14:39.325304 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:39.325266 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" event={"ID":"11525182-7b22-4a16-9a14-6bcdc9289b9d","Type":"ContainerStarted","Data":"c982c8385dc31080adc15248b0ea78344e8cfce75d50f889b59ab27faacd8525"} Apr 16 13:14:39.325304 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:39.325306 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" event={"ID":"11525182-7b22-4a16-9a14-6bcdc9289b9d","Type":"ContainerStarted","Data":"c014e52298c8386782864109adb7ba25c2494568b0ebd30dfc86fda2566cb2f3"} Apr 16 13:14:39.325663 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:39.325318 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" event={"ID":"11525182-7b22-4a16-9a14-6bcdc9289b9d","Type":"ContainerStarted","Data":"7a381b740147c49f5d5955e5e3782341ec55a198842a60d7b7a6bb742e2ee74c"} Apr 16 13:14:39.346998 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:39.346940 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-6945cfd779-2pgcb" podStartSLOduration=1.8422622720000001 podStartE2EDuration="4.346922682s" podCreationTimestamp="2026-04-16 13:14:35 +0000 UTC" firstStartedPulling="2026-04-16 13:14:36.154887912 +0000 UTC m=+181.037243983" lastFinishedPulling="2026-04-16 13:14:38.659548322 +0000 UTC m=+183.541904393" observedRunningTime="2026-04-16 13:14:39.346062466 +0000 UTC m=+184.228418588" watchObservedRunningTime="2026-04-16 13:14:39.346922682 +0000 UTC m=+184.229278773" Apr 16 13:14:43.339272 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:43.339231 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a8e59d14-0f39-46aa-9acb-073f7dc47e12","Type":"ContainerStarted","Data":"7675c2c0d7aa551906f1d89596f1054e5d79c2da8dd58c85d28727ab40524fc1"} Apr 16 13:14:43.339272 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:43.339272 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a8e59d14-0f39-46aa-9acb-073f7dc47e12","Type":"ContainerStarted","Data":"0903a64fc41df686d1641b65c578adc5608b4b501aa8cea62cff6107b1cd4757"} Apr 16 13:14:45.347783 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:45.347748 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a8e59d14-0f39-46aa-9acb-073f7dc47e12","Type":"ContainerStarted","Data":"34448bedf88f7d13966498383e522ffc1c3f0892a0b264cf6d74ec99eb5c8601"} Apr 16 13:14:45.347783 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:45.347782 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a8e59d14-0f39-46aa-9acb-073f7dc47e12","Type":"ContainerStarted","Data":"02cf1e17662f6192a000278df309c4a7421115424aa093fed276a08d0c76bea0"} Apr 16 13:14:45.347783 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:45.347792 2571 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a8e59d14-0f39-46aa-9acb-073f7dc47e12","Type":"ContainerStarted","Data":"de7bafcd65f5cc77e3c69dd11d960f045814ff959f03e06491e17f37bc24cc5c"} Apr 16 13:14:45.348224 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:45.347800 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a8e59d14-0f39-46aa-9acb-073f7dc47e12","Type":"ContainerStarted","Data":"24b87e6613d1798fe0907c84b6a1b39daed3aec648a09862c6b7b04e53e94928"} Apr 16 13:14:45.374502 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:45.374416 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.17418209 podStartE2EDuration="9.374403264s" podCreationTimestamp="2026-04-16 13:14:36 +0000 UTC" firstStartedPulling="2026-04-16 13:14:37.240887434 +0000 UTC m=+182.123243503" lastFinishedPulling="2026-04-16 13:14:44.441108595 +0000 UTC m=+189.323464677" observedRunningTime="2026-04-16 13:14:45.372240136 +0000 UTC m=+190.254596226" watchObservedRunningTime="2026-04-16 13:14:45.374403264 +0000 UTC m=+190.256759353" Apr 16 13:14:47.067137 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:47.067098 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:14:55.252456 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:55.252424 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-6685444475-7kmjj" Apr 16 13:14:55.252456 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:14:55.252465 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6685444475-7kmjj" Apr 16 13:15:10.422421 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:10.422386 2571 generic.go:358] "Generic (PLEG): container finished" podID="4634cc1d-d3ff-44bd-9edd-28902d1bbd65" containerID="f64d233a82abbc5d82ada7bfef645ba8acec8e79cc00f72c9b8c5e0cef707df4" exitCode=0 Apr 16 13:15:10.422780 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:10.422458 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-xdltb" event={"ID":"4634cc1d-d3ff-44bd-9edd-28902d1bbd65","Type":"ContainerDied","Data":"f64d233a82abbc5d82ada7bfef645ba8acec8e79cc00f72c9b8c5e0cef707df4"} Apr 16 13:15:10.422821 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:10.422799 2571 scope.go:117] "RemoveContainer" containerID="f64d233a82abbc5d82ada7bfef645ba8acec8e79cc00f72c9b8c5e0cef707df4" Apr 16 13:15:11.426709 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:11.426676 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-xdltb" event={"ID":"4634cc1d-d3ff-44bd-9edd-28902d1bbd65","Type":"ContainerStarted","Data":"75b3495bf4bc6c614ad4823a484cf4ee1b35dab2c3068d9bd3200aba68d4e0dd"} Apr 16 13:15:15.257639 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:15.257605 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6685444475-7kmjj" Apr 16 13:15:15.261580 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:15.261558 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6685444475-7kmjj" Apr 16 13:15:37.067488 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:37.067453 2571 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:15:37.083052 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:37.083027 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:15:37.523188 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:37.523161 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:15:47.478620 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:47.478541 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs\") pod \"network-metrics-daemon-h8fnx\" (UID: \"f57b15af-9441-4822-9c41-048d94ab4c1a\") " pod="openshift-multus/network-metrics-daemon-h8fnx" Apr 16 13:15:47.480770 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:47.480750 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/f57b15af-9441-4822-9c41-048d94ab4c1a-metrics-certs\") pod \"network-metrics-daemon-h8fnx\" (UID: \"f57b15af-9441-4822-9c41-048d94ab4c1a\") " pod="openshift-multus/network-metrics-daemon-h8fnx" Apr 16 13:15:47.527089 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:47.527065 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-hkf74\"" Apr 16 13:15:47.534912 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:47.534894 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-h8fnx" Apr 16 13:15:47.861758 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:47.861680 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-h8fnx"] Apr 16 13:15:47.864290 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:15:47.864257 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf57b15af_9441_4822_9c41_048d94ab4c1a.slice/crio-ab4fad20b70983270b526b78f68589cc72373c69da0f12cf20525e7f5a2e5644 WatchSource:0}: Error finding container ab4fad20b70983270b526b78f68589cc72373c69da0f12cf20525e7f5a2e5644: Status 404 returned error can't find the container with id ab4fad20b70983270b526b78f68589cc72373c69da0f12cf20525e7f5a2e5644 Apr 16 13:15:48.541290 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:48.541251 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h8fnx" event={"ID":"f57b15af-9441-4822-9c41-048d94ab4c1a","Type":"ContainerStarted","Data":"ab4fad20b70983270b526b78f68589cc72373c69da0f12cf20525e7f5a2e5644"} Apr 16 13:15:49.545350 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:49.545314 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h8fnx" event={"ID":"f57b15af-9441-4822-9c41-048d94ab4c1a","Type":"ContainerStarted","Data":"9a1a4a9f1a202fc2a40438508459640b2468db5b6af76690c5abfd4d1e49403d"} Apr 16 13:15:49.545350 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:49.545355 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-h8fnx" event={"ID":"f57b15af-9441-4822-9c41-048d94ab4c1a","Type":"ContainerStarted","Data":"9ed099c2ccdaa46d5dc88aabf329218b52761338614aca6b898186d527733e63"} Apr 16 13:15:49.561457 
ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:49.561404 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-h8fnx" podStartSLOduration=253.658146615 podStartE2EDuration="4m14.561391109s" podCreationTimestamp="2026-04-16 13:11:35 +0000 UTC" firstStartedPulling="2026-04-16 13:15:47.866128741 +0000 UTC m=+252.748484810" lastFinishedPulling="2026-04-16 13:15:48.769373222 +0000 UTC m=+253.651729304" observedRunningTime="2026-04-16 13:15:49.560313247 +0000 UTC m=+254.442669334" watchObservedRunningTime="2026-04-16 13:15:49.561391109 +0000 UTC m=+254.443747199" Apr 16 13:15:55.135901 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.135853 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 13:15:55.136890 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.136843 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerName="kube-rbac-proxy" containerID="cri-o://02cf1e17662f6192a000278df309c4a7421115424aa093fed276a08d0c76bea0" gracePeriod=600 Apr 16 13:15:55.137021 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.136899 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerName="thanos-sidecar" containerID="cri-o://24b87e6613d1798fe0907c84b6a1b39daed3aec648a09862c6b7b04e53e94928" gracePeriod=600 Apr 16 13:15:55.137021 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.136907 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerName="kube-rbac-proxy-thanos" containerID="cri-o://34448bedf88f7d13966498383e522ffc1c3f0892a0b264cf6d74ec99eb5c8601" gracePeriod=600 Apr 16 13:15:55.137136 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.137020 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerName="config-reloader" containerID="cri-o://7675c2c0d7aa551906f1d89596f1054e5d79c2da8dd58c85d28727ab40524fc1" gracePeriod=600 Apr 16 13:15:55.137136 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.137017 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerName="kube-rbac-proxy-web" containerID="cri-o://de7bafcd65f5cc77e3c69dd11d960f045814ff959f03e06491e17f37bc24cc5c" gracePeriod=600 Apr 16 13:15:55.137136 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.136838 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerName="prometheus" containerID="cri-o://0903a64fc41df686d1641b65c578adc5608b4b501aa8cea62cff6107b1cd4757" gracePeriod=600 Apr 16 13:15:55.383399 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.383050 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:15:55.543794 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.543766 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-kube-rbac-proxy\") pod \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " Apr 16 13:15:55.543982 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.543809 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-prometheus-k8s-tls\") pod \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " Apr 16 13:15:55.543982 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.543837 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-configmap-kubelet-serving-ca-bundle\") pod \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " Apr 16 13:15:55.543982 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.543901 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jwfsk\" (UniqueName: \"kubernetes.io/projected/a8e59d14-0f39-46aa-9acb-073f7dc47e12-kube-api-access-jwfsk\") pod \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " Apr 16 13:15:55.544140 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.544073 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a8e59d14-0f39-46aa-9acb-073f7dc47e12-prometheus-k8s-db\") pod \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " Apr 16 13:15:55.544140 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.544107 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-config\") pod \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " Apr 16 13:15:55.544140 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.544132 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-configmap-metrics-client-ca\") pod \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " Apr 16 13:15:55.544292 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.544166 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-configmap-serving-certs-ca-bundle\") pod \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " Apr 16 13:15:55.544292 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.544213 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a8e59d14-0f39-46aa-9acb-073f7dc47e12-config-out\") pod \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") 
" Apr 16 13:15:55.544292 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.544256 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "a8e59d14-0f39-46aa-9acb-073f7dc47e12" (UID: "a8e59d14-0f39-46aa-9acb-073f7dc47e12"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 13:15:55.544733 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.544667 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "a8e59d14-0f39-46aa-9acb-073f7dc47e12" (UID: "a8e59d14-0f39-46aa-9acb-073f7dc47e12"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 13:15:55.544733 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.544703 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "a8e59d14-0f39-46aa-9acb-073f7dc47e12" (UID: "a8e59d14-0f39-46aa-9acb-073f7dc47e12"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 13:15:55.544733 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.544728 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-web-config\") pod \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " Apr 16 13:15:55.544976 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.544777 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-grpc-tls\") pod \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " Apr 16 13:15:55.544976 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.544809 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-metrics-client-certs\") pod \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " Apr 16 13:15:55.544976 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.544844 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-prometheus-k8s-rulefiles-0\") pod \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " Apr 16 13:15:55.544976 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.544887 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " Apr 16 13:15:55.544976 ip-10-0-141-234 kubenswrapper[2571]: 
I0416 13:15:55.544937 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-prometheus-trusted-ca-bundle\") pod \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " Apr 16 13:15:55.545227 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.544982 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " Apr 16 13:15:55.545227 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.545014 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a8e59d14-0f39-46aa-9acb-073f7dc47e12-tls-assets\") pod \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " Apr 16 13:15:55.545227 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.545039 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-thanos-prometheus-http-client-file\") pod \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\" (UID: \"a8e59d14-0f39-46aa-9acb-073f7dc47e12\") " Apr 16 13:15:55.545382 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.545308 2571 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-configmap-metrics-client-ca\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:15:55.545382 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.545327 2571 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:15:55.545382 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.545342 2571 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:15:55.545545 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.545393 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8e59d14-0f39-46aa-9acb-073f7dc47e12-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "a8e59d14-0f39-46aa-9acb-073f7dc47e12" (UID: "a8e59d14-0f39-46aa-9acb-073f7dc47e12"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 13:15:55.546311 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.546281 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "a8e59d14-0f39-46aa-9acb-073f7dc47e12" (UID: "a8e59d14-0f39-46aa-9acb-073f7dc47e12"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 13:15:55.546599 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.546556 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "a8e59d14-0f39-46aa-9acb-073f7dc47e12" (UID: "a8e59d14-0f39-46aa-9acb-073f7dc47e12"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 13:15:55.547047 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.547018 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "a8e59d14-0f39-46aa-9acb-073f7dc47e12" (UID: "a8e59d14-0f39-46aa-9acb-073f7dc47e12"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 13:15:55.547989 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.547943 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-config" (OuterVolumeSpecName: "config") pod "a8e59d14-0f39-46aa-9acb-073f7dc47e12" (UID: "a8e59d14-0f39-46aa-9acb-073f7dc47e12"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 13:15:55.548243 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.548215 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "a8e59d14-0f39-46aa-9acb-073f7dc47e12" (UID: "a8e59d14-0f39-46aa-9acb-073f7dc47e12"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 13:15:55.548343 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.548240 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8e59d14-0f39-46aa-9acb-073f7dc47e12-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "a8e59d14-0f39-46aa-9acb-073f7dc47e12" (UID: "a8e59d14-0f39-46aa-9acb-073f7dc47e12"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 13:15:55.548343 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.548271 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "a8e59d14-0f39-46aa-9acb-073f7dc47e12" (UID: "a8e59d14-0f39-46aa-9acb-073f7dc47e12"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 13:15:55.548462 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.548383 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8e59d14-0f39-46aa-9acb-073f7dc47e12-kube-api-access-jwfsk" (OuterVolumeSpecName: "kube-api-access-jwfsk") pod "a8e59d14-0f39-46aa-9acb-073f7dc47e12" (UID: "a8e59d14-0f39-46aa-9acb-073f7dc47e12"). InnerVolumeSpecName "kube-api-access-jwfsk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 13:15:55.548656 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.548626 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "a8e59d14-0f39-46aa-9acb-073f7dc47e12" (UID: "a8e59d14-0f39-46aa-9acb-073f7dc47e12"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 13:15:55.548744 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.548673 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "a8e59d14-0f39-46aa-9acb-073f7dc47e12" (UID: "a8e59d14-0f39-46aa-9acb-073f7dc47e12"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 13:15:55.548899 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.548856 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8e59d14-0f39-46aa-9acb-073f7dc47e12-config-out" (OuterVolumeSpecName: "config-out") pod "a8e59d14-0f39-46aa-9acb-073f7dc47e12" (UID: "a8e59d14-0f39-46aa-9acb-073f7dc47e12"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 13:15:55.549273 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.549249 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "a8e59d14-0f39-46aa-9acb-073f7dc47e12" (UID: "a8e59d14-0f39-46aa-9acb-073f7dc47e12"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 13:15:55.549437 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.549409 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "a8e59d14-0f39-46aa-9acb-073f7dc47e12" (UID: "a8e59d14-0f39-46aa-9acb-073f7dc47e12"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 13:15:55.558059 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.558040 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-web-config" (OuterVolumeSpecName: "web-config") pod "a8e59d14-0f39-46aa-9acb-073f7dc47e12" (UID: "a8e59d14-0f39-46aa-9acb-073f7dc47e12"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 13:15:55.567018 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.566998 2571 generic.go:358] "Generic (PLEG): container finished" podID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerID="34448bedf88f7d13966498383e522ffc1c3f0892a0b264cf6d74ec99eb5c8601" exitCode=0 Apr 16 13:15:55.567018 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.567016 2571 generic.go:358] "Generic (PLEG): container finished" podID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerID="02cf1e17662f6192a000278df309c4a7421115424aa093fed276a08d0c76bea0" exitCode=0 Apr 16 13:15:55.567135 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.567022 2571 generic.go:358] "Generic (PLEG): container finished" podID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerID="de7bafcd65f5cc77e3c69dd11d960f045814ff959f03e06491e17f37bc24cc5c" exitCode=0 Apr 16 13:15:55.567135 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.567028 2571 generic.go:358] "Generic (PLEG): container finished" podID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerID="24b87e6613d1798fe0907c84b6a1b39daed3aec648a09862c6b7b04e53e94928" exitCode=0 Apr 16 13:15:55.567135 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.567033 2571 generic.go:358] "Generic (PLEG): container finished" podID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerID="7675c2c0d7aa551906f1d89596f1054e5d79c2da8dd58c85d28727ab40524fc1" exitCode=0 Apr 16 13:15:55.567135 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.567037 2571 generic.go:358] "Generic (PLEG): container finished" podID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerID="0903a64fc41df686d1641b65c578adc5608b4b501aa8cea62cff6107b1cd4757" exitCode=0 Apr 16 13:15:55.567135 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.567067 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a8e59d14-0f39-46aa-9acb-073f7dc47e12","Type":"ContainerDied","Data":"34448bedf88f7d13966498383e522ffc1c3f0892a0b264cf6d74ec99eb5c8601"} Apr 16 13:15:55.567135 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.567097 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a8e59d14-0f39-46aa-9acb-073f7dc47e12","Type":"ContainerDied","Data":"02cf1e17662f6192a000278df309c4a7421115424aa093fed276a08d0c76bea0"} Apr 16 13:15:55.567135 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.567099 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:15:55.567135 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.567109 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a8e59d14-0f39-46aa-9acb-073f7dc47e12","Type":"ContainerDied","Data":"de7bafcd65f5cc77e3c69dd11d960f045814ff959f03e06491e17f37bc24cc5c"} Apr 16 13:15:55.567135 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.567118 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a8e59d14-0f39-46aa-9acb-073f7dc47e12","Type":"ContainerDied","Data":"24b87e6613d1798fe0907c84b6a1b39daed3aec648a09862c6b7b04e53e94928"} Apr 16 13:15:55.567135 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.567127 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a8e59d14-0f39-46aa-9acb-073f7dc47e12","Type":"ContainerDied","Data":"7675c2c0d7aa551906f1d89596f1054e5d79c2da8dd58c85d28727ab40524fc1"} Apr 16 13:15:55.567135 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.567136 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a8e59d14-0f39-46aa-9acb-073f7dc47e12","Type":"ContainerDied","Data":"0903a64fc41df686d1641b65c578adc5608b4b501aa8cea62cff6107b1cd4757"} Apr 16 13:15:55.567483 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.567144 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a8e59d14-0f39-46aa-9acb-073f7dc47e12","Type":"ContainerDied","Data":"05fff3f2f959e2b932287fd7b3b6bb1c35983ef803211ea5a871bcf2327fd33c"} Apr 16 13:15:55.567483 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.567168 2571 scope.go:117] "RemoveContainer" containerID="34448bedf88f7d13966498383e522ffc1c3f0892a0b264cf6d74ec99eb5c8601" Apr 16 13:15:55.575051 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.575035 2571 scope.go:117] "RemoveContainer" containerID="02cf1e17662f6192a000278df309c4a7421115424aa093fed276a08d0c76bea0" Apr 16 13:15:55.581352 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.581334 2571 scope.go:117] "RemoveContainer" containerID="de7bafcd65f5cc77e3c69dd11d960f045814ff959f03e06491e17f37bc24cc5c" Apr 16 13:15:55.587422 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.587407 2571 scope.go:117] "RemoveContainer" containerID="24b87e6613d1798fe0907c84b6a1b39daed3aec648a09862c6b7b04e53e94928" Apr 16 13:15:55.592880 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.592849 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 13:15:55.593822 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.593800 2571 scope.go:117] "RemoveContainer" containerID="7675c2c0d7aa551906f1d89596f1054e5d79c2da8dd58c85d28727ab40524fc1" Apr 16 13:15:55.597512 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.597492 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 13:15:55.600337 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.600322 2571 scope.go:117] "RemoveContainer" containerID="0903a64fc41df686d1641b65c578adc5608b4b501aa8cea62cff6107b1cd4757" Apr 16 13:15:55.606646 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.606631 2571 scope.go:117] "RemoveContainer" containerID="60890d8ee1f26ae5ad7c8adde2419d91c678618190270c6ad9a2712f4467ebcb" Apr 16 13:15:55.615038 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.615020 
2571 scope.go:117] "RemoveContainer" containerID="34448bedf88f7d13966498383e522ffc1c3f0892a0b264cf6d74ec99eb5c8601" Apr 16 13:15:55.615277 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:15:55.615259 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34448bedf88f7d13966498383e522ffc1c3f0892a0b264cf6d74ec99eb5c8601\": container with ID starting with 34448bedf88f7d13966498383e522ffc1c3f0892a0b264cf6d74ec99eb5c8601 not found: ID does not exist" containerID="34448bedf88f7d13966498383e522ffc1c3f0892a0b264cf6d74ec99eb5c8601" Apr 16 13:15:55.615329 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.615288 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34448bedf88f7d13966498383e522ffc1c3f0892a0b264cf6d74ec99eb5c8601"} err="failed to get container status \"34448bedf88f7d13966498383e522ffc1c3f0892a0b264cf6d74ec99eb5c8601\": rpc error: code = NotFound desc = could not find container \"34448bedf88f7d13966498383e522ffc1c3f0892a0b264cf6d74ec99eb5c8601\": container with ID starting with 34448bedf88f7d13966498383e522ffc1c3f0892a0b264cf6d74ec99eb5c8601 not found: ID does not exist" Apr 16 13:15:55.615370 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.615329 2571 scope.go:117] "RemoveContainer" containerID="02cf1e17662f6192a000278df309c4a7421115424aa093fed276a08d0c76bea0" Apr 16 13:15:55.615559 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:15:55.615542 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02cf1e17662f6192a000278df309c4a7421115424aa093fed276a08d0c76bea0\": container with ID starting with 02cf1e17662f6192a000278df309c4a7421115424aa093fed276a08d0c76bea0 not found: ID does not exist" containerID="02cf1e17662f6192a000278df309c4a7421115424aa093fed276a08d0c76bea0" Apr 16 13:15:55.615623 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.615569 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02cf1e17662f6192a000278df309c4a7421115424aa093fed276a08d0c76bea0"} err="failed to get container status \"02cf1e17662f6192a000278df309c4a7421115424aa093fed276a08d0c76bea0\": rpc error: code = NotFound desc = could not find container \"02cf1e17662f6192a000278df309c4a7421115424aa093fed276a08d0c76bea0\": container with ID starting with 02cf1e17662f6192a000278df309c4a7421115424aa093fed276a08d0c76bea0 not found: ID does not exist" Apr 16 13:15:55.615623 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.615593 2571 scope.go:117] "RemoveContainer" containerID="de7bafcd65f5cc77e3c69dd11d960f045814ff959f03e06491e17f37bc24cc5c" Apr 16 13:15:55.615832 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:15:55.615815 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"de7bafcd65f5cc77e3c69dd11d960f045814ff959f03e06491e17f37bc24cc5c\": container with ID starting with de7bafcd65f5cc77e3c69dd11d960f045814ff959f03e06491e17f37bc24cc5c not found: ID does not exist" containerID="de7bafcd65f5cc77e3c69dd11d960f045814ff959f03e06491e17f37bc24cc5c" Apr 16 13:15:55.615888 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.615837 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"de7bafcd65f5cc77e3c69dd11d960f045814ff959f03e06491e17f37bc24cc5c"} err="failed to get container status \"de7bafcd65f5cc77e3c69dd11d960f045814ff959f03e06491e17f37bc24cc5c\": rpc error: 
code = NotFound desc = could not find container \"de7bafcd65f5cc77e3c69dd11d960f045814ff959f03e06491e17f37bc24cc5c\": container with ID starting with de7bafcd65f5cc77e3c69dd11d960f045814ff959f03e06491e17f37bc24cc5c not found: ID does not exist" Apr 16 13:15:55.615888 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.615851 2571 scope.go:117] "RemoveContainer" containerID="24b87e6613d1798fe0907c84b6a1b39daed3aec648a09862c6b7b04e53e94928" Apr 16 13:15:55.616113 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:15:55.616094 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24b87e6613d1798fe0907c84b6a1b39daed3aec648a09862c6b7b04e53e94928\": container with ID starting with 24b87e6613d1798fe0907c84b6a1b39daed3aec648a09862c6b7b04e53e94928 not found: ID does not exist" containerID="24b87e6613d1798fe0907c84b6a1b39daed3aec648a09862c6b7b04e53e94928" Apr 16 13:15:55.616148 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.616118 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24b87e6613d1798fe0907c84b6a1b39daed3aec648a09862c6b7b04e53e94928"} err="failed to get container status \"24b87e6613d1798fe0907c84b6a1b39daed3aec648a09862c6b7b04e53e94928\": rpc error: code = NotFound desc = could not find container \"24b87e6613d1798fe0907c84b6a1b39daed3aec648a09862c6b7b04e53e94928\": container with ID starting with 24b87e6613d1798fe0907c84b6a1b39daed3aec648a09862c6b7b04e53e94928 not found: ID does not exist" Apr 16 13:15:55.616148 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.616131 2571 scope.go:117] "RemoveContainer" containerID="7675c2c0d7aa551906f1d89596f1054e5d79c2da8dd58c85d28727ab40524fc1" Apr 16 13:15:55.616350 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:15:55.616331 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7675c2c0d7aa551906f1d89596f1054e5d79c2da8dd58c85d28727ab40524fc1\": container with ID starting with 7675c2c0d7aa551906f1d89596f1054e5d79c2da8dd58c85d28727ab40524fc1 not found: ID does not exist" containerID="7675c2c0d7aa551906f1d89596f1054e5d79c2da8dd58c85d28727ab40524fc1" Apr 16 13:15:55.616392 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.616355 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7675c2c0d7aa551906f1d89596f1054e5d79c2da8dd58c85d28727ab40524fc1"} err="failed to get container status \"7675c2c0d7aa551906f1d89596f1054e5d79c2da8dd58c85d28727ab40524fc1\": rpc error: code = NotFound desc = could not find container \"7675c2c0d7aa551906f1d89596f1054e5d79c2da8dd58c85d28727ab40524fc1\": container with ID starting with 7675c2c0d7aa551906f1d89596f1054e5d79c2da8dd58c85d28727ab40524fc1 not found: ID does not exist" Apr 16 13:15:55.616392 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.616368 2571 scope.go:117] "RemoveContainer" containerID="0903a64fc41df686d1641b65c578adc5608b4b501aa8cea62cff6107b1cd4757" Apr 16 13:15:55.616574 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:15:55.616559 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0903a64fc41df686d1641b65c578adc5608b4b501aa8cea62cff6107b1cd4757\": container with ID starting with 0903a64fc41df686d1641b65c578adc5608b4b501aa8cea62cff6107b1cd4757 not found: ID does not exist" containerID="0903a64fc41df686d1641b65c578adc5608b4b501aa8cea62cff6107b1cd4757" Apr 16 13:15:55.616613 ip-10-0-141-234 
kubenswrapper[2571]: I0416 13:15:55.616578 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0903a64fc41df686d1641b65c578adc5608b4b501aa8cea62cff6107b1cd4757"} err="failed to get container status \"0903a64fc41df686d1641b65c578adc5608b4b501aa8cea62cff6107b1cd4757\": rpc error: code = NotFound desc = could not find container \"0903a64fc41df686d1641b65c578adc5608b4b501aa8cea62cff6107b1cd4757\": container with ID starting with 0903a64fc41df686d1641b65c578adc5608b4b501aa8cea62cff6107b1cd4757 not found: ID does not exist" Apr 16 13:15:55.616613 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.616592 2571 scope.go:117] "RemoveContainer" containerID="60890d8ee1f26ae5ad7c8adde2419d91c678618190270c6ad9a2712f4467ebcb" Apr 16 13:15:55.616787 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:15:55.616771 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60890d8ee1f26ae5ad7c8adde2419d91c678618190270c6ad9a2712f4467ebcb\": container with ID starting with 60890d8ee1f26ae5ad7c8adde2419d91c678618190270c6ad9a2712f4467ebcb not found: ID does not exist" containerID="60890d8ee1f26ae5ad7c8adde2419d91c678618190270c6ad9a2712f4467ebcb" Apr 16 13:15:55.616848 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.616789 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60890d8ee1f26ae5ad7c8adde2419d91c678618190270c6ad9a2712f4467ebcb"} err="failed to get container status \"60890d8ee1f26ae5ad7c8adde2419d91c678618190270c6ad9a2712f4467ebcb\": rpc error: code = NotFound desc = could not find container \"60890d8ee1f26ae5ad7c8adde2419d91c678618190270c6ad9a2712f4467ebcb\": container with ID starting with 60890d8ee1f26ae5ad7c8adde2419d91c678618190270c6ad9a2712f4467ebcb not found: ID does not exist"
[... the same "RemoveContainer" / "DeleteContainer returned error" (NotFound) sequence repeats five more times for the same seven container IDs between 13:15:55.616801 and 13:15:55.625103; the duplicated entries are elided and only the unique entries interleaved with those repeats are kept below ...]
Apr 16 13:15:55.623104 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.623049 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 13:15:55.623376 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.623362 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerName="config-reloader" Apr 16 13:15:55.623423 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.623378 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerName="config-reloader" Apr 16 13:15:55.623423 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.623390 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerName="thanos-sidecar" Apr 16 13:15:55.623423 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.623398 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerName="thanos-sidecar" Apr 16 13:15:55.623569 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.623419 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerName="kube-rbac-proxy-thanos" Apr 16 13:15:55.623569 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.623482 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerName="kube-rbac-proxy-thanos" Apr 16 13:15:55.623569 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.623501 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerName="kube-rbac-proxy" Apr 16 13:15:55.623569 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.623508 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerName="kube-rbac-proxy" Apr 16 13:15:55.623569 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.623520 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerName="init-config-reloader" Apr 16 13:15:55.623569 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.623529 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerName="init-config-reloader" Apr 16 13:15:55.623569 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.623542 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerName="kube-rbac-proxy-web" Apr 16 13:15:55.623569 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.623550 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerName="kube-rbac-proxy-web" Apr 16 13:15:55.623569 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.623561 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerName="prometheus" Apr 16 13:15:55.623569 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.623570 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerName="prometheus" Apr 16 13:15:55.623923 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.623652 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerName="config-reloader" Apr 16 13:15:55.623923 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.623661 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerName="kube-rbac-proxy" Apr 16 13:15:55.623923 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.623667 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerName="prometheus" Apr 16 13:15:55.623923 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.623672 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerName="kube-rbac-proxy-web" Apr 16 13:15:55.623923 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.623679 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerName="thanos-sidecar" Apr 16 13:15:55.623923 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.623752 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" containerName="kube-rbac-proxy-thanos" Apr 16 13:15:55.628572 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.628555 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:15:55.631325 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.631141 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 13:15:55.631325 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.631167 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-49uk17uc5i1ir\"" Apr 16 13:15:55.631325 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.631177 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 13:15:55.631325 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.631204 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 13:15:55.631325 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.631219 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 13:15:55.631615 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.631557 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 13:15:55.631615 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.631584 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 13:15:55.631995 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.631977 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 13:15:55.632142 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.632018 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 13:15:55.632142 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.632067 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 13:15:55.632250 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.632224 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-qqm9p\"" Apr 16 13:15:55.632291 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.632249 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 13:15:55.634727 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.634708 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 13:15:55.641541 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.640051 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 13:15:55.641541 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.641154 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 13:15:55.646074 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.646053 2571 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-prometheus-k8s-tls\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:15:55.646074 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.646072 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jwfsk\" (UniqueName: \"kubernetes.io/projected/a8e59d14-0f39-46aa-9acb-073f7dc47e12-kube-api-access-jwfsk\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:15:55.646228 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.646084 2571 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a8e59d14-0f39-46aa-9acb-073f7dc47e12-prometheus-k8s-db\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:15:55.646228 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.646094 2571 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-config\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:15:55.646228 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.646103 2571 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a8e59d14-0f39-46aa-9acb-073f7dc47e12-config-out\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:15:55.646228 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.646111 2571 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-web-config\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:15:55.646228 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.646118 2571 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-grpc-tls\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:15:55.646228 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.646126 2571 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-metrics-client-certs\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:15:55.646228 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.646134 2571 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:15:55.646228 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.646143 2571 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:15:55.646228 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.646152 2571 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a8e59d14-0f39-46aa-9acb-073f7dc47e12-prometheus-trusted-ca-bundle\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:15:55.646228 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.646161 2571 reconciler_common.go:299] "Volume detached for volume 
\"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:15:55.646228 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.646169 2571 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a8e59d14-0f39-46aa-9acb-073f7dc47e12-tls-assets\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:15:55.646228 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.646178 2571 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-thanos-prometheus-http-client-file\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:15:55.646228 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.646186 2571 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a8e59d14-0f39-46aa-9acb-073f7dc47e12-secret-kube-rbac-proxy\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:15:55.723937 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.723912 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8e59d14-0f39-46aa-9acb-073f7dc47e12" path="/var/lib/kubelet/pods/a8e59d14-0f39-46aa-9acb-073f7dc47e12/volumes" Apr 16 13:15:55.747323 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.747296 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/40e30a81-7c1d-468e-b2be-f495297f613d-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:15:55.747432 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.747345 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/40e30a81-7c1d-468e-b2be-f495297f613d-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:15:55.747432 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.747370 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/40e30a81-7c1d-468e-b2be-f495297f613d-config-out\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:15:55.747432 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.747395 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/40e30a81-7c1d-468e-b2be-f495297f613d-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:15:55.747432 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.747420 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40e30a81-7c1d-468e-b2be-f495297f613d-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:15:55.747641 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.747473 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/40e30a81-7c1d-468e-b2be-f495297f613d-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:15:55.747641 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.747507 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/40e30a81-7c1d-468e-b2be-f495297f613d-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:15:55.747641 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.747539 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/40e30a81-7c1d-468e-b2be-f495297f613d-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:15:55.747641 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.747577 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40e30a81-7c1d-468e-b2be-f495297f613d-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:15:55.747641 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.747604 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/40e30a81-7c1d-468e-b2be-f495297f613d-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:15:55.747641 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.747627 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/40e30a81-7c1d-468e-b2be-f495297f613d-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:15:55.747862 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.747683 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/40e30a81-7c1d-468e-b2be-f495297f613d-config\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 13:15:55.747862 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.747704 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/40e30a81-7c1d-468e-b2be-f495297f613d-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
13:15:55.747862 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.747719 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/40e30a81-7c1d-468e-b2be-f495297f613d-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.747862 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.747736 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/40e30a81-7c1d-468e-b2be-f495297f613d-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.747862 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.747751 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40e30a81-7c1d-468e-b2be-f495297f613d-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.747862 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.747766 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz6gr\" (UniqueName: \"kubernetes.io/projected/40e30a81-7c1d-468e-b2be-f495297f613d-kube-api-access-tz6gr\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.747862 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.747783 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/40e30a81-7c1d-468e-b2be-f495297f613d-web-config\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.848803 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.848730 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/40e30a81-7c1d-468e-b2be-f495297f613d-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.848803 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.848784 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/40e30a81-7c1d-468e-b2be-f495297f613d-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.848992 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.848810 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/40e30a81-7c1d-468e-b2be-f495297f613d-config-out\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.848992 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.848833 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/40e30a81-7c1d-468e-b2be-f495297f613d-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.848992 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.848858 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40e30a81-7c1d-468e-b2be-f495297f613d-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.848992 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.848913 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/40e30a81-7c1d-468e-b2be-f495297f613d-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.848992 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.848946 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/40e30a81-7c1d-468e-b2be-f495297f613d-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.848992 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.848975 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/40e30a81-7c1d-468e-b2be-f495297f613d-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.849278 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.849012 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40e30a81-7c1d-468e-b2be-f495297f613d-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.849278 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.849040 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/40e30a81-7c1d-468e-b2be-f495297f613d-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.849278 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.849064 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/40e30a81-7c1d-468e-b2be-f495297f613d-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.849278 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.849111 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/40e30a81-7c1d-468e-b2be-f495297f613d-config\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.849278 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.849135 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/40e30a81-7c1d-468e-b2be-f495297f613d-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.849278 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.849179 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/40e30a81-7c1d-468e-b2be-f495297f613d-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.849899 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.849856 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40e30a81-7c1d-468e-b2be-f495297f613d-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.850107 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.850082 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40e30a81-7c1d-468e-b2be-f495297f613d-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.850182 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.850138 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/40e30a81-7c1d-468e-b2be-f495297f613d-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.850182 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.850170 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/40e30a81-7c1d-468e-b2be-f495297f613d-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.850288 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.850197 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40e30a81-7c1d-468e-b2be-f495297f613d-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.850288 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.850222 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tz6gr\" (UniqueName: \"kubernetes.io/projected/40e30a81-7c1d-468e-b2be-f495297f613d-kube-api-access-tz6gr\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.850288 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.850250 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/40e30a81-7c1d-468e-b2be-f495297f613d-web-config\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.852793 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.851923 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/40e30a81-7c1d-468e-b2be-f495297f613d-config-out\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.852793 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.851943 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/40e30a81-7c1d-468e-b2be-f495297f613d-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.852793 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.851967 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/40e30a81-7c1d-468e-b2be-f495297f613d-config\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.852793 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.851945 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/40e30a81-7c1d-468e-b2be-f495297f613d-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.852793 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.852069 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/40e30a81-7c1d-468e-b2be-f495297f613d-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.852793 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.852541 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/40e30a81-7c1d-468e-b2be-f495297f613d-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.852793 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.852572 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/40e30a81-7c1d-468e-b2be-f495297f613d-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.852793 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.852640 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40e30a81-7c1d-468e-b2be-f495297f613d-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.852793 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.852742 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/40e30a81-7c1d-468e-b2be-f495297f613d-web-config\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.853702 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.853679 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/40e30a81-7c1d-468e-b2be-f495297f613d-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.854090 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.854071 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/40e30a81-7c1d-468e-b2be-f495297f613d-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.854374 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.854353 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/40e30a81-7c1d-468e-b2be-f495297f613d-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.854495 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.854478 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/40e30a81-7c1d-468e-b2be-f495297f613d-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.854739 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.854723 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/40e30a81-7c1d-468e-b2be-f495297f613d-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.859618 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.859601 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz6gr\" (UniqueName: \"kubernetes.io/projected/40e30a81-7c1d-468e-b2be-f495297f613d-kube-api-access-tz6gr\") pod \"prometheus-k8s-0\" (UID: \"40e30a81-7c1d-468e-b2be-f495297f613d\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:55.939745 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:55.939715 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:15:56.065082 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:56.065056 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 13:15:56.066159 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:15:56.066136 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40e30a81_7c1d_468e_b2be_f495297f613d.slice/crio-29cafffd4cbc408d500f3762087efae9fe03b6101290b83aa6bc4e28d62705b0 WatchSource:0}: Error finding container 29cafffd4cbc408d500f3762087efae9fe03b6101290b83aa6bc4e28d62705b0: Status 404 returned error can't find the container with id 29cafffd4cbc408d500f3762087efae9fe03b6101290b83aa6bc4e28d62705b0
Apr 16 13:15:56.571992 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:56.571959 2571 generic.go:358] "Generic (PLEG): container finished" podID="40e30a81-7c1d-468e-b2be-f495297f613d" containerID="609ee64b4b7f25e9a36a0d4adf66ecfaa9991f814d90916ba3f6c375dcba2c7e" exitCode=0
Apr 16 13:15:56.572367 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:56.572046 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"40e30a81-7c1d-468e-b2be-f495297f613d","Type":"ContainerDied","Data":"609ee64b4b7f25e9a36a0d4adf66ecfaa9991f814d90916ba3f6c375dcba2c7e"}
Apr 16 13:15:56.572367 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:56.572080 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"40e30a81-7c1d-468e-b2be-f495297f613d","Type":"ContainerStarted","Data":"29cafffd4cbc408d500f3762087efae9fe03b6101290b83aa6bc4e28d62705b0"}
Apr 16 13:15:57.579025 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:57.578991 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"40e30a81-7c1d-468e-b2be-f495297f613d","Type":"ContainerStarted","Data":"d17cfd207b84651aa7c65b8f7ed2fed460f3b687e0933c07c83a6a781b65720b"}
Apr 16 13:15:57.579025 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:57.579025 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"40e30a81-7c1d-468e-b2be-f495297f613d","Type":"ContainerStarted","Data":"503038b8bd7a0a5eff1bdd61ca836258f066cad6788ea980797bbabfc5c207fc"}
Apr 16 13:15:57.579025 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:57.579034 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"40e30a81-7c1d-468e-b2be-f495297f613d","Type":"ContainerStarted","Data":"d112554b566e5f5be08a5ba245ddf7d58af7db92def6d1e53200737f0193f4f4"}
Apr 16 13:15:57.579437 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:57.579043 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"40e30a81-7c1d-468e-b2be-f495297f613d","Type":"ContainerStarted","Data":"4e8934447dec4ffd25e93a7fef3e58a1da561eb41d2ab439e2d8d5f447d61330"}
Apr 16 13:15:57.579437 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:57.579051 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"40e30a81-7c1d-468e-b2be-f495297f613d","Type":"ContainerStarted","Data":"e7c27a986a55d3964205808fc810a9a25a9fd3603c2bb5febd16319dda84fc5a"}
Apr 16 13:15:57.579437 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:57.579059 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"40e30a81-7c1d-468e-b2be-f495297f613d","Type":"ContainerStarted","Data":"0a092c4eddd581e217fe4af44ac812460ee345fd39490d76b9217201f34709e9"}
Apr 16 13:15:57.608602 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:15:57.608540 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.608527021 podStartE2EDuration="2.608527021s" podCreationTimestamp="2026-04-16 13:15:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:15:57.60632722 +0000 UTC m=+262.488683337" watchObservedRunningTime="2026-04-16 13:15:57.608527021 +0000 UTC m=+262.490883111"
Apr 16 13:16:00.940386 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:16:00.940352 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:16:35.610903 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:16:35.610850 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hdhfq_c031fad3-56b2-4030-9fb7-11cd3421145d/ovn-acl-logging/0.log"
Apr 16 13:16:35.611291 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:16:35.611069 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hdhfq_c031fad3-56b2-4030-9fb7-11cd3421145d/ovn-acl-logging/0.log"
Apr 16 13:16:35.615758 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:16:35.615740 2571 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 13:16:55.940381 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:16:55.940348 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:16:55.955449 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:16:55.955422 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:16:56.772744 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:16:56.772718 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 13:17:37.861461 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:37.861383 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-64f4647cd-vjj4c"]
Apr 16 13:17:37.864724 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:37.864705 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-64f4647cd-vjj4c"
Apr 16 13:17:37.867253 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:37.867231 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 16 13:17:37.867360 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:37.867237 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 16 13:17:37.867360 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:37.867296 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-vt9jf\""
Apr 16 13:17:37.867360 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:37.867313 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 16 13:17:37.868279 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:37.868249 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 16 13:17:37.868341 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:37.868288 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 16 13:17:37.874631 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:37.874612 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-64f4647cd-vjj4c"]
Apr 16 13:17:37.900568 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:37.900548 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/b054e48f-a869-44ce-982d-a9ca5b8dc577-metrics-cert\") pod \"lws-controller-manager-64f4647cd-vjj4c\" (UID: \"b054e48f-a869-44ce-982d-a9ca5b8dc577\") " pod="openshift-lws-operator/lws-controller-manager-64f4647cd-vjj4c"
Apr 16 13:17:37.900666 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:37.900585 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgdxz\" (UniqueName: \"kubernetes.io/projected/b054e48f-a869-44ce-982d-a9ca5b8dc577-kube-api-access-fgdxz\") pod \"lws-controller-manager-64f4647cd-vjj4c\" (UID: \"b054e48f-a869-44ce-982d-a9ca5b8dc577\") " pod="openshift-lws-operator/lws-controller-manager-64f4647cd-vjj4c"
Apr 16 13:17:37.900666 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:37.900618 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b054e48f-a869-44ce-982d-a9ca5b8dc577-manager-config\") pod \"lws-controller-manager-64f4647cd-vjj4c\" (UID: \"b054e48f-a869-44ce-982d-a9ca5b8dc577\") " pod="openshift-lws-operator/lws-controller-manager-64f4647cd-vjj4c"
Apr 16 13:17:37.900739 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:37.900663 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b054e48f-a869-44ce-982d-a9ca5b8dc577-cert\") pod \"lws-controller-manager-64f4647cd-vjj4c\" (UID: \"b054e48f-a869-44ce-982d-a9ca5b8dc577\") " pod="openshift-lws-operator/lws-controller-manager-64f4647cd-vjj4c"
Apr 16 13:17:38.002069 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:38.002042 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/b054e48f-a869-44ce-982d-a9ca5b8dc577-metrics-cert\") pod \"lws-controller-manager-64f4647cd-vjj4c\" (UID: \"b054e48f-a869-44ce-982d-a9ca5b8dc577\") " pod="openshift-lws-operator/lws-controller-manager-64f4647cd-vjj4c"
Apr 16 13:17:38.002215 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:38.002085 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgdxz\" (UniqueName: \"kubernetes.io/projected/b054e48f-a869-44ce-982d-a9ca5b8dc577-kube-api-access-fgdxz\") pod \"lws-controller-manager-64f4647cd-vjj4c\" (UID: \"b054e48f-a869-44ce-982d-a9ca5b8dc577\") " pod="openshift-lws-operator/lws-controller-manager-64f4647cd-vjj4c"
Apr 16 13:17:38.002215 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:38.002116 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b054e48f-a869-44ce-982d-a9ca5b8dc577-manager-config\") pod \"lws-controller-manager-64f4647cd-vjj4c\" (UID: \"b054e48f-a869-44ce-982d-a9ca5b8dc577\") " pod="openshift-lws-operator/lws-controller-manager-64f4647cd-vjj4c"
Apr 16 13:17:38.002215 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:38.002132 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b054e48f-a869-44ce-982d-a9ca5b8dc577-cert\") pod \"lws-controller-manager-64f4647cd-vjj4c\" (UID: \"b054e48f-a869-44ce-982d-a9ca5b8dc577\") " pod="openshift-lws-operator/lws-controller-manager-64f4647cd-vjj4c"
Apr 16 13:17:38.002661 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:38.002639 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/b054e48f-a869-44ce-982d-a9ca5b8dc577-manager-config\") pod \"lws-controller-manager-64f4647cd-vjj4c\" (UID: \"b054e48f-a869-44ce-982d-a9ca5b8dc577\") " pod="openshift-lws-operator/lws-controller-manager-64f4647cd-vjj4c"
Apr 16 13:17:38.004491 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:38.004468 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b054e48f-a869-44ce-982d-a9ca5b8dc577-cert\") pod \"lws-controller-manager-64f4647cd-vjj4c\" (UID: \"b054e48f-a869-44ce-982d-a9ca5b8dc577\") " pod="openshift-lws-operator/lws-controller-manager-64f4647cd-vjj4c"
Apr 16 13:17:38.004637 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:38.004619 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/b054e48f-a869-44ce-982d-a9ca5b8dc577-metrics-cert\") pod \"lws-controller-manager-64f4647cd-vjj4c\" (UID: \"b054e48f-a869-44ce-982d-a9ca5b8dc577\") " pod="openshift-lws-operator/lws-controller-manager-64f4647cd-vjj4c"
Apr 16 13:17:38.012979 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:38.012953 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgdxz\" (UniqueName: \"kubernetes.io/projected/b054e48f-a869-44ce-982d-a9ca5b8dc577-kube-api-access-fgdxz\") pod \"lws-controller-manager-64f4647cd-vjj4c\" (UID: \"b054e48f-a869-44ce-982d-a9ca5b8dc577\") " pod="openshift-lws-operator/lws-controller-manager-64f4647cd-vjj4c"
Apr 16 13:17:38.175470 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:38.175392 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-64f4647cd-vjj4c"
Apr 16 13:17:38.291778 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:38.291754 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-64f4647cd-vjj4c"]
Apr 16 13:17:38.294182 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:17:38.294153 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb054e48f_a869_44ce_982d_a9ca5b8dc577.slice/crio-f13d4f854484d62301e40d8f8479293ec2c7bbdfabf1564156f0dcb94492aeaa WatchSource:0}: Error finding container f13d4f854484d62301e40d8f8479293ec2c7bbdfabf1564156f0dcb94492aeaa: Status 404 returned error can't find the container with id f13d4f854484d62301e40d8f8479293ec2c7bbdfabf1564156f0dcb94492aeaa
Apr 16 13:17:38.295955 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:38.295937 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 13:17:38.875098 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:38.875067 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-64f4647cd-vjj4c" event={"ID":"b054e48f-a869-44ce-982d-a9ca5b8dc577","Type":"ContainerStarted","Data":"f13d4f854484d62301e40d8f8479293ec2c7bbdfabf1564156f0dcb94492aeaa"}
Apr 16 13:17:41.885911 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:41.885855 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-64f4647cd-vjj4c" event={"ID":"b054e48f-a869-44ce-982d-a9ca5b8dc577","Type":"ContainerStarted","Data":"43069df32e45f4cbd97b30e288bb24fa1cb0c061e391b2fdeb575074bafdf083"}
Apr 16 13:17:41.886355 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:41.885994 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-64f4647cd-vjj4c"
Apr 16 13:17:41.903048 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:41.902996 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-64f4647cd-vjj4c" podStartSLOduration=1.842019887 podStartE2EDuration="4.902984538s" podCreationTimestamp="2026-04-16 13:17:37 +0000 UTC" firstStartedPulling="2026-04-16 13:17:38.296077075 +0000 UTC m=+363.178433142" lastFinishedPulling="2026-04-16 13:17:41.357041714 +0000 UTC m=+366.239397793" observedRunningTime="2026-04-16 13:17:41.902138414 +0000 UTC m=+366.784494505" watchObservedRunningTime="2026-04-16 13:17:41.902984538 +0000 UTC m=+366.785340626"
Apr 16 13:17:46.931216 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:46.931188 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5889847794-zg6lg"]
Apr 16 13:17:46.934341 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:46.934322 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5889847794-zg6lg"
Apr 16 13:17:46.937370 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:46.937351 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 16 13:17:46.937781 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:46.937751 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 16 13:17:46.937903 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:46.937813 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-5gs69\""
Apr 16 13:17:46.938172 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:46.938154 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 16 13:17:46.938448 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:46.938433 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 16 13:17:46.957283 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:46.957263 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5889847794-zg6lg"]
Apr 16 13:17:46.967960 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:46.967939 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76x6r\" (UniqueName: \"kubernetes.io/projected/cc078a07-4755-4732-9de2-e597e3972b03-kube-api-access-76x6r\") pod \"opendatahub-operator-controller-manager-5889847794-zg6lg\" (UID: \"cc078a07-4755-4732-9de2-e597e3972b03\") " pod="opendatahub/opendatahub-operator-controller-manager-5889847794-zg6lg"
Apr 16 13:17:46.968049 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:46.967994 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cc078a07-4755-4732-9de2-e597e3972b03-webhook-cert\") pod \"opendatahub-operator-controller-manager-5889847794-zg6lg\" (UID: \"cc078a07-4755-4732-9de2-e597e3972b03\") " pod="opendatahub/opendatahub-operator-controller-manager-5889847794-zg6lg"
Apr 16 13:17:46.968112 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:46.968095 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cc078a07-4755-4732-9de2-e597e3972b03-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5889847794-zg6lg\" (UID: \"cc078a07-4755-4732-9de2-e597e3972b03\") " pod="opendatahub/opendatahub-operator-controller-manager-5889847794-zg6lg"
Apr 16 13:17:47.069294 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:47.069268 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cc078a07-4755-4732-9de2-e597e3972b03-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5889847794-zg6lg\" (UID: \"cc078a07-4755-4732-9de2-e597e3972b03\") " pod="opendatahub/opendatahub-operator-controller-manager-5889847794-zg6lg"
Apr 16 13:17:47.069395 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:47.069307 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76x6r\" (UniqueName: \"kubernetes.io/projected/cc078a07-4755-4732-9de2-e597e3972b03-kube-api-access-76x6r\") pod \"opendatahub-operator-controller-manager-5889847794-zg6lg\" (UID: \"cc078a07-4755-4732-9de2-e597e3972b03\") " pod="opendatahub/opendatahub-operator-controller-manager-5889847794-zg6lg"
Apr 16 13:17:47.069395 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:47.069352 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cc078a07-4755-4732-9de2-e597e3972b03-webhook-cert\") pod \"opendatahub-operator-controller-manager-5889847794-zg6lg\" (UID: \"cc078a07-4755-4732-9de2-e597e3972b03\") " pod="opendatahub/opendatahub-operator-controller-manager-5889847794-zg6lg"
Apr 16 13:17:47.071731 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:47.071705 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/cc078a07-4755-4732-9de2-e597e3972b03-webhook-cert\") pod \"opendatahub-operator-controller-manager-5889847794-zg6lg\" (UID: \"cc078a07-4755-4732-9de2-e597e3972b03\") " pod="opendatahub/opendatahub-operator-controller-manager-5889847794-zg6lg"
Apr 16 13:17:47.071810 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:47.071705 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/cc078a07-4755-4732-9de2-e597e3972b03-apiservice-cert\") pod \"opendatahub-operator-controller-manager-5889847794-zg6lg\" (UID: \"cc078a07-4755-4732-9de2-e597e3972b03\") " pod="opendatahub/opendatahub-operator-controller-manager-5889847794-zg6lg"
Apr 16 13:17:47.079410 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:47.079382 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76x6r\" (UniqueName: \"kubernetes.io/projected/cc078a07-4755-4732-9de2-e597e3972b03-kube-api-access-76x6r\") pod \"opendatahub-operator-controller-manager-5889847794-zg6lg\" (UID: \"cc078a07-4755-4732-9de2-e597e3972b03\") " pod="opendatahub/opendatahub-operator-controller-manager-5889847794-zg6lg"
Apr 16 13:17:47.245319 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:47.245290 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-5889847794-zg6lg"
Apr 16 13:17:47.385736 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:47.385701 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-5889847794-zg6lg"]
Apr 16 13:17:47.389948 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:17:47.389922 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc078a07_4755_4732_9de2_e597e3972b03.slice/crio-7dd2e98488f3615b07923812c5a7eb5bc2bd6f850b674cf1f5c5779fa78fdb18 WatchSource:0}: Error finding container 7dd2e98488f3615b07923812c5a7eb5bc2bd6f850b674cf1f5c5779fa78fdb18: Status 404 returned error can't find the container with id 7dd2e98488f3615b07923812c5a7eb5bc2bd6f850b674cf1f5c5779fa78fdb18
Apr 16 13:17:47.906304 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:47.906273 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5889847794-zg6lg" event={"ID":"cc078a07-4755-4732-9de2-e597e3972b03","Type":"ContainerStarted","Data":"7dd2e98488f3615b07923812c5a7eb5bc2bd6f850b674cf1f5c5779fa78fdb18"}
Apr 16 13:17:50.917987 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:50.917947 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-5889847794-zg6lg" event={"ID":"cc078a07-4755-4732-9de2-e597e3972b03","Type":"ContainerStarted","Data":"7dae3fb6d8a4f9c255aa44230c7c23e3bd8a33202dc615063dc76c85c3226be5"}
Apr 16 13:17:50.918379 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:50.918105 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-5889847794-zg6lg"
Apr 16 13:17:50.945133 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:50.944952 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-5889847794-zg6lg" podStartSLOduration=2.277771428 podStartE2EDuration="4.944935012s" podCreationTimestamp="2026-04-16 13:17:46 +0000 UTC" firstStartedPulling="2026-04-16 13:17:47.392510677 +0000 UTC m=+372.274866746" lastFinishedPulling="2026-04-16 13:17:50.059674256 +0000 UTC m=+374.942030330" observedRunningTime="2026-04-16 13:17:50.944512659 +0000 UTC m=+375.826868762" watchObservedRunningTime="2026-04-16 13:17:50.944935012 +0000 UTC m=+375.827291108"
Apr 16 13:17:52.891173 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:17:52.891145 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-64f4647cd-vjj4c"
Apr 16 13:18:01.923233 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:01.923203 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-5889847794-zg6lg"
Apr 16 13:18:37.793580 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:37.793552 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"]
Apr 16 13:18:37.800040 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:37.800019 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:37.802561 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:37.802528 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 16 13:18:37.802668 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:37.802571 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-wb4hl\""
Apr 16 13:18:37.810480 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:37.810456 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"]
Apr 16 13:18:37.975045 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:37.975017 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz4bs\" (UniqueName: \"kubernetes.io/projected/97c61d61-84d0-4f58-a6a0-573eb48233ab-kube-api-access-tz4bs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:37.975182 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:37.975050 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-istio-data\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:37.975182 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:37.975122 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/97c61d61-84d0-4f58-a6a0-573eb48233ab-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:37.975182 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:37.975146 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/97c61d61-84d0-4f58-a6a0-573eb48233ab-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:37.975286 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:37.975207 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:37.975286 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:37.975231 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:37.975286 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:37.975251 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/97c61d61-84d0-4f58-a6a0-573eb48233ab-istio-token\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:37.975286 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:37.975278 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:37.975409 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:37.975299 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:38.068884 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.068794 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h"]
Apr 16 13:18:38.071131 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.071115 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h"
Apr 16 13:18:38.076161 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.076135 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tz4bs\" (UniqueName: \"kubernetes.io/projected/97c61d61-84d0-4f58-a6a0-573eb48233ab-kube-api-access-tz4bs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:38.076285 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.076173 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-istio-data\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:38.076285 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.076235 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/97c61d61-84d0-4f58-a6a0-573eb48233ab-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:38.076285 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.076263 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/97c61d61-84d0-4f58-a6a0-573eb48233ab-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:38.076439 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.076311 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:38.076439 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.076329 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:38.076439 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.076344 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/97c61d61-84d0-4f58-a6a0-573eb48233ab-istio-token\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:38.076439 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.076372 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:38.076439 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.076389 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:38.076698 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.076681 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:38.077126 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.077101 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:38.077385 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.077181 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:38.077811 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.077788 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-istio-data\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:38.078091 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.078069 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/97c61d61-84d0-4f58-a6a0-573eb48233ab-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:38.079116 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.079081 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:38.079359 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.079335 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/97c61d61-84d0-4f58-a6a0-573eb48233ab-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:38.081924 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.081904 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h"]
Apr 16 13:18:38.085904 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.085858 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz4bs\" (UniqueName: \"kubernetes.io/projected/97c61d61-84d0-4f58-a6a0-573eb48233ab-kube-api-access-tz4bs\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:38.085988 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.085964 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/97c61d61-84d0-4f58-a6a0-573eb48233ab-istio-token\") pod \"data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:38.113550 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.113523 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"
Apr 16 13:18:38.176911 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.176848 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h\" (UID: \"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h"
Apr 16 13:18:38.177090 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.177073 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h\" (UID: \"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h"
Apr 16 13:18:38.177204 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.177188 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h\" (UID: \"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h"
Apr 16 13:18:38.177271 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.177259 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h\" (UID: \"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h"
Apr 16 13:18:38.177324 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.177287 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h\" (UID: \"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h"
Apr 16 13:18:38.177379 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.177361 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h\" (UID: \"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h"
Apr 16 13:18:38.177456 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.177440 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h\" (UID: \"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h"
Apr 16 13:18:38.177584 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.177551 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h\" (UID: \"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h"
Apr 16 13:18:38.177673 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.177658 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lff2g\" (UniqueName: \"kubernetes.io/projected/7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b-kube-api-access-lff2g\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h\" (UID: \"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h"
Apr 16 13:18:38.234721 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.234655 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"]
Apr 16 13:18:38.244828 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:18:38.244799 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97c61d61_84d0_4f58_a6a0_573eb48233ab.slice/crio-f2734216d937a4773de9a68999439d964f045d66e49fafd3f87eb463ff2d7a41 WatchSource:0}: Error finding container f2734216d937a4773de9a68999439d964f045d66e49fafd3f87eb463ff2d7a41: Status 404 returned error can't find the container with id f2734216d937a4773de9a68999439d964f045d66e49fafd3f87eb463ff2d7a41
Apr 16 13:18:38.279199 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.279172 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lff2g\" (UniqueName: \"kubernetes.io/projected/7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b-kube-api-access-lff2g\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h\" (UID: \"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h"
Apr 16 13:18:38.279311 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.279213 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h\" (UID: \"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h"
Apr 16 13:18:38.279311 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.279236 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h\" (UID: \"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h"
Apr 16 13:18:38.279311 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.279253 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h\" (UID: \"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h"
Apr 16 13:18:38.279311 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.279277 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h\" (UID: \"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h"
Apr 16 13:18:38.279311 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.279300 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h\" (UID: \"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h"
Apr 16 13:18:38.279585 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.279318 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h\" (UID: \"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h"
Apr 16 13:18:38.279585 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.279344 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h\" (UID: \"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h"
Apr 16 13:18:38.279585 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.279400 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h\" (UID: \"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h"
Apr 16 13:18:38.279731 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.279651 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h\" (UID: \"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h"
Apr 16 13:18:38.279731 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.279666 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h\" (UID:
\"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h" Apr 16 13:18:38.279944 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.279923 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h\" (UID: \"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h" Apr 16 13:18:38.280019 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.279958 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b-istio-data\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h\" (UID: \"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h" Apr 16 13:18:38.280061 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.280044 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h\" (UID: \"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h" Apr 16 13:18:38.281386 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.281366 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h\" (UID: \"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h" Apr 16 13:18:38.281638 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.281621 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h\" (UID: \"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h" Apr 16 13:18:38.286595 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.286574 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lff2g\" (UniqueName: \"kubernetes.io/projected/7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b-kube-api-access-lff2g\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h\" (UID: \"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h" Apr 16 13:18:38.286663 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.286596 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b-istio-token\") pod \"data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h\" (UID: \"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h" Apr 16 13:18:38.381837 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.381775 
2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h" Apr 16 13:18:38.497489 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:38.497459 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h"] Apr 16 13:18:38.500562 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:18:38.500537 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7aa9d18e_6e93_4aaf_ac74_0d44e9a9486b.slice/crio-1c51dad7fa3b865e72c3f845bb5b33348707b652f0a824c1013b33110cf45926 WatchSource:0}: Error finding container 1c51dad7fa3b865e72c3f845bb5b33348707b652f0a824c1013b33110cf45926: Status 404 returned error can't find the container with id 1c51dad7fa3b865e72c3f845bb5b33348707b652f0a824c1013b33110cf45926 Apr 16 13:18:39.072498 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:39.072462 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx" event={"ID":"97c61d61-84d0-4f58-a6a0-573eb48233ab","Type":"ContainerStarted","Data":"f2734216d937a4773de9a68999439d964f045d66e49fafd3f87eb463ff2d7a41"} Apr 16 13:18:39.073449 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:39.073424 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h" event={"ID":"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b","Type":"ContainerStarted","Data":"1c51dad7fa3b865e72c3f845bb5b33348707b652f0a824c1013b33110cf45926"} Apr 16 13:18:41.697793 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:41.697754 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 13:18:41.698084 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:41.697828 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 13:18:41.698084 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:41.697855 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 13:18:42.091299 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:42.091219 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx" event={"ID":"97c61d61-84d0-4f58-a6a0-573eb48233ab","Type":"ContainerStarted","Data":"5451a107e94a41933fa3bd75f7ccf983c77a624bf1d52ecfe821a0a2bf181edf"} Apr 16 13:18:42.112338 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:42.112292 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx" podStartSLOduration=1.662049675 podStartE2EDuration="5.112279517s" podCreationTimestamp="2026-04-16 13:18:37 +0000 UTC" firstStartedPulling="2026-04-16 13:18:38.247269432 +0000 UTC m=+423.129625501" lastFinishedPulling="2026-04-16 13:18:41.697499264 +0000 UTC m=+426.579855343" observedRunningTime="2026-04-16 13:18:42.110161361 +0000 UTC m=+426.992517452" watchObservedRunningTime="2026-04-16 13:18:42.112279517 +0000 UTC m=+426.994635607" 
Apr 16 13:18:42.113817 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:42.113790 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx" Apr 16 13:18:42.115096 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:42.115076 2571 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.133.0.24:15021/healthz/ready\": dial tcp 10.133.0.24:15021: connect: connection refused" start-of-body= Apr 16 13:18:42.115207 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:42.115127 2571 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx" podUID="97c61d61-84d0-4f58-a6a0-573eb48233ab" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.24:15021/healthz/ready\": dial tcp 10.133.0.24:15021: connect: connection refused" Apr 16 13:18:43.113918 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:43.113888 2571 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.133.0.24:15021/healthz/ready\": dial tcp 10.133.0.24:15021: connect: connection refused" start-of-body= Apr 16 13:18:43.114346 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:43.113938 2571 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx" podUID="97c61d61-84d0-4f58-a6a0-573eb48233ab" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.24:15021/healthz/ready\": dial tcp 10.133.0.24:15021: connect: connection refused" Apr 16 13:18:44.114549 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:44.114518 2571 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.133.0.24:15021/healthz/ready\": dial tcp 10.133.0.24:15021: connect: connection refused" start-of-body= Apr 16 13:18:44.114931 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:44.114576 2571 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx" podUID="97c61d61-84d0-4f58-a6a0-573eb48233ab" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.24:15021/healthz/ready\": dial tcp 10.133.0.24:15021: connect: connection refused" Apr 16 13:18:45.114469 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:45.114393 2571 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.133.0.24:15021/healthz/ready\": dial tcp 10.133.0.24:15021: connect: connection refused" start-of-body= Apr 16 13:18:45.114469 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:45.114443 2571 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx" podUID="97c61d61-84d0-4f58-a6a0-573eb48233ab" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.24:15021/healthz/ready\": dial tcp 10.133.0.24:15021: connect: connection 
refused" Apr 16 13:18:46.114233 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:46.114204 2571 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.133.0.24:15021/healthz/ready\": dial tcp 10.133.0.24:15021: connect: connection refused" start-of-body= Apr 16 13:18:46.114388 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:46.114254 2571 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx" podUID="97c61d61-84d0-4f58-a6a0-573eb48233ab" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.24:15021/healthz/ready\": dial tcp 10.133.0.24:15021: connect: connection refused" Apr 16 13:18:47.114089 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:47.114056 2571 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.133.0.24:15021/healthz/ready\": dial tcp 10.133.0.24:15021: connect: connection refused" start-of-body= Apr 16 13:18:47.114459 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:47.114109 2571 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx" podUID="97c61d61-84d0-4f58-a6a0-573eb48233ab" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.24:15021/healthz/ready\": dial tcp 10.133.0.24:15021: connect: connection refused" Apr 16 13:18:48.114051 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:48.114020 2571 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.133.0.24:15021/healthz/ready\": dial tcp 10.133.0.24:15021: connect: connection refused" start-of-body= Apr 16 13:18:48.114235 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:48.114090 2571 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx" podUID="97c61d61-84d0-4f58-a6a0-573eb48233ab" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.24:15021/healthz/ready\": dial tcp 10.133.0.24:15021: connect: connection refused" Apr 16 13:18:49.114457 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:49.114420 2571 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.133.0.24:15021/healthz/ready\": dial tcp 10.133.0.24:15021: connect: connection refused" start-of-body= Apr 16 13:18:49.114812 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:49.114483 2571 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx" podUID="97c61d61-84d0-4f58-a6a0-573eb48233ab" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.24:15021/healthz/ready\": dial tcp 10.133.0.24:15021: connect: connection refused" Apr 16 13:18:50.114503 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:50.114469 2571 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx container/istio-proxy 
namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.133.0.24:15021/healthz/ready\": dial tcp 10.133.0.24:15021: connect: connection refused" start-of-body= Apr 16 13:18:50.114933 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:50.114524 2571 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx" podUID="97c61d61-84d0-4f58-a6a0-573eb48233ab" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.24:15021/healthz/ready\": dial tcp 10.133.0.24:15021: connect: connection refused" Apr 16 13:18:51.114174 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:51.114138 2571 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.133.0.24:15021/healthz/ready\": dial tcp 10.133.0.24:15021: connect: connection refused" start-of-body= Apr 16 13:18:51.114330 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:51.114203 2571 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx" podUID="97c61d61-84d0-4f58-a6a0-573eb48233ab" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.24:15021/healthz/ready\": dial tcp 10.133.0.24:15021: connect: connection refused" Apr 16 13:18:52.114258 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:52.114225 2571 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.133.0.24:15021/healthz/ready\": dial tcp 10.133.0.24:15021: connect: connection refused" start-of-body= Apr 16 13:18:52.114631 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:52.114285 2571 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx" podUID="97c61d61-84d0-4f58-a6a0-573eb48233ab" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.24:15021/healthz/ready\": dial tcp 10.133.0.24:15021: connect: connection refused" Apr 16 13:18:52.724264 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:52.724232 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 13:18:52.724342 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:52.724299 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 13:18:52.724342 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:52.724330 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 13:18:53.114492 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:53.114409 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx" Apr 16 13:18:53.114946 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:53.114580 2571 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx container/istio-proxy 
namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.133.0.24:15021/healthz/ready\": dial tcp 10.133.0.24:15021: connect: connection refused" start-of-body= Apr 16 13:18:53.114946 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:53.114627 2571 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx" podUID="97c61d61-84d0-4f58-a6a0-573eb48233ab" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.24:15021/healthz/ready\": dial tcp 10.133.0.24:15021: connect: connection refused" Apr 16 13:18:53.130475 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:53.130451 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h" event={"ID":"7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b","Type":"ContainerStarted","Data":"56a4e7d5a3edc5da6d28f87859724c190b1c95001474311b2a1e72564769e58d"} Apr 16 13:18:53.160524 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:53.156652 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h" podStartSLOduration=0.93537943 podStartE2EDuration="15.156615669s" podCreationTimestamp="2026-04-16 13:18:38 +0000 UTC" firstStartedPulling="2026-04-16 13:18:38.502780216 +0000 UTC m=+423.385136283" lastFinishedPulling="2026-04-16 13:18:52.724016439 +0000 UTC m=+437.606372522" observedRunningTime="2026-04-16 13:18:53.152539989 +0000 UTC m=+438.034896079" watchObservedRunningTime="2026-04-16 13:18:53.156615669 +0000 UTC m=+438.038971763" Apr 16 13:18:53.382252 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:53.382170 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h" Apr 16 13:18:53.382252 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:53.382206 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h" Apr 16 13:18:53.386286 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:53.386262 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h" Apr 16 13:18:54.114107 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:54.114069 2571 patch_prober.go:28] interesting pod/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx container/istio-proxy namespace/openshift-ingress: Startup probe status=failure output="Get \"http://10.133.0.24:15021/healthz/ready\": dial tcp 10.133.0.24:15021: connect: connection refused" start-of-body= Apr 16 13:18:54.114299 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:54.114147 2571 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx" podUID="97c61d61-84d0-4f58-a6a0-573eb48233ab" containerName="istio-proxy" probeResult="failure" output="Get \"http://10.133.0.24:15021/healthz/ready\": dial tcp 10.133.0.24:15021: connect: connection refused" Apr 16 13:18:54.135026 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:54.135001 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h" Apr 16 13:18:54.184669 ip-10-0-141-234 kubenswrapper[2571]: 
I0416 13:18:54.184632 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"] Apr 16 13:18:54.184940 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:54.184912 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx" podUID="97c61d61-84d0-4f58-a6a0-573eb48233ab" containerName="istio-proxy" containerID="cri-o://5451a107e94a41933fa3bd75f7ccf983c77a624bf1d52ecfe821a0a2bf181edf" gracePeriod=30 Apr 16 13:18:59.424803 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.424778 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx" Apr 16 13:18:59.548619 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.548551 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz4bs\" (UniqueName: \"kubernetes.io/projected/97c61d61-84d0-4f58-a6a0-573eb48233ab-kube-api-access-tz4bs\") pod \"97c61d61-84d0-4f58-a6a0-573eb48233ab\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " Apr 16 13:18:59.548619 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.548610 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-istio-data\") pod \"97c61d61-84d0-4f58-a6a0-573eb48233ab\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " Apr 16 13:18:59.548808 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.548631 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/97c61d61-84d0-4f58-a6a0-573eb48233ab-istio-podinfo\") pod \"97c61d61-84d0-4f58-a6a0-573eb48233ab\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " Apr 16 13:18:59.548808 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.548653 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-credential-socket\") pod \"97c61d61-84d0-4f58-a6a0-573eb48233ab\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " Apr 16 13:18:59.548808 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.548679 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/97c61d61-84d0-4f58-a6a0-573eb48233ab-istio-token\") pod \"97c61d61-84d0-4f58-a6a0-573eb48233ab\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " Apr 16 13:18:59.548808 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.548700 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-workload-socket\") pod \"97c61d61-84d0-4f58-a6a0-573eb48233ab\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " Apr 16 13:18:59.548808 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.548743 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/97c61d61-84d0-4f58-a6a0-573eb48233ab-istiod-ca-cert\") pod \"97c61d61-84d0-4f58-a6a0-573eb48233ab\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " Apr 16 13:18:59.548808 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.548791 2571 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-workload-certs\") pod \"97c61d61-84d0-4f58-a6a0-573eb48233ab\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " Apr 16 13:18:59.549148 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.548825 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-istio-envoy\") pod \"97c61d61-84d0-4f58-a6a0-573eb48233ab\" (UID: \"97c61d61-84d0-4f58-a6a0-573eb48233ab\") " Apr 16 13:18:59.549148 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.549012 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-credential-socket" (OuterVolumeSpecName: "credential-socket") pod "97c61d61-84d0-4f58-a6a0-573eb48233ab" (UID: "97c61d61-84d0-4f58-a6a0-573eb48233ab"). InnerVolumeSpecName "credential-socket". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 13:18:59.549148 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.549020 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-istio-data" (OuterVolumeSpecName: "istio-data") pod "97c61d61-84d0-4f58-a6a0-573eb48233ab" (UID: "97c61d61-84d0-4f58-a6a0-573eb48233ab"). InnerVolumeSpecName "istio-data". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 13:18:59.549148 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.549131 2571 reconciler_common.go:299] "Volume detached for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-credential-socket\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:18:59.549371 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.549151 2571 reconciler_common.go:299] "Volume detached for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-istio-data\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:18:59.549371 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.549234 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-workload-certs" (OuterVolumeSpecName: "workload-certs") pod "97c61d61-84d0-4f58-a6a0-573eb48233ab" (UID: "97c61d61-84d0-4f58-a6a0-573eb48233ab"). InnerVolumeSpecName "workload-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 13:18:59.549371 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.549299 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-workload-socket" (OuterVolumeSpecName: "workload-socket") pod "97c61d61-84d0-4f58-a6a0-573eb48233ab" (UID: "97c61d61-84d0-4f58-a6a0-573eb48233ab"). InnerVolumeSpecName "workload-socket". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 13:18:59.549371 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.549330 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97c61d61-84d0-4f58-a6a0-573eb48233ab-istiod-ca-cert" (OuterVolumeSpecName: "istiod-ca-cert") pod "97c61d61-84d0-4f58-a6a0-573eb48233ab" (UID: "97c61d61-84d0-4f58-a6a0-573eb48233ab"). InnerVolumeSpecName "istiod-ca-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 13:18:59.550981 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.550949 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97c61d61-84d0-4f58-a6a0-573eb48233ab-kube-api-access-tz4bs" (OuterVolumeSpecName: "kube-api-access-tz4bs") pod "97c61d61-84d0-4f58-a6a0-573eb48233ab" (UID: "97c61d61-84d0-4f58-a6a0-573eb48233ab"). InnerVolumeSpecName "kube-api-access-tz4bs". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 13:18:59.551096 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.550983 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/97c61d61-84d0-4f58-a6a0-573eb48233ab-istio-podinfo" (OuterVolumeSpecName: "istio-podinfo") pod "97c61d61-84d0-4f58-a6a0-573eb48233ab" (UID: "97c61d61-84d0-4f58-a6a0-573eb48233ab"). InnerVolumeSpecName "istio-podinfo". PluginName "kubernetes.io/downward-api", VolumeGIDValue "" Apr 16 13:18:59.551293 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.551265 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-istio-envoy" (OuterVolumeSpecName: "istio-envoy") pod "97c61d61-84d0-4f58-a6a0-573eb48233ab" (UID: "97c61d61-84d0-4f58-a6a0-573eb48233ab"). InnerVolumeSpecName "istio-envoy". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 13:18:59.551403 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.551381 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97c61d61-84d0-4f58-a6a0-573eb48233ab-istio-token" (OuterVolumeSpecName: "istio-token") pod "97c61d61-84d0-4f58-a6a0-573eb48233ab" (UID: "97c61d61-84d0-4f58-a6a0-573eb48233ab"). InnerVolumeSpecName "istio-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 13:18:59.649839 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.649812 2571 reconciler_common.go:299] "Volume detached for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/97c61d61-84d0-4f58-a6a0-573eb48233ab-istio-podinfo\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:18:59.649839 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.649835 2571 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/97c61d61-84d0-4f58-a6a0-573eb48233ab-istio-token\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:18:59.650045 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.649849 2571 reconciler_common.go:299] "Volume detached for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-workload-socket\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:18:59.650045 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.649861 2571 reconciler_common.go:299] "Volume detached for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/97c61d61-84d0-4f58-a6a0-573eb48233ab-istiod-ca-cert\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:18:59.650045 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.649899 2571 reconciler_common.go:299] "Volume detached for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-workload-certs\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:18:59.650045 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.649911 2571 reconciler_common.go:299] "Volume detached for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/97c61d61-84d0-4f58-a6a0-573eb48233ab-istio-envoy\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:18:59.650045 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:18:59.649923 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tz4bs\" (UniqueName: \"kubernetes.io/projected/97c61d61-84d0-4f58-a6a0-573eb48233ab-kube-api-access-tz4bs\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:19:00.154321 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:00.154287 2571 generic.go:358] "Generic (PLEG): container finished" podID="97c61d61-84d0-4f58-a6a0-573eb48233ab" containerID="5451a107e94a41933fa3bd75f7ccf983c77a624bf1d52ecfe821a0a2bf181edf" exitCode=0 Apr 16 13:19:00.154481 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:00.154352 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx" Apr 16 13:19:00.154481 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:00.154374 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx" event={"ID":"97c61d61-84d0-4f58-a6a0-573eb48233ab","Type":"ContainerDied","Data":"5451a107e94a41933fa3bd75f7ccf983c77a624bf1d52ecfe821a0a2bf181edf"} Apr 16 13:19:00.154481 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:00.154415 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx" event={"ID":"97c61d61-84d0-4f58-a6a0-573eb48233ab","Type":"ContainerDied","Data":"f2734216d937a4773de9a68999439d964f045d66e49fafd3f87eb463ff2d7a41"} Apr 16 13:19:00.154481 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:00.154431 2571 scope.go:117] "RemoveContainer" containerID="5451a107e94a41933fa3bd75f7ccf983c77a624bf1d52ecfe821a0a2bf181edf" Apr 16 13:19:00.162403 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:00.162390 2571 scope.go:117] "RemoveContainer" containerID="5451a107e94a41933fa3bd75f7ccf983c77a624bf1d52ecfe821a0a2bf181edf" Apr 16 13:19:00.162682 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:19:00.162663 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5451a107e94a41933fa3bd75f7ccf983c77a624bf1d52ecfe821a0a2bf181edf\": container with ID starting with 5451a107e94a41933fa3bd75f7ccf983c77a624bf1d52ecfe821a0a2bf181edf not found: ID does not exist" containerID="5451a107e94a41933fa3bd75f7ccf983c77a624bf1d52ecfe821a0a2bf181edf" Apr 16 13:19:00.162754 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:00.162690 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5451a107e94a41933fa3bd75f7ccf983c77a624bf1d52ecfe821a0a2bf181edf"} err="failed to get container status \"5451a107e94a41933fa3bd75f7ccf983c77a624bf1d52ecfe821a0a2bf181edf\": rpc error: code = NotFound desc = could not find container \"5451a107e94a41933fa3bd75f7ccf983c77a624bf1d52ecfe821a0a2bf181edf\": container with ID starting with 5451a107e94a41933fa3bd75f7ccf983c77a624bf1d52ecfe821a0a2bf181edf not found: ID does not exist" Apr 16 13:19:00.172707 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:00.172682 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"] Apr 16 13:19:00.176491 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:00.176469 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-85d9c6bcd55mcbx"] Apr 16 13:19:01.724135 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:01.724102 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97c61d61-84d0-4f58-a6a0-573eb48233ab" path="/var/lib/kubelet/pods/97c61d61-84d0-4f58-a6a0-573eb48233ab/volumes" Apr 16 13:19:15.119365 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:15.119328 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-7h8q2"] Apr 16 13:19:15.119805 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:15.119650 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97c61d61-84d0-4f58-a6a0-573eb48233ab" containerName="istio-proxy" Apr 16 13:19:15.119805 ip-10-0-141-234 
kubenswrapper[2571]: I0416 13:19:15.119660 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="97c61d61-84d0-4f58-a6a0-573eb48233ab" containerName="istio-proxy" Apr 16 13:19:15.119805 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:15.119727 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="97c61d61-84d0-4f58-a6a0-573eb48233ab" containerName="istio-proxy" Apr 16 13:19:15.125548 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:15.125527 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-7h8q2" Apr 16 13:19:15.128765 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:15.128742 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 13:19:15.128928 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:15.128772 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 13:19:15.129014 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:15.128950 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-7d4fr\"" Apr 16 13:19:15.130075 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:15.130052 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-7h8q2"] Apr 16 13:19:15.273434 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:15.273406 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jqv8\" (UniqueName: \"kubernetes.io/projected/f291d2da-8272-4543-92b1-d47fd6762980-kube-api-access-4jqv8\") pod \"kuadrant-operator-catalog-7h8q2\" (UID: \"f291d2da-8272-4543-92b1-d47fd6762980\") " pod="kuadrant-system/kuadrant-operator-catalog-7h8q2" Apr 16 13:19:15.374780 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:15.374694 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4jqv8\" (UniqueName: \"kubernetes.io/projected/f291d2da-8272-4543-92b1-d47fd6762980-kube-api-access-4jqv8\") pod \"kuadrant-operator-catalog-7h8q2\" (UID: \"f291d2da-8272-4543-92b1-d47fd6762980\") " pod="kuadrant-system/kuadrant-operator-catalog-7h8q2" Apr 16 13:19:15.383420 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:15.383388 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jqv8\" (UniqueName: \"kubernetes.io/projected/f291d2da-8272-4543-92b1-d47fd6762980-kube-api-access-4jqv8\") pod \"kuadrant-operator-catalog-7h8q2\" (UID: \"f291d2da-8272-4543-92b1-d47fd6762980\") " pod="kuadrant-system/kuadrant-operator-catalog-7h8q2" Apr 16 13:19:15.435848 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:15.435823 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-7h8q2" Apr 16 13:19:15.491322 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:15.490337 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-7h8q2"] Apr 16 13:19:15.553325 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:15.553296 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-7h8q2"] Apr 16 13:19:15.554940 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:19:15.554913 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf291d2da_8272_4543_92b1_d47fd6762980.slice/crio-83f4d55902d29fc32a266041aaf79fd46c30fc83b37cc713d04d09633bb5fe98 WatchSource:0}: Error finding container 83f4d55902d29fc32a266041aaf79fd46c30fc83b37cc713d04d09633bb5fe98: Status 404 returned error can't find the container with id 83f4d55902d29fc32a266041aaf79fd46c30fc83b37cc713d04d09633bb5fe98 Apr 16 13:19:15.694496 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:15.694470 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-wjgsr"] Apr 16 13:19:15.697334 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:15.697318 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-wjgsr" Apr 16 13:19:15.704464 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:15.704440 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-wjgsr"] Apr 16 13:19:15.778009 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:15.777984 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wxrcd\" (UniqueName: \"kubernetes.io/projected/fbef5d66-8737-4eaf-8c7e-e9f8e5f7718a-kube-api-access-wxrcd\") pod \"kuadrant-operator-catalog-wjgsr\" (UID: \"fbef5d66-8737-4eaf-8c7e-e9f8e5f7718a\") " pod="kuadrant-system/kuadrant-operator-catalog-wjgsr" Apr 16 13:19:15.878627 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:15.878595 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wxrcd\" (UniqueName: \"kubernetes.io/projected/fbef5d66-8737-4eaf-8c7e-e9f8e5f7718a-kube-api-access-wxrcd\") pod \"kuadrant-operator-catalog-wjgsr\" (UID: \"fbef5d66-8737-4eaf-8c7e-e9f8e5f7718a\") " pod="kuadrant-system/kuadrant-operator-catalog-wjgsr" Apr 16 13:19:15.886580 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:15.886560 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wxrcd\" (UniqueName: \"kubernetes.io/projected/fbef5d66-8737-4eaf-8c7e-e9f8e5f7718a-kube-api-access-wxrcd\") pod \"kuadrant-operator-catalog-wjgsr\" (UID: \"fbef5d66-8737-4eaf-8c7e-e9f8e5f7718a\") " pod="kuadrant-system/kuadrant-operator-catalog-wjgsr" Apr 16 13:19:16.007336 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:16.007276 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-wjgsr" Apr 16 13:19:16.133362 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:16.133307 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-wjgsr"] Apr 16 13:19:16.173027 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:19:16.172996 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfbef5d66_8737_4eaf_8c7e_e9f8e5f7718a.slice/crio-019d4b863f0ba8f83d254bbb4f4aac76c6ee3ce011feeb4113e3b2aaf1899dfd WatchSource:0}: Error finding container 019d4b863f0ba8f83d254bbb4f4aac76c6ee3ce011feeb4113e3b2aaf1899dfd: Status 404 returned error can't find the container with id 019d4b863f0ba8f83d254bbb4f4aac76c6ee3ce011feeb4113e3b2aaf1899dfd Apr 16 13:19:16.215704 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:16.215656 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-wjgsr" event={"ID":"fbef5d66-8737-4eaf-8c7e-e9f8e5f7718a","Type":"ContainerStarted","Data":"019d4b863f0ba8f83d254bbb4f4aac76c6ee3ce011feeb4113e3b2aaf1899dfd"} Apr 16 13:19:16.216786 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:16.216758 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-7h8q2" event={"ID":"f291d2da-8272-4543-92b1-d47fd6762980","Type":"ContainerStarted","Data":"83f4d55902d29fc32a266041aaf79fd46c30fc83b37cc713d04d09633bb5fe98"} Apr 16 13:19:18.225523 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:18.225488 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-wjgsr" event={"ID":"fbef5d66-8737-4eaf-8c7e-e9f8e5f7718a","Type":"ContainerStarted","Data":"85e41414dc846f40207ab9f3b98e6ca7457d54901b5df4f281b11607ae19b5c5"} Apr 16 13:19:18.226810 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:18.226785 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-7h8q2" event={"ID":"f291d2da-8272-4543-92b1-d47fd6762980","Type":"ContainerStarted","Data":"00ee17c0ab6bd5e6a2be6292df85290f07555505362430df9094ad48608d400b"} Apr 16 13:19:18.226985 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:18.226889 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-7h8q2" podUID="f291d2da-8272-4543-92b1-d47fd6762980" containerName="registry-server" containerID="cri-o://00ee17c0ab6bd5e6a2be6292df85290f07555505362430df9094ad48608d400b" gracePeriod=2 Apr 16 13:19:18.240180 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:18.239911 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-wjgsr" podStartSLOduration=1.703567108 podStartE2EDuration="3.239895393s" podCreationTimestamp="2026-04-16 13:19:15 +0000 UTC" firstStartedPulling="2026-04-16 13:19:16.17444441 +0000 UTC m=+461.056800487" lastFinishedPulling="2026-04-16 13:19:17.7107727 +0000 UTC m=+462.593128772" observedRunningTime="2026-04-16 13:19:18.239412926 +0000 UTC m=+463.121769016" watchObservedRunningTime="2026-04-16 13:19:18.239895393 +0000 UTC m=+463.122251498" Apr 16 13:19:18.254668 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:18.254628 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-7h8q2" podStartSLOduration=1.104471456 podStartE2EDuration="3.254616966s" podCreationTimestamp="2026-04-16 
13:19:15 +0000 UTC" firstStartedPulling="2026-04-16 13:19:15.55641408 +0000 UTC m=+460.438770150" lastFinishedPulling="2026-04-16 13:19:17.706559592 +0000 UTC m=+462.588915660" observedRunningTime="2026-04-16 13:19:18.253039994 +0000 UTC m=+463.135396084" watchObservedRunningTime="2026-04-16 13:19:18.254616966 +0000 UTC m=+463.136973056" Apr 16 13:19:18.461921 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:18.461900 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-7h8q2" Apr 16 13:19:18.602000 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:18.601920 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jqv8\" (UniqueName: \"kubernetes.io/projected/f291d2da-8272-4543-92b1-d47fd6762980-kube-api-access-4jqv8\") pod \"f291d2da-8272-4543-92b1-d47fd6762980\" (UID: \"f291d2da-8272-4543-92b1-d47fd6762980\") " Apr 16 13:19:18.603999 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:18.603976 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f291d2da-8272-4543-92b1-d47fd6762980-kube-api-access-4jqv8" (OuterVolumeSpecName: "kube-api-access-4jqv8") pod "f291d2da-8272-4543-92b1-d47fd6762980" (UID: "f291d2da-8272-4543-92b1-d47fd6762980"). InnerVolumeSpecName "kube-api-access-4jqv8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 13:19:18.702412 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:18.702391 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4jqv8\" (UniqueName: \"kubernetes.io/projected/f291d2da-8272-4543-92b1-d47fd6762980-kube-api-access-4jqv8\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:19:19.231695 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:19.231662 2571 generic.go:358] "Generic (PLEG): container finished" podID="f291d2da-8272-4543-92b1-d47fd6762980" containerID="00ee17c0ab6bd5e6a2be6292df85290f07555505362430df9094ad48608d400b" exitCode=0 Apr 16 13:19:19.232135 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:19.231718 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-7h8q2" Apr 16 13:19:19.232135 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:19.231760 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-7h8q2" event={"ID":"f291d2da-8272-4543-92b1-d47fd6762980","Type":"ContainerDied","Data":"00ee17c0ab6bd5e6a2be6292df85290f07555505362430df9094ad48608d400b"} Apr 16 13:19:19.232135 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:19.231798 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-7h8q2" event={"ID":"f291d2da-8272-4543-92b1-d47fd6762980","Type":"ContainerDied","Data":"83f4d55902d29fc32a266041aaf79fd46c30fc83b37cc713d04d09633bb5fe98"} Apr 16 13:19:19.232135 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:19.231816 2571 scope.go:117] "RemoveContainer" containerID="00ee17c0ab6bd5e6a2be6292df85290f07555505362430df9094ad48608d400b" Apr 16 13:19:19.240405 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:19.240387 2571 scope.go:117] "RemoveContainer" containerID="00ee17c0ab6bd5e6a2be6292df85290f07555505362430df9094ad48608d400b" Apr 16 13:19:19.240645 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:19:19.240627 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"00ee17c0ab6bd5e6a2be6292df85290f07555505362430df9094ad48608d400b\": container with ID starting with 00ee17c0ab6bd5e6a2be6292df85290f07555505362430df9094ad48608d400b not found: ID does not exist" containerID="00ee17c0ab6bd5e6a2be6292df85290f07555505362430df9094ad48608d400b" Apr 16 13:19:19.240708 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:19.240657 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"00ee17c0ab6bd5e6a2be6292df85290f07555505362430df9094ad48608d400b"} err="failed to get container status \"00ee17c0ab6bd5e6a2be6292df85290f07555505362430df9094ad48608d400b\": rpc error: code = NotFound desc = could not find container \"00ee17c0ab6bd5e6a2be6292df85290f07555505362430df9094ad48608d400b\": container with ID starting with 00ee17c0ab6bd5e6a2be6292df85290f07555505362430df9094ad48608d400b not found: ID does not exist" Apr 16 13:19:19.251771 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:19.251743 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-7h8q2"] Apr 16 13:19:19.254603 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:19.254584 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-7h8q2"] Apr 16 13:19:19.724623 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:19.724589 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f291d2da-8272-4543-92b1-d47fd6762980" path="/var/lib/kubelet/pods/f291d2da-8272-4543-92b1-d47fd6762980/volumes" Apr 16 13:19:26.008918 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:26.008886 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-wjgsr" Apr 16 13:19:26.009391 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:26.008929 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-wjgsr" Apr 16 13:19:26.029890 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:26.029852 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-wjgsr" Apr 16 
13:19:26.275428 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:26.275358 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-wjgsr" Apr 16 13:19:48.423174 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:48.423138 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27"] Apr 16 13:19:48.423552 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:48.423483 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f291d2da-8272-4543-92b1-d47fd6762980" containerName="registry-server" Apr 16 13:19:48.423552 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:48.423495 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="f291d2da-8272-4543-92b1-d47fd6762980" containerName="registry-server" Apr 16 13:19:48.423552 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:48.423551 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="f291d2da-8272-4543-92b1-d47fd6762980" containerName="registry-server" Apr 16 13:19:48.425518 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:48.425502 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27" Apr 16 13:19:48.428820 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:48.428801 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-gk7vv\"" Apr 16 13:19:48.441216 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:48.441195 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27"] Apr 16 13:19:48.519899 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:48.519850 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8nd7\" (UniqueName: \"kubernetes.io/projected/790bdf2b-3a23-4688-9cd0-bffc63aae543-kube-api-access-j8nd7\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d2w27\" (UID: \"790bdf2b-3a23-4688-9cd0-bffc63aae543\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27" Apr 16 13:19:48.520020 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:48.519966 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/790bdf2b-3a23-4688-9cd0-bffc63aae543-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d2w27\" (UID: \"790bdf2b-3a23-4688-9cd0-bffc63aae543\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27" Apr 16 13:19:48.620346 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:48.620321 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/790bdf2b-3a23-4688-9cd0-bffc63aae543-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d2w27\" (UID: \"790bdf2b-3a23-4688-9cd0-bffc63aae543\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27" Apr 16 13:19:48.620459 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:48.620386 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8nd7\" (UniqueName: \"kubernetes.io/projected/790bdf2b-3a23-4688-9cd0-bffc63aae543-kube-api-access-j8nd7\") pod 
\"kuadrant-operator-controller-manager-5f895dd7d5-d2w27\" (UID: \"790bdf2b-3a23-4688-9cd0-bffc63aae543\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27" Apr 16 13:19:48.620705 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:48.620683 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/790bdf2b-3a23-4688-9cd0-bffc63aae543-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d2w27\" (UID: \"790bdf2b-3a23-4688-9cd0-bffc63aae543\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27" Apr 16 13:19:48.630094 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:48.630074 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8nd7\" (UniqueName: \"kubernetes.io/projected/790bdf2b-3a23-4688-9cd0-bffc63aae543-kube-api-access-j8nd7\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-d2w27\" (UID: \"790bdf2b-3a23-4688-9cd0-bffc63aae543\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27" Apr 16 13:19:48.735537 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:48.735516 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27" Apr 16 13:19:49.071098 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:49.071021 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27"] Apr 16 13:19:49.074580 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:19:49.074557 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod790bdf2b_3a23_4688_9cd0_bffc63aae543.slice/crio-2022bede2d49d7208ba6731b246a01025624a48ec203155b15daba72f7a1bb08 WatchSource:0}: Error finding container 2022bede2d49d7208ba6731b246a01025624a48ec203155b15daba72f7a1bb08: Status 404 returned error can't find the container with id 2022bede2d49d7208ba6731b246a01025624a48ec203155b15daba72f7a1bb08 Apr 16 13:19:49.332590 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:49.332514 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27" event={"ID":"790bdf2b-3a23-4688-9cd0-bffc63aae543","Type":"ContainerStarted","Data":"2022bede2d49d7208ba6731b246a01025624a48ec203155b15daba72f7a1bb08"} Apr 16 13:19:53.651932 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:53.651899 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-njk8v"] Apr 16 13:19:53.655113 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:53.655090 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-njk8v" Apr 16 13:19:53.657691 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:53.657668 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 16 13:19:53.657817 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:53.657668 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-nh6bz\"" Apr 16 13:19:53.664356 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:53.664326 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-njk8v"] Apr 16 13:19:53.767978 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:53.767939 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvfjv\" (UniqueName: \"kubernetes.io/projected/1e8ef80b-df28-4c75-b3be-6327f30d026d-kube-api-access-tvfjv\") pod \"dns-operator-controller-manager-648d5c98bc-njk8v\" (UID: \"1e8ef80b-df28-4c75-b3be-6327f30d026d\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-njk8v" Apr 16 13:19:53.869181 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:53.869157 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tvfjv\" (UniqueName: \"kubernetes.io/projected/1e8ef80b-df28-4c75-b3be-6327f30d026d-kube-api-access-tvfjv\") pod \"dns-operator-controller-manager-648d5c98bc-njk8v\" (UID: \"1e8ef80b-df28-4c75-b3be-6327f30d026d\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-njk8v" Apr 16 13:19:53.880262 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:53.880233 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvfjv\" (UniqueName: \"kubernetes.io/projected/1e8ef80b-df28-4c75-b3be-6327f30d026d-kube-api-access-tvfjv\") pod \"dns-operator-controller-manager-648d5c98bc-njk8v\" (UID: \"1e8ef80b-df28-4c75-b3be-6327f30d026d\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-njk8v" Apr 16 13:19:53.969027 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:53.969004 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-njk8v" Apr 16 13:19:54.114050 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:54.114025 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-njk8v"] Apr 16 13:19:54.116472 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:19:54.116445 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e8ef80b_df28_4c75_b3be_6327f30d026d.slice/crio-99617e6b7b8ba6eb369d999c311d69029e4b5b14b1fdf672c8bc2e0b4653801e WatchSource:0}: Error finding container 99617e6b7b8ba6eb369d999c311d69029e4b5b14b1fdf672c8bc2e0b4653801e: Status 404 returned error can't find the container with id 99617e6b7b8ba6eb369d999c311d69029e4b5b14b1fdf672c8bc2e0b4653801e Apr 16 13:19:54.355576 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:54.355501 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-njk8v" event={"ID":"1e8ef80b-df28-4c75-b3be-6327f30d026d","Type":"ContainerStarted","Data":"99617e6b7b8ba6eb369d999c311d69029e4b5b14b1fdf672c8bc2e0b4653801e"} Apr 16 13:19:54.357277 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:54.357251 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27" event={"ID":"790bdf2b-3a23-4688-9cd0-bffc63aae543","Type":"ContainerStarted","Data":"dc911eeb3eb5196b579f42d295941944c12dc81384287b214a2a76363965a206"} Apr 16 13:19:54.357455 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:54.357430 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27" Apr 16 13:19:54.393148 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:54.393103 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27" podStartSLOduration=1.6485811799999999 podStartE2EDuration="6.39308962s" podCreationTimestamp="2026-04-16 13:19:48 +0000 UTC" firstStartedPulling="2026-04-16 13:19:49.076998753 +0000 UTC m=+493.959354824" lastFinishedPulling="2026-04-16 13:19:53.821507194 +0000 UTC m=+498.703863264" observedRunningTime="2026-04-16 13:19:54.391805179 +0000 UTC m=+499.274161268" watchObservedRunningTime="2026-04-16 13:19:54.39308962 +0000 UTC m=+499.275445703" Apr 16 13:19:56.033108 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:56.033076 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vlvtd"] Apr 16 13:19:56.035691 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:56.035673 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vlvtd" Apr 16 13:19:56.038117 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:56.038095 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-m25j9\"" Apr 16 13:19:56.052989 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:56.050285 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vlvtd"] Apr 16 13:19:56.089253 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:56.089228 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5lz8\" (UniqueName: \"kubernetes.io/projected/b173a6b9-b745-4d9f-9ea3-148b5e7f012f-kube-api-access-w5lz8\") pod \"limitador-operator-controller-manager-85c4996f8c-vlvtd\" (UID: \"b173a6b9-b745-4d9f-9ea3-148b5e7f012f\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vlvtd" Apr 16 13:19:56.190261 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:56.190230 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5lz8\" (UniqueName: \"kubernetes.io/projected/b173a6b9-b745-4d9f-9ea3-148b5e7f012f-kube-api-access-w5lz8\") pod \"limitador-operator-controller-manager-85c4996f8c-vlvtd\" (UID: \"b173a6b9-b745-4d9f-9ea3-148b5e7f012f\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vlvtd" Apr 16 13:19:56.199085 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:56.199051 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5lz8\" (UniqueName: \"kubernetes.io/projected/b173a6b9-b745-4d9f-9ea3-148b5e7f012f-kube-api-access-w5lz8\") pod \"limitador-operator-controller-manager-85c4996f8c-vlvtd\" (UID: \"b173a6b9-b745-4d9f-9ea3-148b5e7f012f\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vlvtd" Apr 16 13:19:56.354568 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:56.354542 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vlvtd" Apr 16 13:19:56.368777 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:56.368731 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-njk8v" event={"ID":"1e8ef80b-df28-4c75-b3be-6327f30d026d","Type":"ContainerStarted","Data":"4e57fb679fbb9b4956ed2922f364275e7ff8d36f3758caa6489523a531f3348d"} Apr 16 13:19:56.369025 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:56.368986 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-njk8v" Apr 16 13:19:56.391647 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:56.391536 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-njk8v" podStartSLOduration=1.225157109 podStartE2EDuration="3.391521355s" podCreationTimestamp="2026-04-16 13:19:53 +0000 UTC" firstStartedPulling="2026-04-16 13:19:54.118552539 +0000 UTC m=+499.000908607" lastFinishedPulling="2026-04-16 13:19:56.284916782 +0000 UTC m=+501.167272853" observedRunningTime="2026-04-16 13:19:56.390651232 +0000 UTC m=+501.273007322" watchObservedRunningTime="2026-04-16 13:19:56.391521355 +0000 UTC m=+501.273877489" Apr 16 13:19:56.507568 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:56.507542 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vlvtd"] Apr 16 13:19:56.510077 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:19:56.510050 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb173a6b9_b745_4d9f_9ea3_148b5e7f012f.slice/crio-c7442ee9aa1e59bea654e134622fd31422ccd7c588b4ad95db3954c17db14be1 WatchSource:0}: Error finding container c7442ee9aa1e59bea654e134622fd31422ccd7c588b4ad95db3954c17db14be1: Status 404 returned error can't find the container with id c7442ee9aa1e59bea654e134622fd31422ccd7c588b4ad95db3954c17db14be1 Apr 16 13:19:57.373797 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:57.373705 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vlvtd" event={"ID":"b173a6b9-b745-4d9f-9ea3-148b5e7f012f","Type":"ContainerStarted","Data":"c7442ee9aa1e59bea654e134622fd31422ccd7c588b4ad95db3954c17db14be1"} Apr 16 13:19:58.378698 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:58.378670 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vlvtd" event={"ID":"b173a6b9-b745-4d9f-9ea3-148b5e7f012f","Type":"ContainerStarted","Data":"600e799d1beff2a5d8ef35ffe91c8e7df8f6c2d334a871cc93a30b3c869767e7"} Apr 16 13:19:58.379036 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:58.378766 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vlvtd" Apr 16 13:19:58.395729 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:19:58.395681 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vlvtd" podStartSLOduration=0.653421773 podStartE2EDuration="2.395668371s" podCreationTimestamp="2026-04-16 13:19:56 +0000 UTC" firstStartedPulling="2026-04-16 13:19:56.512477948 +0000 UTC m=+501.394834016" 
lastFinishedPulling="2026-04-16 13:19:58.254724547 +0000 UTC m=+503.137080614" observedRunningTime="2026-04-16 13:19:58.39317298 +0000 UTC m=+503.275529069" watchObservedRunningTime="2026-04-16 13:19:58.395668371 +0000 UTC m=+503.278024461" Apr 16 13:20:05.365340 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:05.365306 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27" Apr 16 13:20:06.409040 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.409009 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27"] Apr 16 13:20:06.409760 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.409725 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27" podUID="790bdf2b-3a23-4688-9cd0-bffc63aae543" containerName="manager" containerID="cri-o://dc911eeb3eb5196b579f42d295941944c12dc81384287b214a2a76363965a206" gracePeriod=2 Apr 16 13:20:06.415721 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.415697 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27"] Apr 16 13:20:06.428197 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.428172 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vlvtd"] Apr 16 13:20:06.428494 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.428458 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vlvtd" podUID="b173a6b9-b745-4d9f-9ea3-148b5e7f012f" containerName="manager" containerID="cri-o://600e799d1beff2a5d8ef35ffe91c8e7df8f6c2d334a871cc93a30b3c869767e7" gracePeriod=2 Apr 16 13:20:06.430593 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.430571 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vlvtd" Apr 16 13:20:06.431821 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.431753 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p99t2"] Apr 16 13:20:06.432204 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.432186 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="790bdf2b-3a23-4688-9cd0-bffc63aae543" containerName="manager" Apr 16 13:20:06.432283 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.432207 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="790bdf2b-3a23-4688-9cd0-bffc63aae543" containerName="manager" Apr 16 13:20:06.432334 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.432303 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="790bdf2b-3a23-4688-9cd0-bffc63aae543" containerName="manager" Apr 16 13:20:06.434319 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.434302 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p99t2" Apr 16 13:20:06.441949 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.441916 2571 status_manager.go:895] "Failed to get status for pod" podUID="790bdf2b-3a23-4688-9cd0-bffc63aae543" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-d2w27\" is forbidden: User \"system:node:ip-10-0-141-234.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-234.ec2.internal' and this object" Apr 16 13:20:06.444402 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.444374 2571 status_manager.go:895] "Failed to get status for pod" podUID="b173a6b9-b745-4d9f-9ea3-148b5e7f012f" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vlvtd" err="pods \"limitador-operator-controller-manager-85c4996f8c-vlvtd\" is forbidden: User \"system:node:ip-10-0-141-234.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-234.ec2.internal' and this object" Apr 16 13:20:06.445473 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.445448 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vlvtd"] Apr 16 13:20:06.447053 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.447031 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p99t2"] Apr 16 13:20:06.457194 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.457174 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7tzl6"] Apr 16 13:20:06.457489 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.457476 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b173a6b9-b745-4d9f-9ea3-148b5e7f012f" containerName="manager" Apr 16 13:20:06.457557 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.457490 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="b173a6b9-b745-4d9f-9ea3-148b5e7f012f" containerName="manager" Apr 16 13:20:06.457611 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.457562 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="b173a6b9-b745-4d9f-9ea3-148b5e7f012f" containerName="manager" Apr 16 13:20:06.459689 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.459672 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7tzl6" Apr 16 13:20:06.472792 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.472767 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7tzl6"] Apr 16 13:20:06.475909 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.475882 2571 status_manager.go:895] "Failed to get status for pod" podUID="790bdf2b-3a23-4688-9cd0-bffc63aae543" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-d2w27\" is forbidden: User \"system:node:ip-10-0-141-234.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-234.ec2.internal' and this object" Apr 16 13:20:06.497353 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.497328 2571 status_manager.go:895] "Failed to get status for pod" podUID="b173a6b9-b745-4d9f-9ea3-148b5e7f012f" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vlvtd" err="pods \"limitador-operator-controller-manager-85c4996f8c-vlvtd\" is forbidden: User \"system:node:ip-10-0-141-234.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-234.ec2.internal' and this object" Apr 16 13:20:06.499012 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.498990 2571 status_manager.go:895] "Failed to get status for pod" podUID="790bdf2b-3a23-4688-9cd0-bffc63aae543" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-d2w27\" is forbidden: User \"system:node:ip-10-0-141-234.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-234.ec2.internal' and this object" Apr 16 13:20:06.577088 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.577055 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e77c3224-e091-4e7d-9386-e8a3eccd3ce5-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-p99t2\" (UID: \"e77c3224-e091-4e7d-9386-e8a3eccd3ce5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p99t2" Apr 16 13:20:06.577262 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.577118 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s4h55\" (UniqueName: \"kubernetes.io/projected/d6482a12-0918-4bfb-a033-bea250f1c21f-kube-api-access-s4h55\") pod \"limitador-operator-controller-manager-85c4996f8c-7tzl6\" (UID: \"d6482a12-0918-4bfb-a033-bea250f1c21f\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7tzl6" Apr 16 13:20:06.577336 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.577257 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mxjr9\" (UniqueName: \"kubernetes.io/projected/e77c3224-e091-4e7d-9386-e8a3eccd3ce5-kube-api-access-mxjr9\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-p99t2\" (UID: \"e77c3224-e091-4e7d-9386-e8a3eccd3ce5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p99t2" Apr 16 13:20:06.658003 ip-10-0-141-234 
kubenswrapper[2571]: I0416 13:20:06.657984 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27" Apr 16 13:20:06.660309 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.660251 2571 status_manager.go:895] "Failed to get status for pod" podUID="b173a6b9-b745-4d9f-9ea3-148b5e7f012f" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vlvtd" err="pods \"limitador-operator-controller-manager-85c4996f8c-vlvtd\" is forbidden: User \"system:node:ip-10-0-141-234.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-234.ec2.internal' and this object" Apr 16 13:20:06.660924 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.660910 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vlvtd" Apr 16 13:20:06.662461 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.662436 2571 status_manager.go:895] "Failed to get status for pod" podUID="790bdf2b-3a23-4688-9cd0-bffc63aae543" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-d2w27\" is forbidden: User \"system:node:ip-10-0-141-234.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-234.ec2.internal' and this object" Apr 16 13:20:06.664392 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.664373 2571 status_manager.go:895] "Failed to get status for pod" podUID="b173a6b9-b745-4d9f-9ea3-148b5e7f012f" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vlvtd" err="pods \"limitador-operator-controller-manager-85c4996f8c-vlvtd\" is forbidden: User \"system:node:ip-10-0-141-234.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-234.ec2.internal' and this object" Apr 16 13:20:06.666207 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.666187 2571 status_manager.go:895] "Failed to get status for pod" podUID="790bdf2b-3a23-4688-9cd0-bffc63aae543" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-d2w27\" is forbidden: User \"system:node:ip-10-0-141-234.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-234.ec2.internal' and this object" Apr 16 13:20:06.678629 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.678606 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mxjr9\" (UniqueName: \"kubernetes.io/projected/e77c3224-e091-4e7d-9386-e8a3eccd3ce5-kube-api-access-mxjr9\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-p99t2\" (UID: \"e77c3224-e091-4e7d-9386-e8a3eccd3ce5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p99t2" Apr 16 13:20:06.678711 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.678643 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e77c3224-e091-4e7d-9386-e8a3eccd3ce5-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-p99t2\" (UID: 
\"e77c3224-e091-4e7d-9386-e8a3eccd3ce5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p99t2" Apr 16 13:20:06.678711 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.678690 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s4h55\" (UniqueName: \"kubernetes.io/projected/d6482a12-0918-4bfb-a033-bea250f1c21f-kube-api-access-s4h55\") pod \"limitador-operator-controller-manager-85c4996f8c-7tzl6\" (UID: \"d6482a12-0918-4bfb-a033-bea250f1c21f\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7tzl6" Apr 16 13:20:06.679083 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.679066 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e77c3224-e091-4e7d-9386-e8a3eccd3ce5-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-p99t2\" (UID: \"e77c3224-e091-4e7d-9386-e8a3eccd3ce5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p99t2" Apr 16 13:20:06.687569 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.687544 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mxjr9\" (UniqueName: \"kubernetes.io/projected/e77c3224-e091-4e7d-9386-e8a3eccd3ce5-kube-api-access-mxjr9\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-p99t2\" (UID: \"e77c3224-e091-4e7d-9386-e8a3eccd3ce5\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p99t2" Apr 16 13:20:06.687569 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.687558 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s4h55\" (UniqueName: \"kubernetes.io/projected/d6482a12-0918-4bfb-a033-bea250f1c21f-kube-api-access-s4h55\") pod \"limitador-operator-controller-manager-85c4996f8c-7tzl6\" (UID: \"d6482a12-0918-4bfb-a033-bea250f1c21f\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7tzl6" Apr 16 13:20:06.778959 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.778936 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w5lz8\" (UniqueName: \"kubernetes.io/projected/b173a6b9-b745-4d9f-9ea3-148b5e7f012f-kube-api-access-w5lz8\") pod \"b173a6b9-b745-4d9f-9ea3-148b5e7f012f\" (UID: \"b173a6b9-b745-4d9f-9ea3-148b5e7f012f\") " Apr 16 13:20:06.779059 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.779015 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j8nd7\" (UniqueName: \"kubernetes.io/projected/790bdf2b-3a23-4688-9cd0-bffc63aae543-kube-api-access-j8nd7\") pod \"790bdf2b-3a23-4688-9cd0-bffc63aae543\" (UID: \"790bdf2b-3a23-4688-9cd0-bffc63aae543\") " Apr 16 13:20:06.779059 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.779048 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/790bdf2b-3a23-4688-9cd0-bffc63aae543-extensions-socket-volume\") pod \"790bdf2b-3a23-4688-9cd0-bffc63aae543\" (UID: \"790bdf2b-3a23-4688-9cd0-bffc63aae543\") " Apr 16 13:20:06.780958 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.779852 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/790bdf2b-3a23-4688-9cd0-bffc63aae543-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "790bdf2b-3a23-4688-9cd0-bffc63aae543" (UID: 
"790bdf2b-3a23-4688-9cd0-bffc63aae543"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 13:20:06.783909 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.782267 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/790bdf2b-3a23-4688-9cd0-bffc63aae543-kube-api-access-j8nd7" (OuterVolumeSpecName: "kube-api-access-j8nd7") pod "790bdf2b-3a23-4688-9cd0-bffc63aae543" (UID: "790bdf2b-3a23-4688-9cd0-bffc63aae543"). InnerVolumeSpecName "kube-api-access-j8nd7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 13:20:06.783909 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.782539 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b173a6b9-b745-4d9f-9ea3-148b5e7f012f-kube-api-access-w5lz8" (OuterVolumeSpecName: "kube-api-access-w5lz8") pod "b173a6b9-b745-4d9f-9ea3-148b5e7f012f" (UID: "b173a6b9-b745-4d9f-9ea3-148b5e7f012f"). InnerVolumeSpecName "kube-api-access-w5lz8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 13:20:06.845364 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.845341 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p99t2" Apr 16 13:20:06.851929 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.851912 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7tzl6" Apr 16 13:20:06.880286 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.880256 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w5lz8\" (UniqueName: \"kubernetes.io/projected/b173a6b9-b745-4d9f-9ea3-148b5e7f012f-kube-api-access-w5lz8\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:20:06.880286 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.880282 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j8nd7\" (UniqueName: \"kubernetes.io/projected/790bdf2b-3a23-4688-9cd0-bffc63aae543-kube-api-access-j8nd7\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:20:06.880286 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.880292 2571 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/790bdf2b-3a23-4688-9cd0-bffc63aae543-extensions-socket-volume\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:20:06.977308 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.977253 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p99t2"] Apr 16 13:20:06.979971 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:20:06.979934 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode77c3224_e091_4e7d_9386_e8a3eccd3ce5.slice/crio-39f549acb9a6a528fcfc490876a1ad8b2dc9f7b23986f9bb361b7fecb1e056c8 WatchSource:0}: Error finding container 39f549acb9a6a528fcfc490876a1ad8b2dc9f7b23986f9bb361b7fecb1e056c8: Status 404 returned error can't find the container with id 39f549acb9a6a528fcfc490876a1ad8b2dc9f7b23986f9bb361b7fecb1e056c8 Apr 16 13:20:06.993892 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:06.993805 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7tzl6"] Apr 
16 13:20:06.996944 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:20:06.996915 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6482a12_0918_4bfb_a033_bea250f1c21f.slice/crio-034ca7b107f063a6ec3f9836f386a016c9a51ee570672e3bd90b96a223743e66 WatchSource:0}: Error finding container 034ca7b107f063a6ec3f9836f386a016c9a51ee570672e3bd90b96a223743e66: Status 404 returned error can't find the container with id 034ca7b107f063a6ec3f9836f386a016c9a51ee570672e3bd90b96a223743e66 Apr 16 13:20:07.376771 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.376745 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-njk8v" Apr 16 13:20:07.379366 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.379338 2571 status_manager.go:895] "Failed to get status for pod" podUID="b173a6b9-b745-4d9f-9ea3-148b5e7f012f" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vlvtd" err="pods \"limitador-operator-controller-manager-85c4996f8c-vlvtd\" is forbidden: User \"system:node:ip-10-0-141-234.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-234.ec2.internal' and this object" Apr 16 13:20:07.406702 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.406671 2571 status_manager.go:895] "Failed to get status for pod" podUID="790bdf2b-3a23-4688-9cd0-bffc63aae543" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-d2w27\" is forbidden: User \"system:node:ip-10-0-141-234.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-234.ec2.internal' and this object" Apr 16 13:20:07.410517 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.410491 2571 generic.go:358] "Generic (PLEG): container finished" podID="b173a6b9-b745-4d9f-9ea3-148b5e7f012f" containerID="600e799d1beff2a5d8ef35ffe91c8e7df8f6c2d334a871cc93a30b3c869767e7" exitCode=0 Apr 16 13:20:07.410906 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.410542 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vlvtd" Apr 16 13:20:07.410906 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.410582 2571 scope.go:117] "RemoveContainer" containerID="600e799d1beff2a5d8ef35ffe91c8e7df8f6c2d334a871cc93a30b3c869767e7" Apr 16 13:20:07.412330 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.412306 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p99t2" event={"ID":"e77c3224-e091-4e7d-9386-e8a3eccd3ce5","Type":"ContainerStarted","Data":"8bb2fc7f6df7263c61a141f9d616d41a24bbc858aa7b67b111ea17387e5afdb9"} Apr 16 13:20:07.412440 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.412337 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p99t2" event={"ID":"e77c3224-e091-4e7d-9386-e8a3eccd3ce5","Type":"ContainerStarted","Data":"39f549acb9a6a528fcfc490876a1ad8b2dc9f7b23986f9bb361b7fecb1e056c8"} Apr 16 13:20:07.412440 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.412412 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p99t2" Apr 16 13:20:07.412978 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.412951 2571 status_manager.go:895] "Failed to get status for pod" podUID="b173a6b9-b745-4d9f-9ea3-148b5e7f012f" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vlvtd" err="pods \"limitador-operator-controller-manager-85c4996f8c-vlvtd\" is forbidden: User \"system:node:ip-10-0-141-234.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-234.ec2.internal' and this object" Apr 16 13:20:07.413570 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.413547 2571 generic.go:358] "Generic (PLEG): container finished" podID="790bdf2b-3a23-4688-9cd0-bffc63aae543" containerID="dc911eeb3eb5196b579f42d295941944c12dc81384287b214a2a76363965a206" exitCode=0 Apr 16 13:20:07.413651 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.413595 2571 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27" Apr 16 13:20:07.415151 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.415122 2571 status_manager.go:895] "Failed to get status for pod" podUID="790bdf2b-3a23-4688-9cd0-bffc63aae543" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-d2w27\" is forbidden: User \"system:node:ip-10-0-141-234.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-234.ec2.internal' and this object" Apr 16 13:20:07.415264 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.415196 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7tzl6" event={"ID":"d6482a12-0918-4bfb-a033-bea250f1c21f","Type":"ContainerStarted","Data":"852c77882752d411f5fc9443d8c3d4d0b568598266208f02970f2e9354ff2e58"} Apr 16 13:20:07.415264 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.415218 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7tzl6" event={"ID":"d6482a12-0918-4bfb-a033-bea250f1c21f","Type":"ContainerStarted","Data":"034ca7b107f063a6ec3f9836f386a016c9a51ee570672e3bd90b96a223743e66"} Apr 16 13:20:07.415374 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.415308 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7tzl6" Apr 16 13:20:07.417925 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.417892 2571 status_manager.go:895] "Failed to get status for pod" podUID="b173a6b9-b745-4d9f-9ea3-148b5e7f012f" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vlvtd" err="pods \"limitador-operator-controller-manager-85c4996f8c-vlvtd\" is forbidden: User \"system:node:ip-10-0-141-234.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-234.ec2.internal' and this object" Apr 16 13:20:07.421469 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.421453 2571 scope.go:117] "RemoveContainer" containerID="600e799d1beff2a5d8ef35ffe91c8e7df8f6c2d334a871cc93a30b3c869767e7" Apr 16 13:20:07.421972 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:20:07.421944 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"600e799d1beff2a5d8ef35ffe91c8e7df8f6c2d334a871cc93a30b3c869767e7\": container with ID starting with 600e799d1beff2a5d8ef35ffe91c8e7df8f6c2d334a871cc93a30b3c869767e7 not found: ID does not exist" containerID="600e799d1beff2a5d8ef35ffe91c8e7df8f6c2d334a871cc93a30b3c869767e7" Apr 16 13:20:07.422755 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.421983 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"600e799d1beff2a5d8ef35ffe91c8e7df8f6c2d334a871cc93a30b3c869767e7"} err="failed to get container status \"600e799d1beff2a5d8ef35ffe91c8e7df8f6c2d334a871cc93a30b3c869767e7\": rpc error: code = NotFound desc = could not find container \"600e799d1beff2a5d8ef35ffe91c8e7df8f6c2d334a871cc93a30b3c869767e7\": container with ID starting with 600e799d1beff2a5d8ef35ffe91c8e7df8f6c2d334a871cc93a30b3c869767e7 not found: ID does not exist" Apr 16 13:20:07.422755 ip-10-0-141-234 kubenswrapper[2571]: 
I0416 13:20:07.422007 2571 scope.go:117] "RemoveContainer" containerID="dc911eeb3eb5196b579f42d295941944c12dc81384287b214a2a76363965a206" Apr 16 13:20:07.429119 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.429097 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-d27wd"] Apr 16 13:20:07.431808 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.431778 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-d27wd" Apr 16 13:20:07.444928 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.444861 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-d27wd"] Apr 16 13:20:07.450131 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.450078 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p99t2" podStartSLOduration=1.45006154 podStartE2EDuration="1.45006154s" podCreationTimestamp="2026-04-16 13:20:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:20:07.448755169 +0000 UTC m=+512.331111261" watchObservedRunningTime="2026-04-16 13:20:07.45006154 +0000 UTC m=+512.332417655" Apr 16 13:20:07.450832 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.450811 2571 scope.go:117] "RemoveContainer" containerID="dc911eeb3eb5196b579f42d295941944c12dc81384287b214a2a76363965a206" Apr 16 13:20:07.451206 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:20:07.451181 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc911eeb3eb5196b579f42d295941944c12dc81384287b214a2a76363965a206\": container with ID starting with dc911eeb3eb5196b579f42d295941944c12dc81384287b214a2a76363965a206 not found: ID does not exist" containerID="dc911eeb3eb5196b579f42d295941944c12dc81384287b214a2a76363965a206" Apr 16 13:20:07.451206 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.451192 2571 status_manager.go:895] "Failed to get status for pod" podUID="790bdf2b-3a23-4688-9cd0-bffc63aae543" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-d2w27\" is forbidden: User \"system:node:ip-10-0-141-234.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-234.ec2.internal' and this object" Apr 16 13:20:07.451358 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.451214 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc911eeb3eb5196b579f42d295941944c12dc81384287b214a2a76363965a206"} err="failed to get container status \"dc911eeb3eb5196b579f42d295941944c12dc81384287b214a2a76363965a206\": rpc error: code = NotFound desc = could not find container \"dc911eeb3eb5196b579f42d295941944c12dc81384287b214a2a76363965a206\": container with ID starting with dc911eeb3eb5196b579f42d295941944c12dc81384287b214a2a76363965a206 not found: ID does not exist" Apr 16 13:20:07.457171 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.457144 2571 status_manager.go:895] "Failed to get status for pod" podUID="790bdf2b-3a23-4688-9cd0-bffc63aae543" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-d2w27" err="pods 
\"kuadrant-operator-controller-manager-5f895dd7d5-d2w27\" is forbidden: User \"system:node:ip-10-0-141-234.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-234.ec2.internal' and this object" Apr 16 13:20:07.485799 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.485776 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6fcf5d2a-f8a7-4d2a-929b-cf4a2923f09d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-d27wd\" (UID: \"6fcf5d2a-f8a7-4d2a-929b-cf4a2923f09d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-d27wd" Apr 16 13:20:07.485916 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.485808 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md9wn\" (UniqueName: \"kubernetes.io/projected/6fcf5d2a-f8a7-4d2a-929b-cf4a2923f09d-kube-api-access-md9wn\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-d27wd\" (UID: \"6fcf5d2a-f8a7-4d2a-929b-cf4a2923f09d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-d27wd" Apr 16 13:20:07.497639 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.497604 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7tzl6" podStartSLOduration=1.497593528 podStartE2EDuration="1.497593528s" podCreationTimestamp="2026-04-16 13:20:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:20:07.496261414 +0000 UTC m=+512.378617516" watchObservedRunningTime="2026-04-16 13:20:07.497593528 +0000 UTC m=+512.379949618" Apr 16 13:20:07.498364 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.498339 2571 status_manager.go:895] "Failed to get status for pod" podUID="b173a6b9-b745-4d9f-9ea3-148b5e7f012f" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-vlvtd" err="pods \"limitador-operator-controller-manager-85c4996f8c-vlvtd\" is forbidden: User \"system:node:ip-10-0-141-234.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-141-234.ec2.internal' and this object" Apr 16 13:20:07.586749 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.586725 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6fcf5d2a-f8a7-4d2a-929b-cf4a2923f09d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-d27wd\" (UID: \"6fcf5d2a-f8a7-4d2a-929b-cf4a2923f09d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-d27wd" Apr 16 13:20:07.586885 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.586764 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-md9wn\" (UniqueName: \"kubernetes.io/projected/6fcf5d2a-f8a7-4d2a-929b-cf4a2923f09d-kube-api-access-md9wn\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-d27wd\" (UID: \"6fcf5d2a-f8a7-4d2a-929b-cf4a2923f09d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-d27wd" Apr 16 13:20:07.587103 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.587083 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/6fcf5d2a-f8a7-4d2a-929b-cf4a2923f09d-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-d27wd\" (UID: \"6fcf5d2a-f8a7-4d2a-929b-cf4a2923f09d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-d27wd" Apr 16 13:20:07.595335 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.595314 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-md9wn\" (UniqueName: \"kubernetes.io/projected/6fcf5d2a-f8a7-4d2a-929b-cf4a2923f09d-kube-api-access-md9wn\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-d27wd\" (UID: \"6fcf5d2a-f8a7-4d2a-929b-cf4a2923f09d\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-d27wd" Apr 16 13:20:07.724593 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.724563 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="790bdf2b-3a23-4688-9cd0-bffc63aae543" path="/var/lib/kubelet/pods/790bdf2b-3a23-4688-9cd0-bffc63aae543/volumes" Apr 16 13:20:07.725001 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.724980 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b173a6b9-b745-4d9f-9ea3-148b5e7f012f" path="/var/lib/kubelet/pods/b173a6b9-b745-4d9f-9ea3-148b5e7f012f/volumes" Apr 16 13:20:07.758391 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.758361 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-d27wd" Apr 16 13:20:07.889042 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:07.889012 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-d27wd"] Apr 16 13:20:07.891839 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:20:07.891813 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6fcf5d2a_f8a7_4d2a_929b_cf4a2923f09d.slice/crio-287971ca48f8f87ccf94eb9a561c7c76690baf62969a098053c928b0b2abc190 WatchSource:0}: Error finding container 287971ca48f8f87ccf94eb9a561c7c76690baf62969a098053c928b0b2abc190: Status 404 returned error can't find the container with id 287971ca48f8f87ccf94eb9a561c7c76690baf62969a098053c928b0b2abc190 Apr 16 13:20:08.419522 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:08.419482 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-d27wd" event={"ID":"6fcf5d2a-f8a7-4d2a-929b-cf4a2923f09d","Type":"ContainerStarted","Data":"95808bba89d115776f31be345b0fb8aa63a1212ef86f90894357eca54af9e620"} Apr 16 13:20:08.419522 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:08.419522 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-d27wd" event={"ID":"6fcf5d2a-f8a7-4d2a-929b-cf4a2923f09d","Type":"ContainerStarted","Data":"287971ca48f8f87ccf94eb9a561c7c76690baf62969a098053c928b0b2abc190"} Apr 16 13:20:08.420043 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:08.419557 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-d27wd" Apr 16 13:20:08.443819 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:08.443765 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-d27wd" podStartSLOduration=1.443752187 podStartE2EDuration="1.443752187s" podCreationTimestamp="2026-04-16 13:20:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:20:08.4423632 +0000 UTC m=+513.324719292" watchObservedRunningTime="2026-04-16 13:20:08.443752187 +0000 UTC m=+513.326108276" Apr 16 13:20:18.423098 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:18.423021 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p99t2" Apr 16 13:20:18.423444 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:18.423424 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-7tzl6" Apr 16 13:20:19.427272 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:19.427248 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-d27wd" Apr 16 13:20:19.477796 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:19.477764 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p99t2"] Apr 16 13:20:19.478034 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:19.478009 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p99t2" podUID="e77c3224-e091-4e7d-9386-e8a3eccd3ce5" containerName="manager" containerID="cri-o://8bb2fc7f6df7263c61a141f9d616d41a24bbc858aa7b67b111ea17387e5afdb9" gracePeriod=10 Apr 16 13:20:19.717490 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:19.717467 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p99t2" Apr 16 13:20:19.787181 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:19.787152 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e77c3224-e091-4e7d-9386-e8a3eccd3ce5-extensions-socket-volume\") pod \"e77c3224-e091-4e7d-9386-e8a3eccd3ce5\" (UID: \"e77c3224-e091-4e7d-9386-e8a3eccd3ce5\") " Apr 16 13:20:19.787298 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:19.787248 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mxjr9\" (UniqueName: \"kubernetes.io/projected/e77c3224-e091-4e7d-9386-e8a3eccd3ce5-kube-api-access-mxjr9\") pod \"e77c3224-e091-4e7d-9386-e8a3eccd3ce5\" (UID: \"e77c3224-e091-4e7d-9386-e8a3eccd3ce5\") " Apr 16 13:20:19.787578 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:19.787551 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e77c3224-e091-4e7d-9386-e8a3eccd3ce5-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "e77c3224-e091-4e7d-9386-e8a3eccd3ce5" (UID: "e77c3224-e091-4e7d-9386-e8a3eccd3ce5"). InnerVolumeSpecName "extensions-socket-volume". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 13:20:19.789139 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:19.789117 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e77c3224-e091-4e7d-9386-e8a3eccd3ce5-kube-api-access-mxjr9" (OuterVolumeSpecName: "kube-api-access-mxjr9") pod "e77c3224-e091-4e7d-9386-e8a3eccd3ce5" (UID: "e77c3224-e091-4e7d-9386-e8a3eccd3ce5"). InnerVolumeSpecName "kube-api-access-mxjr9". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 13:20:19.888700 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:19.888674 2571 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/e77c3224-e091-4e7d-9386-e8a3eccd3ce5-extensions-socket-volume\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:20:19.888700 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:19.888696 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mxjr9\" (UniqueName: \"kubernetes.io/projected/e77c3224-e091-4e7d-9386-e8a3eccd3ce5-kube-api-access-mxjr9\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:20:20.462754 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:20.462720 2571 generic.go:358] "Generic (PLEG): container finished" podID="e77c3224-e091-4e7d-9386-e8a3eccd3ce5" containerID="8bb2fc7f6df7263c61a141f9d616d41a24bbc858aa7b67b111ea17387e5afdb9" exitCode=0 Apr 16 13:20:20.463227 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:20.462783 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p99t2" Apr 16 13:20:20.463227 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:20.462797 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p99t2" event={"ID":"e77c3224-e091-4e7d-9386-e8a3eccd3ce5","Type":"ContainerDied","Data":"8bb2fc7f6df7263c61a141f9d616d41a24bbc858aa7b67b111ea17387e5afdb9"} Apr 16 13:20:20.463227 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:20.462832 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p99t2" event={"ID":"e77c3224-e091-4e7d-9386-e8a3eccd3ce5","Type":"ContainerDied","Data":"39f549acb9a6a528fcfc490876a1ad8b2dc9f7b23986f9bb361b7fecb1e056c8"} Apr 16 13:20:20.463227 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:20.462847 2571 scope.go:117] "RemoveContainer" containerID="8bb2fc7f6df7263c61a141f9d616d41a24bbc858aa7b67b111ea17387e5afdb9" Apr 16 13:20:20.471546 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:20.471527 2571 scope.go:117] "RemoveContainer" containerID="8bb2fc7f6df7263c61a141f9d616d41a24bbc858aa7b67b111ea17387e5afdb9" Apr 16 13:20:20.471817 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:20:20.471798 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8bb2fc7f6df7263c61a141f9d616d41a24bbc858aa7b67b111ea17387e5afdb9\": container with ID starting with 8bb2fc7f6df7263c61a141f9d616d41a24bbc858aa7b67b111ea17387e5afdb9 not found: ID does not exist" containerID="8bb2fc7f6df7263c61a141f9d616d41a24bbc858aa7b67b111ea17387e5afdb9" Apr 16 13:20:20.471896 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:20.471829 2571 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8bb2fc7f6df7263c61a141f9d616d41a24bbc858aa7b67b111ea17387e5afdb9"} err="failed to get container status \"8bb2fc7f6df7263c61a141f9d616d41a24bbc858aa7b67b111ea17387e5afdb9\": rpc error: code = NotFound desc = could not find container \"8bb2fc7f6df7263c61a141f9d616d41a24bbc858aa7b67b111ea17387e5afdb9\": container with ID starting with 8bb2fc7f6df7263c61a141f9d616d41a24bbc858aa7b67b111ea17387e5afdb9 not found: ID does not exist" Apr 16 13:20:20.486787 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:20.486760 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p99t2"] Apr 16 13:20:20.492717 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:20.492693 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-p99t2"] Apr 16 13:20:21.724609 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:21.724571 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e77c3224-e091-4e7d-9386-e8a3eccd3ce5" path="/var/lib/kubelet/pods/e77c3224-e091-4e7d-9386-e8a3eccd3ce5/volumes" Apr 16 13:20:35.732124 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.732094 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5"] Apr 16 13:20:35.732489 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.732436 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e77c3224-e091-4e7d-9386-e8a3eccd3ce5" containerName="manager" Apr 16 13:20:35.732489 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.732447 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="e77c3224-e091-4e7d-9386-e8a3eccd3ce5" containerName="manager" Apr 16 13:20:35.732560 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.732504 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="e77c3224-e091-4e7d-9386-e8a3eccd3ce5" containerName="manager" Apr 16 13:20:35.739079 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.739058 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:35.741971 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.741946 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-mjgmm\"" Apr 16 13:20:35.751069 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.751046 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5"] Apr 16 13:20:35.819818 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.819786 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/074f941c-a2f8-43bd-b264-706a6ceb4802-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-prpq5\" (UID: \"074f941c-a2f8-43bd-b264-706a6ceb4802\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:35.819966 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.819826 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/074f941c-a2f8-43bd-b264-706a6ceb4802-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-prpq5\" (UID: \"074f941c-a2f8-43bd-b264-706a6ceb4802\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:35.819966 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.819936 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/074f941c-a2f8-43bd-b264-706a6ceb4802-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-prpq5\" (UID: \"074f941c-a2f8-43bd-b264-706a6ceb4802\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:35.820069 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.819966 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/074f941c-a2f8-43bd-b264-706a6ceb4802-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-prpq5\" (UID: \"074f941c-a2f8-43bd-b264-706a6ceb4802\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:35.820069 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.820009 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/074f941c-a2f8-43bd-b264-706a6ceb4802-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-prpq5\" (UID: \"074f941c-a2f8-43bd-b264-706a6ceb4802\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:35.820168 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.820079 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/074f941c-a2f8-43bd-b264-706a6ceb4802-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-prpq5\" (UID: \"074f941c-a2f8-43bd-b264-706a6ceb4802\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:35.820217 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.820159 2571 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/074f941c-a2f8-43bd-b264-706a6ceb4802-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-prpq5\" (UID: \"074f941c-a2f8-43bd-b264-706a6ceb4802\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:35.820284 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.820249 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/074f941c-a2f8-43bd-b264-706a6ceb4802-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-prpq5\" (UID: \"074f941c-a2f8-43bd-b264-706a6ceb4802\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:35.820339 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.820289 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzjqm\" (UniqueName: \"kubernetes.io/projected/074f941c-a2f8-43bd-b264-706a6ceb4802-kube-api-access-nzjqm\") pod \"maas-default-gateway-openshift-default-58b6f876-prpq5\" (UID: \"074f941c-a2f8-43bd-b264-706a6ceb4802\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:35.921521 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.921452 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/074f941c-a2f8-43bd-b264-706a6ceb4802-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-prpq5\" (UID: \"074f941c-a2f8-43bd-b264-706a6ceb4802\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:35.921521 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.921493 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/074f941c-a2f8-43bd-b264-706a6ceb4802-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-prpq5\" (UID: \"074f941c-a2f8-43bd-b264-706a6ceb4802\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:35.921521 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.921518 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/074f941c-a2f8-43bd-b264-706a6ceb4802-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-prpq5\" (UID: \"074f941c-a2f8-43bd-b264-706a6ceb4802\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:35.921749 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.921548 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/074f941c-a2f8-43bd-b264-706a6ceb4802-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-prpq5\" (UID: \"074f941c-a2f8-43bd-b264-706a6ceb4802\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:35.921749 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.921573 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nzjqm\" (UniqueName: \"kubernetes.io/projected/074f941c-a2f8-43bd-b264-706a6ceb4802-kube-api-access-nzjqm\") pod 
\"maas-default-gateway-openshift-default-58b6f876-prpq5\" (UID: \"074f941c-a2f8-43bd-b264-706a6ceb4802\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:35.921749 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.921603 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/074f941c-a2f8-43bd-b264-706a6ceb4802-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-prpq5\" (UID: \"074f941c-a2f8-43bd-b264-706a6ceb4802\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:35.921749 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.921618 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/074f941c-a2f8-43bd-b264-706a6ceb4802-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-prpq5\" (UID: \"074f941c-a2f8-43bd-b264-706a6ceb4802\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:35.921981 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.921893 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/074f941c-a2f8-43bd-b264-706a6ceb4802-workload-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-prpq5\" (UID: \"074f941c-a2f8-43bd-b264-706a6ceb4802\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:35.921981 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.921951 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/074f941c-a2f8-43bd-b264-706a6ceb4802-istio-data\") pod \"maas-default-gateway-openshift-default-58b6f876-prpq5\" (UID: \"074f941c-a2f8-43bd-b264-706a6ceb4802\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:35.922084 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.921649 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/074f941c-a2f8-43bd-b264-706a6ceb4802-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-prpq5\" (UID: \"074f941c-a2f8-43bd-b264-706a6ceb4802\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:35.922084 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.921984 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/074f941c-a2f8-43bd-b264-706a6ceb4802-workload-certs\") pod \"maas-default-gateway-openshift-default-58b6f876-prpq5\" (UID: \"074f941c-a2f8-43bd-b264-706a6ceb4802\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:35.922084 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.922017 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/074f941c-a2f8-43bd-b264-706a6ceb4802-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-prpq5\" (UID: \"074f941c-a2f8-43bd-b264-706a6ceb4802\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:35.922084 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.922049 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/074f941c-a2f8-43bd-b264-706a6ceb4802-credential-socket\") pod \"maas-default-gateway-openshift-default-58b6f876-prpq5\" (UID: \"074f941c-a2f8-43bd-b264-706a6ceb4802\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:35.922391 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.922371 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/074f941c-a2f8-43bd-b264-706a6ceb4802-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-58b6f876-prpq5\" (UID: \"074f941c-a2f8-43bd-b264-706a6ceb4802\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:35.923999 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.923981 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/074f941c-a2f8-43bd-b264-706a6ceb4802-istio-podinfo\") pod \"maas-default-gateway-openshift-default-58b6f876-prpq5\" (UID: \"074f941c-a2f8-43bd-b264-706a6ceb4802\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:35.924148 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.924130 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/074f941c-a2f8-43bd-b264-706a6ceb4802-istio-envoy\") pod \"maas-default-gateway-openshift-default-58b6f876-prpq5\" (UID: \"074f941c-a2f8-43bd-b264-706a6ceb4802\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:35.929861 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.929841 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/074f941c-a2f8-43bd-b264-706a6ceb4802-istio-token\") pod \"maas-default-gateway-openshift-default-58b6f876-prpq5\" (UID: \"074f941c-a2f8-43bd-b264-706a6ceb4802\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:35.930086 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:35.930070 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzjqm\" (UniqueName: \"kubernetes.io/projected/074f941c-a2f8-43bd-b264-706a6ceb4802-kube-api-access-nzjqm\") pod \"maas-default-gateway-openshift-default-58b6f876-prpq5\" (UID: \"074f941c-a2f8-43bd-b264-706a6ceb4802\") " pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:36.054025 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:36.054004 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:36.175932 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:36.175903 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5"] Apr 16 13:20:36.177848 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:20:36.177820 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod074f941c_a2f8_43bd_b264_706a6ceb4802.slice/crio-9865fea7deaeb866d6e109f8218ff16dbf8991ced3d58694b4b5cd09d7ddd21c WatchSource:0}: Error finding container 9865fea7deaeb866d6e109f8218ff16dbf8991ced3d58694b4b5cd09d7ddd21c: Status 404 returned error can't find the container with id 9865fea7deaeb866d6e109f8218ff16dbf8991ced3d58694b4b5cd09d7ddd21c Apr 16 13:20:36.180114 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:36.180077 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 13:20:36.180200 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:36.180138 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 13:20:36.180200 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:36.180166 2571 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 16 13:20:36.520353 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:36.520323 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" event={"ID":"074f941c-a2f8-43bd-b264-706a6ceb4802","Type":"ContainerStarted","Data":"bb913ef5650b2033c911b4e4e6cb771a0a49776713a46e6d4b103667d4354f88"} Apr 16 13:20:36.520353 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:36.520357 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" event={"ID":"074f941c-a2f8-43bd-b264-706a6ceb4802","Type":"ContainerStarted","Data":"9865fea7deaeb866d6e109f8218ff16dbf8991ced3d58694b4b5cd09d7ddd21c"} Apr 16 13:20:36.545688 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:36.545644 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" podStartSLOduration=1.545630829 podStartE2EDuration="1.545630829s" podCreationTimestamp="2026-04-16 13:20:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:20:36.544577201 +0000 UTC m=+541.426933328" watchObservedRunningTime="2026-04-16 13:20:36.545630829 +0000 UTC m=+541.427986919" Apr 16 13:20:37.054726 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:37.054697 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:37.059508 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:37.059483 2571 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:37.523613 ip-10-0-141-234 kubenswrapper[2571]: 
I0416 13:20:37.523581 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:37.524453 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:37.524434 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-58b6f876-prpq5" Apr 16 13:20:40.677342 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:40.677309 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:20:40.680732 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:40.680712 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-jxr7j" Apr 16 13:20:40.683163 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:40.683146 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 13:20:40.683260 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:40.683168 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-b5hqc\"" Apr 16 13:20:40.690099 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:40.690081 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:20:40.711266 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:40.711241 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:20:40.758207 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:40.758184 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/43738a75-f09d-4eaf-8a1c-5ba422c1a531-config-file\") pod \"limitador-limitador-78c99df468-jxr7j\" (UID: \"43738a75-f09d-4eaf-8a1c-5ba422c1a531\") " pod="kuadrant-system/limitador-limitador-78c99df468-jxr7j" Apr 16 13:20:40.758306 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:40.758216 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlzl5\" (UniqueName: \"kubernetes.io/projected/43738a75-f09d-4eaf-8a1c-5ba422c1a531-kube-api-access-wlzl5\") pod \"limitador-limitador-78c99df468-jxr7j\" (UID: \"43738a75-f09d-4eaf-8a1c-5ba422c1a531\") " pod="kuadrant-system/limitador-limitador-78c99df468-jxr7j" Apr 16 13:20:40.859495 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:40.859471 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/43738a75-f09d-4eaf-8a1c-5ba422c1a531-config-file\") pod \"limitador-limitador-78c99df468-jxr7j\" (UID: \"43738a75-f09d-4eaf-8a1c-5ba422c1a531\") " pod="kuadrant-system/limitador-limitador-78c99df468-jxr7j" Apr 16 13:20:40.859586 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:40.859507 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wlzl5\" (UniqueName: \"kubernetes.io/projected/43738a75-f09d-4eaf-8a1c-5ba422c1a531-kube-api-access-wlzl5\") pod \"limitador-limitador-78c99df468-jxr7j\" (UID: \"43738a75-f09d-4eaf-8a1c-5ba422c1a531\") " pod="kuadrant-system/limitador-limitador-78c99df468-jxr7j" Apr 16 13:20:40.860070 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:40.860053 2571 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/43738a75-f09d-4eaf-8a1c-5ba422c1a531-config-file\") pod \"limitador-limitador-78c99df468-jxr7j\" (UID: \"43738a75-f09d-4eaf-8a1c-5ba422c1a531\") " pod="kuadrant-system/limitador-limitador-78c99df468-jxr7j" Apr 16 13:20:40.867110 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:40.867086 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlzl5\" (UniqueName: \"kubernetes.io/projected/43738a75-f09d-4eaf-8a1c-5ba422c1a531-kube-api-access-wlzl5\") pod \"limitador-limitador-78c99df468-jxr7j\" (UID: \"43738a75-f09d-4eaf-8a1c-5ba422c1a531\") " pod="kuadrant-system/limitador-limitador-78c99df468-jxr7j" Apr 16 13:20:40.992713 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:40.992687 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-78c99df468-jxr7j" Apr 16 13:20:41.323110 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:41.323084 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:20:41.325311 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:20:41.325280 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43738a75_f09d_4eaf_8a1c_5ba422c1a531.slice/crio-c07153f8bd91e7766fdc655ea2b93d97ae15e7ec103566f233decb8d5757e319 WatchSource:0}: Error finding container c07153f8bd91e7766fdc655ea2b93d97ae15e7ec103566f233decb8d5757e319: Status 404 returned error can't find the container with id c07153f8bd91e7766fdc655ea2b93d97ae15e7ec103566f233decb8d5757e319 Apr 16 13:20:41.538904 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:41.538853 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-jxr7j" event={"ID":"43738a75-f09d-4eaf-8a1c-5ba422c1a531","Type":"ContainerStarted","Data":"c07153f8bd91e7766fdc655ea2b93d97ae15e7ec103566f233decb8d5757e319"} Apr 16 13:20:44.550645 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:44.550607 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-78c99df468-jxr7j" event={"ID":"43738a75-f09d-4eaf-8a1c-5ba422c1a531","Type":"ContainerStarted","Data":"90c4592ec5a3abaaf26258bd97c49d334f283c8f68a75e32aa7e33743a53388c"} Apr 16 13:20:44.551112 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:44.550673 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-78c99df468-jxr7j" Apr 16 13:20:44.566224 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:44.566160 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-78c99df468-jxr7j" podStartSLOduration=2.141701603 podStartE2EDuration="4.566145099s" podCreationTimestamp="2026-04-16 13:20:40 +0000 UTC" firstStartedPulling="2026-04-16 13:20:41.327138348 +0000 UTC m=+546.209494417" lastFinishedPulling="2026-04-16 13:20:43.751581831 +0000 UTC m=+548.633937913" observedRunningTime="2026-04-16 13:20:44.565273176 +0000 UTC m=+549.447629266" watchObservedRunningTime="2026-04-16 13:20:44.566145099 +0000 UTC m=+549.448501192" Apr 16 13:20:55.557395 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:20:55.557363 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-78c99df468-jxr7j" Apr 16 13:21:35.635134 ip-10-0-141-234 kubenswrapper[2571]: 
I0416 13:21:35.635105 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hdhfq_c031fad3-56b2-4030-9fb7-11cd3421145d/ovn-acl-logging/0.log" Apr 16 13:21:35.635606 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:21:35.635567 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hdhfq_c031fad3-56b2-4030-9fb7-11cd3421145d/ovn-acl-logging/0.log" Apr 16 13:22:08.155878 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:22:08.155841 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:23:28.803806 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:28.803726 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:23:39.395062 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:39.395027 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:23:41.695981 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:41.695943 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:23:46.770029 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:46.770002 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg"] Apr 16 13:23:46.773757 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:46.773736 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg" Apr 16 13:23:46.777405 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:46.777385 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-br65d\"" Apr 16 13:23:46.777481 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:46.777410 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 16 13:23:46.777481 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:46.777446 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 16 13:23:46.777560 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:46.777507 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 16 13:23:46.782161 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:46.782139 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg"] Apr 16 13:23:46.849304 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:46.849268 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/065e11ee-248e-463c-9f4a-e50b5fc2a5bb-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg\" (UID: \"065e11ee-248e-463c-9f4a-e50b5fc2a5bb\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg" Apr 16 13:23:46.849467 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:46.849322 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/065e11ee-248e-463c-9f4a-e50b5fc2a5bb-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg\" (UID: 
\"065e11ee-248e-463c-9f4a-e50b5fc2a5bb\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg" Apr 16 13:23:46.849467 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:46.849402 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/065e11ee-248e-463c-9f4a-e50b5fc2a5bb-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg\" (UID: \"065e11ee-248e-463c-9f4a-e50b5fc2a5bb\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg" Apr 16 13:23:46.849467 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:46.849449 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/065e11ee-248e-463c-9f4a-e50b5fc2a5bb-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg\" (UID: \"065e11ee-248e-463c-9f4a-e50b5fc2a5bb\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg" Apr 16 13:23:46.849612 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:46.849470 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7xnq\" (UniqueName: \"kubernetes.io/projected/065e11ee-248e-463c-9f4a-e50b5fc2a5bb-kube-api-access-b7xnq\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg\" (UID: \"065e11ee-248e-463c-9f4a-e50b5fc2a5bb\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg" Apr 16 13:23:46.849612 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:46.849531 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/065e11ee-248e-463c-9f4a-e50b5fc2a5bb-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg\" (UID: \"065e11ee-248e-463c-9f4a-e50b5fc2a5bb\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg" Apr 16 13:23:46.950832 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:46.950801 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/065e11ee-248e-463c-9f4a-e50b5fc2a5bb-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg\" (UID: \"065e11ee-248e-463c-9f4a-e50b5fc2a5bb\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg" Apr 16 13:23:46.951023 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:46.950850 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/065e11ee-248e-463c-9f4a-e50b5fc2a5bb-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg\" (UID: \"065e11ee-248e-463c-9f4a-e50b5fc2a5bb\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg" Apr 16 13:23:46.951023 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:46.950902 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/065e11ee-248e-463c-9f4a-e50b5fc2a5bb-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg\" (UID: \"065e11ee-248e-463c-9f4a-e50b5fc2a5bb\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg" Apr 16 13:23:46.951023 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:46.950933 2571 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/065e11ee-248e-463c-9f4a-e50b5fc2a5bb-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg\" (UID: \"065e11ee-248e-463c-9f4a-e50b5fc2a5bb\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg" Apr 16 13:23:46.951023 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:46.950952 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7xnq\" (UniqueName: \"kubernetes.io/projected/065e11ee-248e-463c-9f4a-e50b5fc2a5bb-kube-api-access-b7xnq\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg\" (UID: \"065e11ee-248e-463c-9f4a-e50b5fc2a5bb\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg" Apr 16 13:23:46.951023 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:46.950970 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/065e11ee-248e-463c-9f4a-e50b5fc2a5bb-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg\" (UID: \"065e11ee-248e-463c-9f4a-e50b5fc2a5bb\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg" Apr 16 13:23:46.951346 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:46.951313 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/065e11ee-248e-463c-9f4a-e50b5fc2a5bb-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg\" (UID: \"065e11ee-248e-463c-9f4a-e50b5fc2a5bb\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg" Apr 16 13:23:46.951472 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:46.951355 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/065e11ee-248e-463c-9f4a-e50b5fc2a5bb-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg\" (UID: \"065e11ee-248e-463c-9f4a-e50b5fc2a5bb\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg" Apr 16 13:23:46.951472 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:46.951410 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/065e11ee-248e-463c-9f4a-e50b5fc2a5bb-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg\" (UID: \"065e11ee-248e-463c-9f4a-e50b5fc2a5bb\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg" Apr 16 13:23:46.953047 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:46.953028 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/065e11ee-248e-463c-9f4a-e50b5fc2a5bb-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg\" (UID: \"065e11ee-248e-463c-9f4a-e50b5fc2a5bb\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg" Apr 16 13:23:46.953333 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:46.953317 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/065e11ee-248e-463c-9f4a-e50b5fc2a5bb-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg\" (UID: 
\"065e11ee-248e-463c-9f4a-e50b5fc2a5bb\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg" Apr 16 13:23:46.963768 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:46.963742 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7xnq\" (UniqueName: \"kubernetes.io/projected/065e11ee-248e-463c-9f4a-e50b5fc2a5bb-kube-api-access-b7xnq\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg\" (UID: \"065e11ee-248e-463c-9f4a-e50b5fc2a5bb\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg" Apr 16 13:23:47.085453 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:47.085383 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg" Apr 16 13:23:47.213333 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:47.213308 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg"] Apr 16 13:23:47.214734 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:23:47.214706 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod065e11ee_248e_463c_9f4a_e50b5fc2a5bb.slice/crio-8628aa20a7306db91da265d668b6407d4d0ba14f4b17b92735670631126ebc85 WatchSource:0}: Error finding container 8628aa20a7306db91da265d668b6407d4d0ba14f4b17b92735670631126ebc85: Status 404 returned error can't find the container with id 8628aa20a7306db91da265d668b6407d4d0ba14f4b17b92735670631126ebc85 Apr 16 13:23:47.216422 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:47.216407 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 13:23:47.603059 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:47.603031 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:23:48.172474 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:48.172436 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg" event={"ID":"065e11ee-248e-463c-9f4a-e50b5fc2a5bb","Type":"ContainerStarted","Data":"8628aa20a7306db91da265d668b6407d4d0ba14f4b17b92735670631126ebc85"} Apr 16 13:23:53.193065 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:53.193025 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg" event={"ID":"065e11ee-248e-463c-9f4a-e50b5fc2a5bb","Type":"ContainerStarted","Data":"5c8df0e7500c9db98360fc43a4664e69ed3fd2f29a6cb7cda97c1efd0c88d059"} Apr 16 13:23:58.213269 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:58.213236 2571 generic.go:358] "Generic (PLEG): container finished" podID="065e11ee-248e-463c-9f4a-e50b5fc2a5bb" containerID="5c8df0e7500c9db98360fc43a4664e69ed3fd2f29a6cb7cda97c1efd0c88d059" exitCode=0 Apr 16 13:23:58.213683 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:23:58.213308 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg" event={"ID":"065e11ee-248e-463c-9f4a-e50b5fc2a5bb","Type":"ContainerDied","Data":"5c8df0e7500c9db98360fc43a4664e69ed3fd2f29a6cb7cda97c1efd0c88d059"} Apr 16 13:24:03.233814 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:24:03.233782 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg" 
event={"ID":"065e11ee-248e-463c-9f4a-e50b5fc2a5bb","Type":"ContainerStarted","Data":"273ae281d9c92b429ed17e2416add7a737dd92be2cd5f53b00824fd53e1047ff"} Apr 16 13:24:03.234201 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:24:03.234034 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg" Apr 16 13:24:03.251277 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:24:03.251219 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg" podStartSLOduration=1.441882087 podStartE2EDuration="17.251202077s" podCreationTimestamp="2026-04-16 13:23:46 +0000 UTC" firstStartedPulling="2026-04-16 13:23:47.216531097 +0000 UTC m=+732.098887166" lastFinishedPulling="2026-04-16 13:24:03.02585105 +0000 UTC m=+747.908207156" observedRunningTime="2026-04-16 13:24:03.250164774 +0000 UTC m=+748.132520864" watchObservedRunningTime="2026-04-16 13:24:03.251202077 +0000 UTC m=+748.133558168" Apr 16 13:24:14.255099 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:24:14.255071 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg" Apr 16 13:24:39.289458 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:24:39.289422 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:25:33.990724 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:25:33.990692 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:25:42.389728 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:25:42.389693 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:26:12.689016 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:26:12.688980 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:26:28.399504 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:26:28.399424 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:26:35.659689 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:26:35.659658 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hdhfq_c031fad3-56b2-4030-9fb7-11cd3421145d/ovn-acl-logging/0.log" Apr 16 13:26:35.660643 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:26:35.660625 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hdhfq_c031fad3-56b2-4030-9fb7-11cd3421145d/ovn-acl-logging/0.log" Apr 16 13:27:05.992338 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:27:05.992303 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:27:22.987876 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:27:22.987830 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:27:36.896499 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:27:36.896464 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:27:53.687898 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:27:53.687798 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:28:55.484992 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:28:55.484962 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:29:04.996889 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:29:04.994161 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:29:21.489251 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:29:21.489166 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:29:29.990208 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:29:29.990171 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:29:46.398032 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:29:46.398002 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:29:54.786672 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:29:54.786630 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:30:00.140495 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:30:00.140451 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29605770-6zbg8"] Apr 16 13:30:00.144405 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:30:00.144387 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29605770-6zbg8" Apr 16 13:30:00.146850 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:30:00.146830 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-dkfvz\"" Apr 16 13:30:00.157243 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:30:00.157219 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29605770-6zbg8"] Apr 16 13:30:00.186495 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:30:00.186467 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w965l\" (UniqueName: \"kubernetes.io/projected/5dde31cd-64eb-4faf-871f-2f4f3a49c7a2-kube-api-access-w965l\") pod \"maas-api-key-cleanup-29605770-6zbg8\" (UID: \"5dde31cd-64eb-4faf-871f-2f4f3a49c7a2\") " pod="opendatahub/maas-api-key-cleanup-29605770-6zbg8" Apr 16 13:30:00.287287 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:30:00.287259 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w965l\" (UniqueName: \"kubernetes.io/projected/5dde31cd-64eb-4faf-871f-2f4f3a49c7a2-kube-api-access-w965l\") pod \"maas-api-key-cleanup-29605770-6zbg8\" (UID: \"5dde31cd-64eb-4faf-871f-2f4f3a49c7a2\") " pod="opendatahub/maas-api-key-cleanup-29605770-6zbg8" Apr 16 13:30:00.296573 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:30:00.296555 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w965l\" (UniqueName: \"kubernetes.io/projected/5dde31cd-64eb-4faf-871f-2f4f3a49c7a2-kube-api-access-w965l\") pod \"maas-api-key-cleanup-29605770-6zbg8\" (UID: \"5dde31cd-64eb-4faf-871f-2f4f3a49c7a2\") " pod="opendatahub/maas-api-key-cleanup-29605770-6zbg8" Apr 16 13:30:00.454679 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:30:00.454651 2571 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29605770-6zbg8" Apr 16 13:30:00.580067 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:30:00.580043 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29605770-6zbg8"] Apr 16 13:30:00.581951 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:30:00.581920 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dde31cd_64eb_4faf_871f_2f4f3a49c7a2.slice/crio-cfcb7236af3643667f780ded18934d06229b2d170a49b49ed74a866c7be9af33 WatchSource:0}: Error finding container cfcb7236af3643667f780ded18934d06229b2d170a49b49ed74a866c7be9af33: Status 404 returned error can't find the container with id cfcb7236af3643667f780ded18934d06229b2d170a49b49ed74a866c7be9af33 Apr 16 13:30:00.584077 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:30:00.584060 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 13:30:01.485241 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:30:01.485205 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605770-6zbg8" event={"ID":"5dde31cd-64eb-4faf-871f-2f4f3a49c7a2","Type":"ContainerStarted","Data":"cfcb7236af3643667f780ded18934d06229b2d170a49b49ed74a866c7be9af33"} Apr 16 13:30:03.493549 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:30:03.493515 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605770-6zbg8" event={"ID":"5dde31cd-64eb-4faf-871f-2f4f3a49c7a2","Type":"ContainerStarted","Data":"8140ed1f952f753663ef6c5ef1aaf076ea38d86d88db292ff31eea4e2d4b2859"} Apr 16 13:30:03.509356 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:30:03.509314 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29605770-6zbg8" podStartSLOduration=0.974166786 podStartE2EDuration="3.509300364s" podCreationTimestamp="2026-04-16 13:30:00 +0000 UTC" firstStartedPulling="2026-04-16 13:30:00.584211639 +0000 UTC m=+1105.466567710" lastFinishedPulling="2026-04-16 13:30:03.119345216 +0000 UTC m=+1108.001701288" observedRunningTime="2026-04-16 13:30:03.507149578 +0000 UTC m=+1108.389505680" watchObservedRunningTime="2026-04-16 13:30:03.509300364 +0000 UTC m=+1108.391656445" Apr 16 13:30:24.572910 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:30:24.572850 2571 generic.go:358] "Generic (PLEG): container finished" podID="5dde31cd-64eb-4faf-871f-2f4f3a49c7a2" containerID="8140ed1f952f753663ef6c5ef1aaf076ea38d86d88db292ff31eea4e2d4b2859" exitCode=6 Apr 16 13:30:24.573366 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:30:24.572970 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605770-6zbg8" event={"ID":"5dde31cd-64eb-4faf-871f-2f4f3a49c7a2","Type":"ContainerDied","Data":"8140ed1f952f753663ef6c5ef1aaf076ea38d86d88db292ff31eea4e2d4b2859"} Apr 16 13:30:24.573366 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:30:24.573334 2571 scope.go:117] "RemoveContainer" containerID="8140ed1f952f753663ef6c5ef1aaf076ea38d86d88db292ff31eea4e2d4b2859" Apr 16 13:30:25.577681 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:30:25.577651 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605770-6zbg8" event={"ID":"5dde31cd-64eb-4faf-871f-2f4f3a49c7a2","Type":"ContainerStarted","Data":"8de8034b3cd0c99979a4858b4f07cfaf775cbd9d6fed39699555a875b44bdd90"} Apr 16 13:30:27.994087 ip-10-0-141-234 
kubenswrapper[2571]: I0416 13:30:27.994053 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:30:35.989713 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:30:35.989678 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:30:45.292059 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:30:45.291988 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:30:45.655105 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:30:45.655024 2571 generic.go:358] "Generic (PLEG): container finished" podID="5dde31cd-64eb-4faf-871f-2f4f3a49c7a2" containerID="8de8034b3cd0c99979a4858b4f07cfaf775cbd9d6fed39699555a875b44bdd90" exitCode=6 Apr 16 13:30:45.655259 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:30:45.655098 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605770-6zbg8" event={"ID":"5dde31cd-64eb-4faf-871f-2f4f3a49c7a2","Type":"ContainerDied","Data":"8de8034b3cd0c99979a4858b4f07cfaf775cbd9d6fed39699555a875b44bdd90"} Apr 16 13:30:45.655259 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:30:45.655142 2571 scope.go:117] "RemoveContainer" containerID="8140ed1f952f753663ef6c5ef1aaf076ea38d86d88db292ff31eea4e2d4b2859" Apr 16 13:30:45.655490 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:30:45.655473 2571 scope.go:117] "RemoveContainer" containerID="8de8034b3cd0c99979a4858b4f07cfaf775cbd9d6fed39699555a875b44bdd90" Apr 16 13:30:45.655786 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:30:45.655766 2571 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29605770-6zbg8_opendatahub(5dde31cd-64eb-4faf-871f-2f4f3a49c7a2)\"" pod="opendatahub/maas-api-key-cleanup-29605770-6zbg8" podUID="5dde31cd-64eb-4faf-871f-2f4f3a49c7a2" Apr 16 13:30:53.391136 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:30:53.391093 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:30:59.720521 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:30:59.720482 2571 scope.go:117] "RemoveContainer" containerID="8de8034b3cd0c99979a4858b4f07cfaf775cbd9d6fed39699555a875b44bdd90" Apr 16 13:31:00.010323 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:31:00.010263 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29605770-6zbg8"] Apr 16 13:31:00.708379 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:31:00.708345 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605770-6zbg8" event={"ID":"5dde31cd-64eb-4faf-871f-2f4f3a49c7a2","Type":"ContainerStarted","Data":"ec108a34eee092499a4a211d313c9486e14e9c0db0058ebc6d49c36318947c7c"} Apr 16 13:31:00.708564 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:31:00.708415 2571 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29605770-6zbg8" podUID="5dde31cd-64eb-4faf-871f-2f4f3a49c7a2" containerName="cleanup" containerID="cri-o://ec108a34eee092499a4a211d313c9486e14e9c0db0058ebc6d49c36318947c7c" gracePeriod=30 Apr 16 13:31:01.692734 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:31:01.692703 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:31:18.888670 
ip-10-0-141-234 kubenswrapper[2571]: I0416 13:31:18.888628 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:31:20.653982 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:31:20.653959 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29605770-6zbg8" Apr 16 13:31:20.780729 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:31:20.780640 2571 generic.go:358] "Generic (PLEG): container finished" podID="5dde31cd-64eb-4faf-871f-2f4f3a49c7a2" containerID="ec108a34eee092499a4a211d313c9486e14e9c0db0058ebc6d49c36318947c7c" exitCode=6 Apr 16 13:31:20.780729 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:31:20.780713 2571 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29605770-6zbg8" Apr 16 13:31:20.780980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:31:20.780724 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605770-6zbg8" event={"ID":"5dde31cd-64eb-4faf-871f-2f4f3a49c7a2","Type":"ContainerDied","Data":"ec108a34eee092499a4a211d313c9486e14e9c0db0058ebc6d49c36318947c7c"} Apr 16 13:31:20.780980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:31:20.780760 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29605770-6zbg8" event={"ID":"5dde31cd-64eb-4faf-871f-2f4f3a49c7a2","Type":"ContainerDied","Data":"cfcb7236af3643667f780ded18934d06229b2d170a49b49ed74a866c7be9af33"} Apr 16 13:31:20.780980 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:31:20.780776 2571 scope.go:117] "RemoveContainer" containerID="ec108a34eee092499a4a211d313c9486e14e9c0db0058ebc6d49c36318947c7c" Apr 16 13:31:20.788702 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:31:20.788685 2571 scope.go:117] "RemoveContainer" containerID="8de8034b3cd0c99979a4858b4f07cfaf775cbd9d6fed39699555a875b44bdd90" Apr 16 13:31:20.792762 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:31:20.792745 2571 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w965l\" (UniqueName: \"kubernetes.io/projected/5dde31cd-64eb-4faf-871f-2f4f3a49c7a2-kube-api-access-w965l\") pod \"5dde31cd-64eb-4faf-871f-2f4f3a49c7a2\" (UID: \"5dde31cd-64eb-4faf-871f-2f4f3a49c7a2\") " Apr 16 13:31:20.794912 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:31:20.794887 2571 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5dde31cd-64eb-4faf-871f-2f4f3a49c7a2-kube-api-access-w965l" (OuterVolumeSpecName: "kube-api-access-w965l") pod "5dde31cd-64eb-4faf-871f-2f4f3a49c7a2" (UID: "5dde31cd-64eb-4faf-871f-2f4f3a49c7a2"). InnerVolumeSpecName "kube-api-access-w965l". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 13:31:20.795717 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:31:20.795681 2571 scope.go:117] "RemoveContainer" containerID="ec108a34eee092499a4a211d313c9486e14e9c0db0058ebc6d49c36318947c7c" Apr 16 13:31:20.795954 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:31:20.795933 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec108a34eee092499a4a211d313c9486e14e9c0db0058ebc6d49c36318947c7c\": container with ID starting with ec108a34eee092499a4a211d313c9486e14e9c0db0058ebc6d49c36318947c7c not found: ID does not exist" containerID="ec108a34eee092499a4a211d313c9486e14e9c0db0058ebc6d49c36318947c7c" Apr 16 13:31:20.796037 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:31:20.795975 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec108a34eee092499a4a211d313c9486e14e9c0db0058ebc6d49c36318947c7c"} err="failed to get container status \"ec108a34eee092499a4a211d313c9486e14e9c0db0058ebc6d49c36318947c7c\": rpc error: code = NotFound desc = could not find container \"ec108a34eee092499a4a211d313c9486e14e9c0db0058ebc6d49c36318947c7c\": container with ID starting with ec108a34eee092499a4a211d313c9486e14e9c0db0058ebc6d49c36318947c7c not found: ID does not exist" Apr 16 13:31:20.796037 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:31:20.795999 2571 scope.go:117] "RemoveContainer" containerID="8de8034b3cd0c99979a4858b4f07cfaf775cbd9d6fed39699555a875b44bdd90" Apr 16 13:31:20.796249 ip-10-0-141-234 kubenswrapper[2571]: E0416 13:31:20.796229 2571 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8de8034b3cd0c99979a4858b4f07cfaf775cbd9d6fed39699555a875b44bdd90\": container with ID starting with 8de8034b3cd0c99979a4858b4f07cfaf775cbd9d6fed39699555a875b44bdd90 not found: ID does not exist" containerID="8de8034b3cd0c99979a4858b4f07cfaf775cbd9d6fed39699555a875b44bdd90" Apr 16 13:31:20.796293 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:31:20.796260 2571 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8de8034b3cd0c99979a4858b4f07cfaf775cbd9d6fed39699555a875b44bdd90"} err="failed to get container status \"8de8034b3cd0c99979a4858b4f07cfaf775cbd9d6fed39699555a875b44bdd90\": rpc error: code = NotFound desc = could not find container \"8de8034b3cd0c99979a4858b4f07cfaf775cbd9d6fed39699555a875b44bdd90\": container with ID starting with 8de8034b3cd0c99979a4858b4f07cfaf775cbd9d6fed39699555a875b44bdd90 not found: ID does not exist" Apr 16 13:31:20.894015 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:31:20.893988 2571 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w965l\" (UniqueName: \"kubernetes.io/projected/5dde31cd-64eb-4faf-871f-2f4f3a49c7a2-kube-api-access-w965l\") on node \"ip-10-0-141-234.ec2.internal\" DevicePath \"\"" Apr 16 13:31:21.100387 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:31:21.100363 2571 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29605770-6zbg8"] Apr 16 13:31:21.104200 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:31:21.104175 2571 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29605770-6zbg8"] Apr 16 13:31:21.724137 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:31:21.724109 2571 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5dde31cd-64eb-4faf-871f-2f4f3a49c7a2" 
path="/var/lib/kubelet/pods/5dde31cd-64eb-4faf-871f-2f4f3a49c7a2/volumes" Apr 16 13:31:30.190993 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:31:30.190959 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:31:35.684429 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:31:35.684400 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hdhfq_c031fad3-56b2-4030-9fb7-11cd3421145d/ovn-acl-logging/0.log" Apr 16 13:31:35.686376 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:31:35.686355 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hdhfq_c031fad3-56b2-4030-9fb7-11cd3421145d/ovn-acl-logging/0.log" Apr 16 13:32:26.303735 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:32:26.303703 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:32:34.092539 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:32:34.092506 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:32:43.587148 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:32:43.587116 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:32:52.093044 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:32:52.093012 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:33:02.284680 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:33:02.284647 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:33:10.194943 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:33:10.194909 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:33:18.992848 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:33:18.988782 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:33:27.595281 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:33:27.595245 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:33:36.720484 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:33:36.720451 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:33:44.733404 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:33:44.733367 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:33:53.588815 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:33:53.588778 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:34:02.791195 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:34:02.791158 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:34:11.788628 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:34:11.788595 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:34:19.989390 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:34:19.989354 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:34:28.892737 
ip-10-0-141-234 kubenswrapper[2571]: I0416 13:34:28.892701 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:34:36.690659 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:34:36.690616 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:34:46.189196 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:34:46.189153 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:34:54.395256 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:34:54.395224 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:36:05.487365 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:36:05.487330 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:36:12.095050 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:36:12.095019 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:36:22.087152 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:36:22.087117 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:36:35.707972 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:36:35.707937 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hdhfq_c031fad3-56b2-4030-9fb7-11cd3421145d/ovn-acl-logging/0.log" Apr 16 13:36:35.710490 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:36:35.710469 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hdhfq_c031fad3-56b2-4030-9fb7-11cd3421145d/ovn-acl-logging/0.log" Apr 16 13:36:52.691447 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:36:52.691371 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:37:35.487387 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:37:35.487348 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:37:43.592673 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:37:43.592635 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:37:52.469825 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:37:52.469786 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:38:00.889205 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:38:00.889165 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:38:09.699919 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:38:09.699884 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:38:18.189088 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:38:18.189003 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:38:27.888740 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:38:27.888711 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:38:34.593190 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:38:34.593156 2571 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:38:43.797154 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:38:43.797119 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:38:51.393906 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:38:51.393856 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:38:59.992315 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:38:59.992280 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:39:07.984029 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:39:07.983992 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:39:25.889861 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:39:25.889827 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:39:33.484096 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:39:33.484060 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:39:53.687657 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:39:53.687581 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:40:01.289446 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:40:01.289413 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:40:18.091581 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:40:18.091543 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:40:26.586759 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:40:26.586726 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:40:35.696053 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:40:35.696017 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:40:43.989226 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:40:43.989194 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:40:52.797551 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:40:52.797521 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:41:01.392699 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:41:01.392667 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:41:10.400168 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:41:10.400131 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:41:27.494070 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:41:27.494033 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:41:35.736614 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:41:35.736586 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hdhfq_c031fad3-56b2-4030-9fb7-11cd3421145d/ovn-acl-logging/0.log" Apr 16 13:41:35.739166 ip-10-0-141-234 kubenswrapper[2571]: I0416 
13:41:35.739144 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hdhfq_c031fad3-56b2-4030-9fb7-11cd3421145d/ovn-acl-logging/0.log" Apr 16 13:41:36.397461 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:41:36.397429 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:41:53.295923 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:41:53.295884 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:42:02.087751 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:42:02.087715 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:42:07.196226 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:42:07.196195 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:42:17.194700 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:42:17.194665 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:42:23.685788 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:42:23.685752 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:42:40.791497 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:42:40.791455 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:42:49.293231 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:42:49.293147 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:42:58.396113 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:42:58.396075 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:43:05.296324 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:43:05.296288 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:43:29.690483 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:43:29.690446 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:43:43.187015 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:43:43.186980 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-78c99df468-jxr7j"] Apr 16 13:44:13.867749 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:13.867718 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5889847794-zg6lg_cc078a07-4755-4732-9de2-e597e3972b03/manager/0.log" Apr 16 13:44:16.852469 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:16.852397 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-njk8v_1e8ef80b-df28-4c75-b3be-6327f30d026d/manager/0.log" Apr 16 13:44:17.143110 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:17.143027 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-wjgsr_fbef5d66-8737-4eaf-8c7e-e9f8e5f7718a/registry-server/0.log" Apr 16 13:44:17.295467 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:17.295436 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-d27wd_6fcf5d2a-f8a7-4d2a-929b-cf4a2923f09d/manager/0.log" Apr 16 13:44:17.446809 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:17.446781 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-jxr7j_43738a75-f09d-4eaf-8a1c-5ba422c1a531/limitador/0.log" Apr 16 13:44:17.582247 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:17.582220 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-7tzl6_d6482a12-0918-4bfb-a033-bea250f1c21f/manager/0.log" Apr 16 13:44:17.950791 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:17.950762 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h_7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b/istio-proxy/0.log" Apr 16 13:44:18.462310 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:18.462283 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-prpq5_074f941c-a2f8-43bd-b264-706a6ceb4802/istio-proxy/0.log" Apr 16 13:44:18.583275 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:18.583252 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-68f85f86cb-pvhn6_e8211e71-ff44-4db8-b401-fc031e0edd43/router/0.log" Apr 16 13:44:19.282083 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:19.282058 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg_065e11ee-248e-463c-9f4a-e50b5fc2a5bb/storage-initializer/0.log" Apr 16 13:44:19.289156 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:19.289136 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-699dcc86cjg_065e11ee-248e-463c-9f4a-e50b5fc2a5bb/main/0.log" Apr 16 13:44:23.524158 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:23.524127 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gbjl6/must-gather-bfk2x"] Apr 16 13:44:23.524515 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:23.524503 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5dde31cd-64eb-4faf-871f-2f4f3a49c7a2" containerName="cleanup" Apr 16 13:44:23.524515 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:23.524516 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dde31cd-64eb-4faf-871f-2f4f3a49c7a2" containerName="cleanup" Apr 16 13:44:23.524587 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:23.524538 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5dde31cd-64eb-4faf-871f-2f4f3a49c7a2" containerName="cleanup" Apr 16 13:44:23.524587 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:23.524544 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dde31cd-64eb-4faf-871f-2f4f3a49c7a2" containerName="cleanup" Apr 16 13:44:23.524648 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:23.524600 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5dde31cd-64eb-4faf-871f-2f4f3a49c7a2" containerName="cleanup" Apr 16 13:44:23.524648 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:23.524610 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5dde31cd-64eb-4faf-871f-2f4f3a49c7a2" containerName="cleanup" Apr 16 13:44:23.524711 ip-10-0-141-234 kubenswrapper[2571]: 
I0416 13:44:23.524668 2571 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5dde31cd-64eb-4faf-871f-2f4f3a49c7a2" containerName="cleanup" Apr 16 13:44:23.524711 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:23.524673 2571 state_mem.go:107] "Deleted CPUSet assignment" podUID="5dde31cd-64eb-4faf-871f-2f4f3a49c7a2" containerName="cleanup" Apr 16 13:44:23.524774 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:23.524731 2571 memory_manager.go:356] "RemoveStaleState removing state" podUID="5dde31cd-64eb-4faf-871f-2f4f3a49c7a2" containerName="cleanup" Apr 16 13:44:23.528059 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:23.528039 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gbjl6/must-gather-bfk2x" Apr 16 13:44:23.530855 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:23.530835 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gbjl6\"/\"openshift-service-ca.crt\"" Apr 16 13:44:23.531792 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:23.531779 2571 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-gbjl6\"/\"default-dockercfg-tfgq9\"" Apr 16 13:44:23.531856 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:23.531823 2571 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gbjl6\"/\"kube-root-ca.crt\"" Apr 16 13:44:23.543141 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:23.543121 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gbjl6/must-gather-bfk2x"] Apr 16 13:44:23.580349 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:23.580325 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f7d57230-fbd4-45b9-bf79-177af07ceeca-must-gather-output\") pod \"must-gather-bfk2x\" (UID: \"f7d57230-fbd4-45b9-bf79-177af07ceeca\") " pod="openshift-must-gather-gbjl6/must-gather-bfk2x" Apr 16 13:44:23.580444 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:23.580376 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ds4p2\" (UniqueName: \"kubernetes.io/projected/f7d57230-fbd4-45b9-bf79-177af07ceeca-kube-api-access-ds4p2\") pod \"must-gather-bfk2x\" (UID: \"f7d57230-fbd4-45b9-bf79-177af07ceeca\") " pod="openshift-must-gather-gbjl6/must-gather-bfk2x" Apr 16 13:44:23.681620 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:23.681597 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ds4p2\" (UniqueName: \"kubernetes.io/projected/f7d57230-fbd4-45b9-bf79-177af07ceeca-kube-api-access-ds4p2\") pod \"must-gather-bfk2x\" (UID: \"f7d57230-fbd4-45b9-bf79-177af07ceeca\") " pod="openshift-must-gather-gbjl6/must-gather-bfk2x" Apr 16 13:44:23.681718 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:23.681671 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f7d57230-fbd4-45b9-bf79-177af07ceeca-must-gather-output\") pod \"must-gather-bfk2x\" (UID: \"f7d57230-fbd4-45b9-bf79-177af07ceeca\") " pod="openshift-must-gather-gbjl6/must-gather-bfk2x" Apr 16 13:44:23.681963 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:23.681949 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/f7d57230-fbd4-45b9-bf79-177af07ceeca-must-gather-output\") pod \"must-gather-bfk2x\" (UID: \"f7d57230-fbd4-45b9-bf79-177af07ceeca\") " pod="openshift-must-gather-gbjl6/must-gather-bfk2x" Apr 16 13:44:23.691460 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:23.691439 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ds4p2\" (UniqueName: \"kubernetes.io/projected/f7d57230-fbd4-45b9-bf79-177af07ceeca-kube-api-access-ds4p2\") pod \"must-gather-bfk2x\" (UID: \"f7d57230-fbd4-45b9-bf79-177af07ceeca\") " pod="openshift-must-gather-gbjl6/must-gather-bfk2x" Apr 16 13:44:23.846532 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:23.846466 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gbjl6/must-gather-bfk2x" Apr 16 13:44:23.978440 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:23.978414 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gbjl6/must-gather-bfk2x"] Apr 16 13:44:23.980228 ip-10-0-141-234 kubenswrapper[2571]: W0416 13:44:23.980199 2571 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf7d57230_fbd4_45b9_bf79_177af07ceeca.slice/crio-d2d63eeb185431b9d964415d37cf9d27478762927aa9f4641ddef4cd5d32fda6 WatchSource:0}: Error finding container d2d63eeb185431b9d964415d37cf9d27478762927aa9f4641ddef4cd5d32fda6: Status 404 returned error can't find the container with id d2d63eeb185431b9d964415d37cf9d27478762927aa9f4641ddef4cd5d32fda6 Apr 16 13:44:23.981944 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:23.981923 2571 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 13:44:24.449600 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:24.449561 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gbjl6/must-gather-bfk2x" event={"ID":"f7d57230-fbd4-45b9-bf79-177af07ceeca","Type":"ContainerStarted","Data":"d2d63eeb185431b9d964415d37cf9d27478762927aa9f4641ddef4cd5d32fda6"} Apr 16 13:44:25.455822 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:25.455787 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gbjl6/must-gather-bfk2x" event={"ID":"f7d57230-fbd4-45b9-bf79-177af07ceeca","Type":"ContainerStarted","Data":"b9ac54ee94af670afc45692d950c24ea960f4ec9b1ae40169bd402c96ebfa379"} Apr 16 13:44:25.455822 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:25.455827 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gbjl6/must-gather-bfk2x" event={"ID":"f7d57230-fbd4-45b9-bf79-177af07ceeca","Type":"ContainerStarted","Data":"e6845d4834012e85284cb35cf2f12edc9b0d713af10a238c14fe1f36da529b07"} Apr 16 13:44:25.472708 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:25.472661 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gbjl6/must-gather-bfk2x" podStartSLOduration=1.5205974530000002 podStartE2EDuration="2.472644932s" podCreationTimestamp="2026-04-16 13:44:23 +0000 UTC" firstStartedPulling="2026-04-16 13:44:23.982055089 +0000 UTC m=+1968.864411158" lastFinishedPulling="2026-04-16 13:44:24.934102554 +0000 UTC m=+1969.816458637" observedRunningTime="2026-04-16 13:44:25.470590688 +0000 UTC m=+1970.352946780" watchObservedRunningTime="2026-04-16 13:44:25.472644932 +0000 UTC m=+1970.355001022" Apr 16 13:44:26.558800 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:26.558766 2571 log.go:25] "Finished 
parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-md8rb_a9da276b-7d39-4d22-8b85-9d91a9a39f32/global-pull-secret-syncer/0.log" Apr 16 13:44:26.653030 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:26.652997 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-tpklv_1b2f0cf7-6280-412b-b2df-63712a2a8772/konnectivity-agent/0.log" Apr 16 13:44:26.698080 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:26.698051 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-141-234.ec2.internal_01d54e665ffc42a9ee67fe77b01ddd10/haproxy/0.log" Apr 16 13:44:31.005910 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:31.005848 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-njk8v_1e8ef80b-df28-4c75-b3be-6327f30d026d/manager/0.log" Apr 16 13:44:31.072887 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:31.072842 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-wjgsr_fbef5d66-8737-4eaf-8c7e-e9f8e5f7718a/registry-server/0.log" Apr 16 13:44:31.236797 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:31.236765 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6bc9f4c76f-d27wd_6fcf5d2a-f8a7-4d2a-929b-cf4a2923f09d/manager/0.log" Apr 16 13:44:31.265200 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:31.265070 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-78c99df468-jxr7j_43738a75-f09d-4eaf-8a1c-5ba422c1a531/limitador/0.log" Apr 16 13:44:31.353300 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:31.353255 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-7tzl6_d6482a12-0918-4bfb-a033-bea250f1c21f/manager/0.log" Apr 16 13:44:32.913211 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:32.913140 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-7qknj_55e8eb8e-280c-46c9-bcfc-5d796a915163/cluster-monitoring-operator/0.log" Apr 16 13:44:32.941407 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:32.941380 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-bpckk_1f804495-e6ac-4249-a4ec-add4bb87962d/kube-state-metrics/0.log" Apr 16 13:44:32.958400 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:32.958371 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-bpckk_1f804495-e6ac-4249-a4ec-add4bb87962d/kube-rbac-proxy-main/0.log" Apr 16 13:44:32.983088 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:32.983053 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-bpckk_1f804495-e6ac-4249-a4ec-add4bb87962d/kube-rbac-proxy-self/0.log" Apr 16 13:44:33.018898 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:33.018795 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-6685444475-7kmjj_135cc650-07e2-4bdb-9c50-421e8792e403/metrics-server/0.log" Apr 16 13:44:33.071448 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:33.071420 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gvcnb_6fe9255b-fcb4-4c6a-a54b-0f8295617b96/node-exporter/0.log" Apr 
16 13:44:33.091987 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:33.091961 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gvcnb_6fe9255b-fcb4-4c6a-a54b-0f8295617b96/kube-rbac-proxy/0.log" Apr 16 13:44:33.111523 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:33.111501 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gvcnb_6fe9255b-fcb4-4c6a-a54b-0f8295617b96/init-textfile/0.log" Apr 16 13:44:33.386507 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:33.386430 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_40e30a81-7c1d-468e-b2be-f495297f613d/prometheus/0.log" Apr 16 13:44:33.406000 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:33.405960 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_40e30a81-7c1d-468e-b2be-f495297f613d/config-reloader/0.log" Apr 16 13:44:33.430056 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:33.430025 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_40e30a81-7c1d-468e-b2be-f495297f613d/thanos-sidecar/0.log" Apr 16 13:44:33.451408 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:33.451380 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_40e30a81-7c1d-468e-b2be-f495297f613d/kube-rbac-proxy-web/0.log" Apr 16 13:44:33.475252 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:33.475174 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_40e30a81-7c1d-468e-b2be-f495297f613d/kube-rbac-proxy/0.log" Apr 16 13:44:33.495991 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:33.495947 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_40e30a81-7c1d-468e-b2be-f495297f613d/kube-rbac-proxy-thanos/0.log" Apr 16 13:44:33.514185 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:33.514151 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_40e30a81-7c1d-468e-b2be-f495297f613d/init-config-reloader/0.log" Apr 16 13:44:33.542063 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:33.542017 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-29686_a6cd6279-f079-4e59-a738-c74e05d8556d/prometheus-operator/0.log" Apr 16 13:44:33.565087 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:33.565051 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-29686_a6cd6279-f079-4e59-a738-c74e05d8556d/kube-rbac-proxy/0.log" Apr 16 13:44:33.620779 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:33.620703 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6945cfd779-2pgcb_11525182-7b22-4a16-9a14-6bcdc9289b9d/telemeter-client/0.log" Apr 16 13:44:33.640938 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:33.640813 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6945cfd779-2pgcb_11525182-7b22-4a16-9a14-6bcdc9289b9d/reload/0.log" Apr 16 13:44:33.663534 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:33.663506 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6945cfd779-2pgcb_11525182-7b22-4a16-9a14-6bcdc9289b9d/kube-rbac-proxy/0.log" Apr 16 13:44:35.290401 ip-10-0-141-234 
kubenswrapper[2571]: I0416 13:44:35.290367 2571 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gbjl6/perf-node-gather-daemonset-smpnn"] Apr 16 13:44:35.294774 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:35.294746 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-smpnn" Apr 16 13:44:35.303998 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:35.303975 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gbjl6/perf-node-gather-daemonset-smpnn"] Apr 16 13:44:35.413498 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:35.413462 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a29d893-8009-49ea-9c88-52be8c0cc75b-sys\") pod \"perf-node-gather-daemonset-smpnn\" (UID: \"1a29d893-8009-49ea-9c88-52be8c0cc75b\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-smpnn" Apr 16 13:44:35.413752 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:35.413731 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1a29d893-8009-49ea-9c88-52be8c0cc75b-podres\") pod \"perf-node-gather-daemonset-smpnn\" (UID: \"1a29d893-8009-49ea-9c88-52be8c0cc75b\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-smpnn" Apr 16 13:44:35.413969 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:35.413949 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1a29d893-8009-49ea-9c88-52be8c0cc75b-proc\") pod \"perf-node-gather-daemonset-smpnn\" (UID: \"1a29d893-8009-49ea-9c88-52be8c0cc75b\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-smpnn" Apr 16 13:44:35.414138 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:35.414121 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7f64\" (UniqueName: \"kubernetes.io/projected/1a29d893-8009-49ea-9c88-52be8c0cc75b-kube-api-access-f7f64\") pod \"perf-node-gather-daemonset-smpnn\" (UID: \"1a29d893-8009-49ea-9c88-52be8c0cc75b\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-smpnn" Apr 16 13:44:35.414265 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:35.414251 2571 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1a29d893-8009-49ea-9c88-52be8c0cc75b-lib-modules\") pod \"perf-node-gather-daemonset-smpnn\" (UID: \"1a29d893-8009-49ea-9c88-52be8c0cc75b\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-smpnn" Apr 16 13:44:35.515033 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:35.514989 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7f64\" (UniqueName: \"kubernetes.io/projected/1a29d893-8009-49ea-9c88-52be8c0cc75b-kube-api-access-f7f64\") pod \"perf-node-gather-daemonset-smpnn\" (UID: \"1a29d893-8009-49ea-9c88-52be8c0cc75b\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-smpnn" Apr 16 13:44:35.515221 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:35.515055 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1a29d893-8009-49ea-9c88-52be8c0cc75b-lib-modules\") pod 
\"perf-node-gather-daemonset-smpnn\" (UID: \"1a29d893-8009-49ea-9c88-52be8c0cc75b\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-smpnn" Apr 16 13:44:35.515221 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:35.515166 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a29d893-8009-49ea-9c88-52be8c0cc75b-sys\") pod \"perf-node-gather-daemonset-smpnn\" (UID: \"1a29d893-8009-49ea-9c88-52be8c0cc75b\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-smpnn" Apr 16 13:44:35.515221 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:35.515192 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1a29d893-8009-49ea-9c88-52be8c0cc75b-podres\") pod \"perf-node-gather-daemonset-smpnn\" (UID: \"1a29d893-8009-49ea-9c88-52be8c0cc75b\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-smpnn" Apr 16 13:44:35.515398 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:35.515262 2571 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1a29d893-8009-49ea-9c88-52be8c0cc75b-proc\") pod \"perf-node-gather-daemonset-smpnn\" (UID: \"1a29d893-8009-49ea-9c88-52be8c0cc75b\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-smpnn" Apr 16 13:44:35.515398 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:35.515368 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/1a29d893-8009-49ea-9c88-52be8c0cc75b-proc\") pod \"perf-node-gather-daemonset-smpnn\" (UID: \"1a29d893-8009-49ea-9c88-52be8c0cc75b\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-smpnn" Apr 16 13:44:35.515856 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:35.515828 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1a29d893-8009-49ea-9c88-52be8c0cc75b-lib-modules\") pod \"perf-node-gather-daemonset-smpnn\" (UID: \"1a29d893-8009-49ea-9c88-52be8c0cc75b\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-smpnn" Apr 16 13:44:35.515977 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:35.515919 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1a29d893-8009-49ea-9c88-52be8c0cc75b-sys\") pod \"perf-node-gather-daemonset-smpnn\" (UID: \"1a29d893-8009-49ea-9c88-52be8c0cc75b\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-smpnn" Apr 16 13:44:35.516036 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:35.515998 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/1a29d893-8009-49ea-9c88-52be8c0cc75b-podres\") pod \"perf-node-gather-daemonset-smpnn\" (UID: \"1a29d893-8009-49ea-9c88-52be8c0cc75b\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-smpnn" Apr 16 13:44:35.526705 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:35.526674 2571 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7f64\" (UniqueName: \"kubernetes.io/projected/1a29d893-8009-49ea-9c88-52be8c0cc75b-kube-api-access-f7f64\") pod \"perf-node-gather-daemonset-smpnn\" (UID: \"1a29d893-8009-49ea-9c88-52be8c0cc75b\") " pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-smpnn" Apr 16 13:44:35.608652 ip-10-0-141-234 kubenswrapper[2571]: I0416 
13:44:35.608571 2571 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-smpnn" Apr 16 13:44:35.795213 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:35.794211 2571 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gbjl6/perf-node-gather-daemonset-smpnn"] Apr 16 13:44:36.507551 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:36.507513 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-smpnn" event={"ID":"1a29d893-8009-49ea-9c88-52be8c0cc75b","Type":"ContainerStarted","Data":"0cdd7c7cecc8c89e4283e2ef14ba061495acbe63908a5d1a85db930e3c346453"} Apr 16 13:44:36.507551 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:36.507552 2571 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-smpnn" event={"ID":"1a29d893-8009-49ea-9c88-52be8c0cc75b","Type":"ContainerStarted","Data":"089588246d588893bf17d2868e58adb1d5cdbe872bd1b30f80e8c22d407893f4"} Apr 16 13:44:36.508082 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:36.507586 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-smpnn" Apr 16 13:44:36.523605 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:36.523548 2571 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-smpnn" podStartSLOduration=1.523529234 podStartE2EDuration="1.523529234s" podCreationTimestamp="2026-04-16 13:44:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:44:36.522020508 +0000 UTC m=+1981.404376610" watchObservedRunningTime="2026-04-16 13:44:36.523529234 +0000 UTC m=+1981.405885324" Apr 16 13:44:37.388250 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:37.388221 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pcp5l_ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc/dns/0.log" Apr 16 13:44:37.404701 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:37.404676 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pcp5l_ed02d9f0-3e2f-4d74-bdd4-e407126d8bbc/kube-rbac-proxy/0.log" Apr 16 13:44:37.465089 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:37.465062 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-sprnh_446804f0-271f-479a-89e6-b8b25ec2e701/dns-node-resolver/0.log" Apr 16 13:44:37.966957 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:37.966921 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-79ddbc568d-wlbcc_71d038c0-6e92-41d6-bb20-ac23854174ca/registry/0.log" Apr 16 13:44:38.004556 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:38.004530 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-njx5f_03556f6f-6803-432c-8012-c40eb6e388ad/node-ca/0.log" Apr 16 13:44:38.814585 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:38.814556 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-5cb8b776cf8tb6h_7aa9d18e-6e93-4aaf-ac74-0d44e9a9486b/istio-proxy/0.log" Apr 16 13:44:39.037107 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:39.037073 2571 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-58b6f876-prpq5_074f941c-a2f8-43bd-b264-706a6ceb4802/istio-proxy/0.log" Apr 16 13:44:39.056099 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:39.056077 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-68f85f86cb-pvhn6_e8211e71-ff44-4db8-b401-fc031e0edd43/router/0.log" Apr 16 13:44:39.550713 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:39.550682 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-rxwhc_e3e13a4d-74db-43e1-b7d8-cddf502adb4c/serve-healthcheck-canary/0.log" Apr 16 13:44:40.021604 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:40.021573 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-xdltb_4634cc1d-d3ff-44bd-9edd-28902d1bbd65/insights-operator/0.log" Apr 16 13:44:40.022277 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:40.022242 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-xdltb_4634cc1d-d3ff-44bd-9edd-28902d1bbd65/insights-operator/1.log" Apr 16 13:44:40.148379 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:40.148348 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-f7h7d_ae4434f3-4c5d-49c8-b89e-d926620456ca/kube-rbac-proxy/0.log" Apr 16 13:44:40.165062 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:40.165037 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-f7h7d_ae4434f3-4c5d-49c8-b89e-d926620456ca/exporter/0.log" Apr 16 13:44:40.182856 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:40.182833 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-f7h7d_ae4434f3-4c5d-49c8-b89e-d926620456ca/extractor/0.log" Apr 16 13:44:42.289742 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:42.289700 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-5889847794-zg6lg_cc078a07-4755-4732-9de2-e597e3972b03/manager/0.log" Apr 16 13:44:42.522435 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:42.522409 2571 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-gbjl6/perf-node-gather-daemonset-smpnn" Apr 16 13:44:43.550958 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:43.550919 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-64f4647cd-vjj4c_b054e48f-a869-44ce-982d-a9ca5b8dc577/manager/0.log" Apr 16 13:44:49.688386 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:49.688356 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nrklc_8921cd74-6824-4c24-896a-5c649cefc5da/kube-multus-additional-cni-plugins/0.log" Apr 16 13:44:49.706407 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:49.706377 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nrklc_8921cd74-6824-4c24-896a-5c649cefc5da/egress-router-binary-copy/0.log" Apr 16 13:44:49.725513 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:49.725491 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nrklc_8921cd74-6824-4c24-896a-5c649cefc5da/cni-plugins/0.log" Apr 16 13:44:49.743485 ip-10-0-141-234 kubenswrapper[2571]: 
I0416 13:44:49.743459 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nrklc_8921cd74-6824-4c24-896a-5c649cefc5da/bond-cni-plugin/0.log" Apr 16 13:44:49.760960 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:49.760932 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nrklc_8921cd74-6824-4c24-896a-5c649cefc5da/routeoverride-cni/0.log" Apr 16 13:44:49.779711 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:49.779687 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nrklc_8921cd74-6824-4c24-896a-5c649cefc5da/whereabouts-cni-bincopy/0.log" Apr 16 13:44:49.797599 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:49.797582 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nrklc_8921cd74-6824-4c24-896a-5c649cefc5da/whereabouts-cni/0.log" Apr 16 13:44:49.860472 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:49.860441 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-sxgjd_38405315-7b9b-4c43-82bd-042c8486a193/kube-multus/0.log" Apr 16 13:44:49.939433 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:49.939361 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-h8fnx_f57b15af-9441-4822-9c41-048d94ab4c1a/network-metrics-daemon/0.log" Apr 16 13:44:49.959198 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:49.959174 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-h8fnx_f57b15af-9441-4822-9c41-048d94ab4c1a/kube-rbac-proxy/0.log" Apr 16 13:44:50.836706 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:50.836676 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hdhfq_c031fad3-56b2-4030-9fb7-11cd3421145d/ovn-controller/0.log" Apr 16 13:44:50.851851 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:50.851823 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hdhfq_c031fad3-56b2-4030-9fb7-11cd3421145d/ovn-acl-logging/0.log" Apr 16 13:44:50.870015 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:50.869994 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hdhfq_c031fad3-56b2-4030-9fb7-11cd3421145d/ovn-acl-logging/1.log" Apr 16 13:44:50.891163 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:50.891110 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hdhfq_c031fad3-56b2-4030-9fb7-11cd3421145d/kube-rbac-proxy-node/0.log" Apr 16 13:44:50.910206 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:50.910182 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hdhfq_c031fad3-56b2-4030-9fb7-11cd3421145d/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 13:44:50.924251 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:50.924233 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hdhfq_c031fad3-56b2-4030-9fb7-11cd3421145d/northd/0.log" Apr 16 13:44:50.942388 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:50.942366 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hdhfq_c031fad3-56b2-4030-9fb7-11cd3421145d/nbdb/0.log" Apr 16 13:44:50.959554 ip-10-0-141-234 kubenswrapper[2571]: I0416 
13:44:50.959537 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hdhfq_c031fad3-56b2-4030-9fb7-11cd3421145d/sbdb/0.log" Apr 16 13:44:51.164615 ip-10-0-141-234 kubenswrapper[2571]: I0416 13:44:51.164548 2571 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-hdhfq_c031fad3-56b2-4030-9fb7-11cd3421145d/ovnkube-controller/0.log"