Apr 16 16:45:12.254568 ip-10-0-128-130 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 16:45:12.254577 ip-10-0-128-130 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 16:45:12.254584 ip-10-0-128-130 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 16:45:12.254830 ip-10-0-128-130 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 16:45:22.402599 ip-10-0-128-130 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 16:45:22.402617 ip-10-0-128-130 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot f5db7ab7b6c0460c90bb7c7bc4fbb0dd --
Apr 16 16:47:46.675842 ip-10-0-128-130 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 16:47:47.114799 ip-10-0-128-130 kubenswrapper[2568]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:47:47.114799 ip-10-0-128-130 kubenswrapper[2568]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 16:47:47.114799 ip-10-0-128-130 kubenswrapper[2568]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:47:47.114799 ip-10-0-128-130 kubenswrapper[2568]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
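The entries above show the first boot attempt failing: systemd could not load the unit's environment files (result 'resources'), and the restart could not be scheduled because crio.service did not exist yet. When triaging a journal excerpt like this, it can help to pull out just the distinct systemd failure messages for the unit. Not part of the log itself — a minimal sketch with a hypothetical `failure_reasons` helper:

```python
import re

def failure_reasons(journal_lines):
    """Collect the distinct systemd failure messages for kubelet.service.

    Hypothetical helper: scans journal lines for kubelet.service entries
    that systemd flagged as failures and returns the message part,
    preserving first-seen order.
    """
    reasons = []
    for line in journal_lines:
        m = re.search(r"kubelet\.service: (Failed .+)$", line)
        if m and m.group(1) not in reasons:
            reasons.append(m.group(1))
    return reasons

lines = [
    "Apr 16 16:45:12.254568 ip-10-0-128-130 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory",
    "Apr 16 16:45:12.254584 ip-10-0-128-130 systemd[1]: kubelet.service: Failed with result 'resources'.",
    "Apr 16 16:45:22.402599 ip-10-0-128-130 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.",
]
print(failure_reasons(lines))
```

After the `-- Boot … --` marker the node has been reprovisioned and the unit starts normally, so only the pre-reboot lines match this filter.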
Apr 16 16:47:47.114799 ip-10-0-128-130 kubenswrapper[2568]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:47:47.117735 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.117665 2568 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 16:47:47.123629 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123615 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:47:47.123629 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123629 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:47:47.123691 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123633 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:47:47.123691 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123636 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:47:47.123691 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123640 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:47:47.123691 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123644 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:47:47.123691 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123647 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:47:47.123691 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123650 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:47:47.123691 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123653 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:47:47.123691 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123656 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:47:47.123691 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123659 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:47:47.123691 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123662 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:47:47.123691 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123665 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:47:47.123691 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123668 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:47:47.123691 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123670 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:47:47.123691 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123673 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:47:47.123691 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123676 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:47:47.123691 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123679 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:47:47.123691 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123689 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:47:47.123691 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123693 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:47:47.123691 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123696 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:47:47.123691 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123699 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:47:47.124165 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123702 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:47:47.124165 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123704 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:47:47.124165 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123707 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:47:47.124165 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123710 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:47:47.124165 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123714 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:47:47.124165 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123716 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:47:47.124165 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123719 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:47:47.124165 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123722 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:47:47.124165 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123725 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:47:47.124165 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123727 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:47:47.124165 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123730 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:47:47.124165 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123732 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:47:47.124165 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123735 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:47:47.124165 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123737 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:47:47.124165 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123740 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:47:47.124165 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123742 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:47:47.124165 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123745 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:47:47.124165 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123747 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:47:47.124165 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123750 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:47:47.124681 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123752 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:47:47.124681 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123756 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:47:47.124681 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123758 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:47:47.124681 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123761 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:47:47.124681 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123763 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:47:47.124681 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123766 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:47:47.124681 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123768 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:47:47.124681 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123770 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:47:47.124681 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123773 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:47:47.124681 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123776 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:47:47.124681 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123784 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:47:47.124681 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123787 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:47:47.124681 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123789 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:47:47.124681 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123792 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:47:47.124681 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123796 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:47:47.124681 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123799 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:47:47.124681 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123801 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:47:47.124681 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123804 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:47:47.124681 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123807 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:47:47.124681 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123809 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:47:47.125157 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123812 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:47:47.125157 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123815 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:47:47.125157 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123817 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:47:47.125157 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123820 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:47:47.125157 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123822 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:47:47.125157 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123825 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:47:47.125157 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123827 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:47:47.125157 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123830 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:47:47.125157 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123833 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:47:47.125157 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123837 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:47:47.125157 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123840 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:47:47.125157 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123843 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:47:47.125157 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123846 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:47:47.125157 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123849 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:47:47.125157 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123852 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:47:47.125157 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123855 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:47:47.125157 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123858 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:47:47.125157 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123860 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:47:47.125157 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123862 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:47:47.125630 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123865 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:47:47.125630 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123868 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:47:47.125630 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123871 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:47:47.125630 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123873 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:47:47.125630 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123876 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:47:47.125630 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.123879 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:47:47.125630 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124266 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:47:47.125630 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124272 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:47:47.125630 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124275 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:47:47.125630 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124278 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:47:47.125630 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124281 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:47:47.125630 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124284 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:47:47.125630 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124287 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:47:47.125630 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124289 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:47:47.125630 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124292 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:47:47.125630 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124294 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:47:47.125630 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124297 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:47:47.125630 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124300 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:47:47.125630 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124302 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:47:47.125630 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124305 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:47:47.126125 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124308 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:47:47.126125 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124310 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:47:47.126125 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124312 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:47:47.126125 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124315 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:47:47.126125 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124318 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:47:47.126125 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124320 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:47:47.126125 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124323 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:47:47.126125 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124326 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:47:47.126125 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124328 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:47:47.126125 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124331 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:47:47.126125 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124334 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:47:47.126125 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124336 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:47:47.126125 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124338 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:47:47.126125 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124341 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:47:47.126125 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124344 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:47:47.126125 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124346 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:47:47.126125 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124354 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:47:47.126125 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124357 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:47:47.126125 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124359 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:47:47.126648 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124363 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:47:47.126648 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124366 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:47:47.126648 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124368 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:47:47.126648 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124371 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:47:47.126648 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124373 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:47:47.126648 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124376 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:47:47.126648 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124378 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:47:47.126648 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124381 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:47:47.126648 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124383 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:47:47.126648 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124386 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:47:47.126648 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124388 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:47:47.126648 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124390 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:47:47.126648 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124393 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:47:47.126648 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124395 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:47:47.126648 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124398 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:47:47.126648 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124400 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:47:47.126648 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124403 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:47:47.126648 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124405 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:47:47.126648 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124411 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:47:47.126648 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124414 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:47:47.127205 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124416 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:47:47.127205 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124419 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:47:47.127205 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124421 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:47:47.127205 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124423 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:47:47.127205 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124426 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:47:47.127205 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124429 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:47:47.127205 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124432 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:47:47.127205 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124434 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:47:47.127205 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124437 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:47:47.127205 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124440 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:47:47.127205 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124443 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:47:47.127205 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124446 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:47:47.127205 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124449 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:47:47.127205 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124452 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:47:47.127205 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124455 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:47:47.127205 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124457 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:47:47.127205 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124462 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:47:47.127205 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124465 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:47:47.127205 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124468 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:47:47.127686 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124471 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:47:47.127686 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124474 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:47:47.127686 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124476 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:47:47.127686 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124479 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:47:47.127686 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124481 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:47:47.127686 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124484 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:47:47.127686 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124486 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:47:47.127686 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124489 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:47:47.127686 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124492 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:47:47.127686 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124494 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:47:47.127686 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124496 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:47:47.127686 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124499 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:47:47.127686 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124501 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:47:47.127686 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.124505 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:47:47.127686 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126077 2568 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 16:47:47.127686 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126087 2568 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 16:47:47.127686 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126093 2568 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 16:47:47.127686 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126097 2568 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 16:47:47.127686 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126101 2568 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 16:47:47.127686 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126104 2568 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 16:47:47.127686 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126109 2568 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 16:47:47.128201 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126114 2568 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 16:47:47.128201 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126117 2568 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 16:47:47.128201 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126120 2568 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 16:47:47.128201 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126125 2568 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 16:47:47.128201 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126128 2568 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 16:47:47.128201 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126131 2568 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 16:47:47.128201 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126135 2568 flags.go:64] FLAG: --cgroup-root=""
Apr 16 16:47:47.128201 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126138 2568 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 16:47:47.128201 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126141 2568 flags.go:64] FLAG: --client-ca-file=""
Apr 16 16:47:47.128201 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126144 2568 flags.go:64] FLAG: --cloud-config=""
Apr 16 16:47:47.128201 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126147 2568 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 16:47:47.128201 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126150 2568 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 16:47:47.128201 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126158 2568 flags.go:64] FLAG: --cluster-domain=""
Apr 16 16:47:47.128201 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126161 2568 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 16:47:47.128201 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126164 2568 flags.go:64] FLAG: --config-dir=""
Apr 16 16:47:47.128201 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126167 2568 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 16:47:47.128201 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126170 2568 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 16:47:47.128201 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126179 2568 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 16:47:47.128201 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126182 2568 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 16 16:47:47.128201 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126185 2568 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 16 16:47:47.128201 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126189 2568 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 16 16:47:47.128201 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126192 2568 flags.go:64] FLAG: --contention-profiling="false" Apr 16 16:47:47.128201 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126195 2568 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 16 16:47:47.128201 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126198 2568 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 16 16:47:47.128797 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126201 2568 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 16 16:47:47.128797 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126204 2568 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 16 16:47:47.128797 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126209 2568 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 16 16:47:47.128797 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126211 2568 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 16 16:47:47.128797 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126214 2568 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 16 16:47:47.128797 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126217 2568 flags.go:64] FLAG: --enable-load-reader="false" Apr 16 16:47:47.128797 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126220 2568 flags.go:64] FLAG: --enable-server="true" Apr 16 16:47:47.128797 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126223 2568 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 16 16:47:47.128797 
ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126230 2568 flags.go:64] FLAG: --event-burst="100" Apr 16 16:47:47.128797 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126234 2568 flags.go:64] FLAG: --event-qps="50" Apr 16 16:47:47.128797 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126236 2568 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 16 16:47:47.128797 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126240 2568 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 16 16:47:47.128797 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126249 2568 flags.go:64] FLAG: --eviction-hard="" Apr 16 16:47:47.128797 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126253 2568 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 16 16:47:47.128797 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126256 2568 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 16 16:47:47.128797 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126259 2568 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 16 16:47:47.128797 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126262 2568 flags.go:64] FLAG: --eviction-soft="" Apr 16 16:47:47.128797 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126265 2568 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 16:47:47.128797 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126268 2568 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 16:47:47.128797 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126271 2568 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 16:47:47.128797 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126273 2568 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 16:47:47.128797 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126276 2568 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 16:47:47.128797 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126279 2568 flags.go:64] 
FLAG: --fail-swap-on="true" Apr 16 16:47:47.128797 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126282 2568 flags.go:64] FLAG: --feature-gates="" Apr 16 16:47:47.128797 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126286 2568 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 16:47:47.129408 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126289 2568 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 16:47:47.129408 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126292 2568 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 16:47:47.129408 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126295 2568 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 16:47:47.129408 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126298 2568 flags.go:64] FLAG: --healthz-port="10248" Apr 16 16:47:47.129408 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126301 2568 flags.go:64] FLAG: --help="false" Apr 16 16:47:47.129408 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126304 2568 flags.go:64] FLAG: --hostname-override="ip-10-0-128-130.ec2.internal" Apr 16 16:47:47.129408 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126307 2568 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 16:47:47.129408 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126310 2568 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 16:47:47.129408 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126313 2568 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 16:47:47.129408 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126316 2568 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 16:47:47.129408 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126320 2568 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 16:47:47.129408 ip-10-0-128-130 kubenswrapper[2568]: 
I0416 16:47:47.126323 2568 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 16:47:47.129408 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126326 2568 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 16:47:47.129408 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126329 2568 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 16:47:47.129408 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126332 2568 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 16:47:47.129408 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126335 2568 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 16:47:47.129408 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126338 2568 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 16:47:47.129408 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126341 2568 flags.go:64] FLAG: --kube-reserved="" Apr 16 16:47:47.129408 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126344 2568 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 16:47:47.129408 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126347 2568 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 16:47:47.129408 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126356 2568 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 16:47:47.129408 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126359 2568 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 16:47:47.129408 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126362 2568 flags.go:64] FLAG: --lock-file="" Apr 16 16:47:47.129408 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126364 2568 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 16:47:47.130037 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126367 2568 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 16:47:47.130037 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126370 2568 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 
16:47:47.130037 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126376 2568 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 16:47:47.130037 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126379 2568 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 16:47:47.130037 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126382 2568 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 16:47:47.130037 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126385 2568 flags.go:64] FLAG: --logging-format="text" Apr 16 16:47:47.130037 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126388 2568 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 16:47:47.130037 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126391 2568 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 16:47:47.130037 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126394 2568 flags.go:64] FLAG: --manifest-url="" Apr 16 16:47:47.130037 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126397 2568 flags.go:64] FLAG: --manifest-url-header="" Apr 16 16:47:47.130037 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126401 2568 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 16:47:47.130037 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126404 2568 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 16:47:47.130037 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126408 2568 flags.go:64] FLAG: --max-pods="110" Apr 16 16:47:47.130037 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126411 2568 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 16:47:47.130037 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126414 2568 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 16:47:47.130037 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126417 2568 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 16:47:47.130037 ip-10-0-128-130 kubenswrapper[2568]: I0416 
16:47:47.126420 2568 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 16:47:47.130037 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126423 2568 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 16:47:47.130037 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126426 2568 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 16:47:47.130037 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126429 2568 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 16:47:47.130037 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126435 2568 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 16:47:47.130037 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126439 2568 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 16:47:47.130037 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126442 2568 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 16:47:47.130037 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126445 2568 flags.go:64] FLAG: --pod-cidr="" Apr 16 16:47:47.130618 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126448 2568 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 16:47:47.130618 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126453 2568 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 16:47:47.130618 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126456 2568 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 16:47:47.130618 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126459 2568 flags.go:64] FLAG: --pods-per-core="0" Apr 16 16:47:47.130618 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126462 2568 flags.go:64] FLAG: --port="10250" Apr 16 16:47:47.130618 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126471 2568 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 16:47:47.130618 ip-10-0-128-130 
kubenswrapper[2568]: I0416 16:47:47.126474 2568 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-04c37a99379e4c67d" Apr 16 16:47:47.130618 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126477 2568 flags.go:64] FLAG: --qos-reserved="" Apr 16 16:47:47.130618 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126480 2568 flags.go:64] FLAG: --read-only-port="10255" Apr 16 16:47:47.130618 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126483 2568 flags.go:64] FLAG: --register-node="true" Apr 16 16:47:47.130618 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126485 2568 flags.go:64] FLAG: --register-schedulable="true" Apr 16 16:47:47.130618 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126488 2568 flags.go:64] FLAG: --register-with-taints="" Apr 16 16:47:47.130618 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126492 2568 flags.go:64] FLAG: --registry-burst="10" Apr 16 16:47:47.130618 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126494 2568 flags.go:64] FLAG: --registry-qps="5" Apr 16 16:47:47.130618 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126497 2568 flags.go:64] FLAG: --reserved-cpus="" Apr 16 16:47:47.130618 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126500 2568 flags.go:64] FLAG: --reserved-memory="" Apr 16 16:47:47.130618 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126503 2568 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 16:47:47.130618 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126507 2568 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 16:47:47.130618 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126509 2568 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 16:47:47.130618 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126512 2568 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 16:47:47.130618 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126515 2568 flags.go:64] FLAG: --runonce="false" Apr 16 16:47:47.130618 ip-10-0-128-130 
kubenswrapper[2568]: I0416 16:47:47.126518 2568 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 16:47:47.130618 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126521 2568 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 16:47:47.130618 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126524 2568 flags.go:64] FLAG: --seccomp-default="false" Apr 16 16:47:47.130618 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126526 2568 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 16:47:47.131214 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126529 2568 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 16:47:47.131214 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126532 2568 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 16:47:47.131214 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126535 2568 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 16:47:47.131214 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126538 2568 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 16:47:47.131214 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126541 2568 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 16:47:47.131214 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126544 2568 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 16:47:47.131214 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126547 2568 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 16:47:47.131214 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126550 2568 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 16:47:47.131214 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126553 2568 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 16:47:47.131214 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126556 2568 flags.go:64] FLAG: --system-cgroups="" Apr 16 16:47:47.131214 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126559 2568 
flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 16:47:47.131214 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126565 2568 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 16:47:47.131214 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126568 2568 flags.go:64] FLAG: --tls-cert-file="" Apr 16 16:47:47.131214 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126576 2568 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 16:47:47.131214 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126581 2568 flags.go:64] FLAG: --tls-min-version="" Apr 16 16:47:47.131214 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126584 2568 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 16:47:47.131214 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126587 2568 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 16:47:47.131214 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126590 2568 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 16:47:47.131214 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126593 2568 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 16:47:47.131214 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126608 2568 flags.go:64] FLAG: --v="2" Apr 16 16:47:47.131214 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126614 2568 flags.go:64] FLAG: --version="false" Apr 16 16:47:47.131214 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126626 2568 flags.go:64] FLAG: --vmodule="" Apr 16 16:47:47.131214 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126630 2568 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 16:47:47.131214 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126634 2568 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 16:47:47.131872 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126727 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. 
It will be removed in a future release. Apr 16 16:47:47.131872 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126732 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 16:47:47.131872 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126735 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:47:47.131872 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126738 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:47:47.131872 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126741 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:47:47.131872 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126744 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:47:47.131872 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126747 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:47:47.131872 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126750 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:47:47.131872 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126752 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:47:47.131872 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126755 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:47:47.131872 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126758 2568 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:47:47.131872 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126761 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:47:47.131872 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126763 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:47:47.131872 ip-10-0-128-130 kubenswrapper[2568]: 
W0416 16:47:47.126766 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:47:47.131872 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126769 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:47:47.131872 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126771 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:47:47.131872 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126774 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:47:47.131872 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126777 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 16:47:47.131872 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126780 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 16:47:47.132441 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126783 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:47:47.132441 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126785 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:47:47.132441 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126788 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:47:47.132441 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126790 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:47:47.132441 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126793 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:47:47.132441 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126796 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:47:47.132441 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126798 2568 feature_gate.go:328] unrecognized feature gate: 
HighlyAvailableArbiter Apr 16 16:47:47.132441 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126801 2568 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:47:47.132441 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126803 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:47:47.132441 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126806 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 16:47:47.132441 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126810 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 16:47:47.132441 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126814 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:47:47.132441 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126817 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:47:47.132441 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126819 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:47:47.132441 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126822 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:47:47.132441 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126825 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:47:47.132441 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126828 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:47:47.132441 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126830 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:47:47.132441 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126832 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:47:47.132441 ip-10-0-128-130 
kubenswrapper[2568]: W0416 16:47:47.126835 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:47:47.133283 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126837 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 16:47:47.133283 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126840 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:47:47.133283 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126842 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 16:47:47.133283 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126845 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:47:47.133283 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126848 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 16:47:47.133283 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126850 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:47:47.133283 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126853 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:47:47.133283 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126855 2568 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:47:47.133283 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126858 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:47:47.133283 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126861 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 16:47:47.133283 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126863 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:47:47.133283 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126866 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 
16:47:47.133283 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126868 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:47:47.133283 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126871 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:47:47.133283 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126873 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 16:47:47.133283 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126876 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:47:47.133283 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126879 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:47:47.133283 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126882 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:47:47.133283 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126885 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:47:47.133283 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126887 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:47:47.133890 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126889 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:47:47.133890 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126892 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:47:47.133890 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126895 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:47:47.133890 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126897 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:47:47.133890 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126900 2568 feature_gate.go:328] unrecognized feature gate: 
BootcNodeManagement Apr 16 16:47:47.133890 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126902 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 16:47:47.133890 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126905 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:47:47.133890 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126908 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:47:47.133890 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126910 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:47:47.133890 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126913 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:47:47.133890 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126915 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:47:47.133890 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126917 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:47:47.133890 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126920 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:47:47.133890 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126922 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:47:47.133890 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126925 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:47:47.133890 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126927 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:47:47.133890 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126930 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:47:47.133890 
ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126932 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:47:47.133890 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126938 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:47:47.134399 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126941 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:47:47.134399 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126943 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:47:47.134399 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126946 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:47:47.134399 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126950 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:47:47.134399 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126953 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:47:47.134399 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126956 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:47:47.134399 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126958 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:47:47.134399 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.126961 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:47:47.134399 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.126968 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:47:47.134698 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.134680 2568 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 16:47:47.134728 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.134699 2568 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 16:47:47.134761 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134748 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:47:47.134761 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134753 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:47:47.134761 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134757 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:47:47.134761 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134760 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:47:47.134761 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134763 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:47:47.134894 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134766 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:47:47.134894 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134769 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:47:47.134894 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134771 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:47:47.134894 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134774 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:47:47.134894 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134777 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:47:47.134894 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134780 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:47:47.134894 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134782 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:47:47.134894 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134785 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:47:47.134894 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134787 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:47:47.134894 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134790 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:47:47.134894 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134792 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:47:47.134894 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134796 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:47:47.134894 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134798 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:47:47.134894 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134801 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:47:47.134894 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134804 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:47:47.134894 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134806 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:47:47.134894 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134809 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:47:47.134894 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134812 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:47:47.134894 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134815 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:47:47.134894 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134817 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:47:47.135381 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134820 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:47:47.135381 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134822 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:47:47.135381 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134825 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:47:47.135381 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134828 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:47:47.135381 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134830 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:47:47.135381 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134833 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:47:47.135381 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134835 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:47:47.135381 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134838 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:47:47.135381 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134841 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:47:47.135381 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134843 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:47:47.135381 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134846 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:47:47.135381 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134848 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:47:47.135381 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134851 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:47:47.135381 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134854 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:47:47.135381 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134857 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:47:47.135381 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134860 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:47:47.135381 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134864 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:47:47.135381 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134868 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:47:47.135381 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134871 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:47:47.135933 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134874 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:47:47.135933 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134877 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:47:47.135933 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134880 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:47:47.135933 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134883 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:47:47.135933 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134886 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:47:47.135933 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134888 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:47:47.135933 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134891 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:47:47.135933 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134893 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:47:47.135933 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134896 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:47:47.135933 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134899 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:47:47.135933 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134902 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:47:47.135933 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134904 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:47:47.135933 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134907 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:47:47.135933 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134909 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:47:47.135933 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134912 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:47:47.135933 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134914 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:47:47.135933 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134917 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:47:47.135933 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134920 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:47:47.135933 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134922 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:47:47.135933 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134925 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:47:47.136411 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134928 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:47:47.136411 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134930 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:47:47.136411 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134934 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:47:47.136411 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134938 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:47:47.136411 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134941 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:47:47.136411 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134943 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:47:47.136411 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134947 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:47:47.136411 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134949 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:47:47.136411 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134952 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:47:47.136411 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134955 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:47:47.136411 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134957 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:47:47.136411 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134960 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:47:47.136411 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134963 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:47:47.136411 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134965 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:47:47.136411 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134968 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:47:47.136411 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134970 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:47:47.136411 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134973 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:47:47.136411 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134975 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:47:47.136411 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134978 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:47:47.136905 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134980 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:47:47.136905 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134983 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:47:47.136905 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.134985 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:47:47.136905 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.134991 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:47:47.136905 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135085 2568 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:47:47.136905 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135090 2568 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:47:47.136905 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135093 2568 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:47:47.136905 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135095 2568 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:47:47.136905 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135098 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:47:47.136905 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135101 2568 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:47:47.136905 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135104 2568 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:47:47.136905 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135107 2568 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:47:47.136905 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135109 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:47:47.136905 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135112 2568 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:47:47.136905 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135114 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:47:47.136905 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135117 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:47:47.137295 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135120 2568 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:47:47.137295 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135123 2568 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:47:47.137295 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135125 2568 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:47:47.137295 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135129 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:47:47.137295 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135132 2568 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:47:47.137295 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135134 2568 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:47:47.137295 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135137 2568 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:47:47.137295 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135139 2568 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:47:47.137295 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135142 2568 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:47:47.137295 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135144 2568 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:47:47.137295 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135148 2568 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:47:47.137295 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135151 2568 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:47:47.137295 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135153 2568 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:47:47.137295 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135157 2568 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:47:47.137295 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135160 2568 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:47:47.137295 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135163 2568 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:47:47.137295 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135166 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:47:47.137295 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135168 2568 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:47:47.137295 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135171 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:47:47.137756 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135173 2568 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:47:47.137756 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135176 2568 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:47:47.137756 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135178 2568 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:47:47.137756 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135181 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:47:47.137756 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135184 2568 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:47:47.137756 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135186 2568 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:47:47.137756 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135189 2568 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:47:47.137756 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135191 2568 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:47:47.137756 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135194 2568 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:47:47.137756 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135196 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:47:47.137756 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135199 2568 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:47:47.137756 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135201 2568 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:47:47.137756 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135204 2568 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:47:47.137756 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135206 2568 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:47:47.137756 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135209 2568 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:47:47.137756 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135211 2568 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:47:47.137756 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135215 2568 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:47:47.137756 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135217 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:47:47.137756 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135220 2568 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:47:47.138226 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135224 2568 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:47:47.138226 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135228 2568 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:47:47.138226 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135231 2568 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:47:47.138226 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135234 2568 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:47:47.138226 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135236 2568 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:47:47.138226 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135239 2568 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:47:47.138226 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135242 2568 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:47:47.138226 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135245 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:47:47.138226 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135247 2568 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:47:47.138226 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135250 2568 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:47:47.138226 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135252 2568 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:47:47.138226 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135255 2568 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:47:47.138226 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135257 2568 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:47:47.138226 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135260 2568 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:47:47.138226 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135262 2568 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:47:47.138226 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135265 2568 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:47:47.138226 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135267 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:47:47.138226 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135270 2568 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:47:47.138226 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135272 2568 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:47:47.138226 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135275 2568 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:47:47.138726 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135277 2568 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:47:47.138726 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135280 2568 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:47:47.138726 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135282 2568 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:47:47.138726 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135285 2568 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:47:47.138726 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135288 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:47:47.138726 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135290 2568 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:47:47.138726 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135292 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:47:47.138726 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135295 2568 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:47:47.138726 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135297 2568 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:47:47.138726 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135300 2568 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:47:47.138726 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135303 2568 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:47:47.138726 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135305 2568 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:47:47.138726 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135308 2568 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:47:47.138726 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135310 2568 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:47:47.138726 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135313 2568 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:47:47.138726 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:47.135315 2568 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:47:47.139116 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.135320 2568 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:47:47.139116 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.136409 2568 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 16:47:47.140447 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.140433 2568 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 16:47:47.141439 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.141428 2568 server.go:1019] "Starting client certificate rotation"
Apr 16 16:47:47.141537 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.141521 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 16:47:47.141572 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.141563 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 16:47:47.167748 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.167730 2568 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 16:47:47.172216 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.172201 2568 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 16:47:47.189186 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.189167 2568 log.go:25] "Validated CRI v1 runtime API"
Apr 16 16:47:47.194697 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.194680 2568 log.go:25] "Validated CRI v1 image API"
Apr 16 16:47:47.194809 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.194793 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 16:47:47.195997 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.195977 2568 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 16:47:47.198099 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.198081 2568 fs.go:135] Filesystem UUIDs: map[4c71f770-2357-47f4-b50a-9a0d2eb340f6:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 e8576557-b745-45c9-b645-c4e33e7f89d7:/dev/nvme0n1p4]
Apr 16 16:47:47.198158 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.198099 2568 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 16:47:47.204550 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.204443 2568 manager.go:217] Machine: {Timestamp:2026-04-16 16:47:47.202500494 +0000 UTC m=+0.411272796 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3102007 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec22cd004db6e7891d0e3a9013557240 SystemUUID:ec22cd00-4db6-e789-1d0e-3a9013557240 BootID:f5db7ab7-b6c0-460c-90bb-7c7bc4fbb0dd Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:a9:eb:99:c4:1b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:a9:eb:99:c4:1b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:fe:02:ed:42:f2:13 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 16:47:47.204550 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.204537 2568 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 16:47:47.204684 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.204622 2568 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 16:47:47.206843 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.206822 2568 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 16:47:47.206964 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.206846 2568 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-128-130.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 16:47:47.207010 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.206973 2568 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 16:47:47.207010 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.206982 2568 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 16:47:47.207010 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.206994 2568 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 16:47:47.207785 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.207774 2568 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 16 16:47:47.208533 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.208523 2568 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 16:47:47.208658 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.208649 2568 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 16 16:47:47.210928 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.210919 2568 kubelet.go:491] "Attempting to sync node with API server"
Apr 16 16:47:47.210959 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.210932 2568 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 16 16:47:47.210959 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.210943 2568 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 16 16:47:47.210959 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.210951 2568 kubelet.go:397] "Adding apiserver pod source"
Apr 16 16:47:47.210959 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.210959 2568 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 16 16:47:47.211964 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.211953 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 16:47:47.212015 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.211979 2568 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 16 16:47:47.215095 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.215079 2568 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 16 16:47:47.216655 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.216643 2568 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 16 16:47:47.217877 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.217863 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 16 16:47:47.217919 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.217881 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 16 16:47:47.217919 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.217888 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 16 16:47:47.217919 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.217896 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 16 16:47:47.217919 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.217905 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 16 16:47:47.217919 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.217913 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 16 16:47:47.217919 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.217919 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 16 16:47:47.218094 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.217924 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 16 16:47:47.218094 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.217931 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 16 16:47:47.218094 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.217937 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 16 16:47:47.218094 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.217957 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 16 16:47:47.218094 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.217977 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 16 16:47:47.218800 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.218781 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 16 16:47:47.218800 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.218802 2568 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 16 16:47:47.222290 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.222277 2568 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 16 16:47:47.222348 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.222317 2568 server.go:1295] "Started kubelet"
Apr 16 16:47:47.222970 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.222790 2568 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 16 16:47:47.222970 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.222881 2568 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 16 16:47:47.222970 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.222417 2568 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 16 16:47:47.223137 ip-10-0-128-130 systemd[1]: Started Kubernetes Kubelet.
Apr 16 16:47:47.223448 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:47.223426 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 16 16:47:47.223534 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:47.223442 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-128-130.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 16 16:47:47.223534 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.223506 2568 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-128-130.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 16 16:47:47.224048 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.223945 2568 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 16 16:47:47.225621 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.225589 2568 server.go:317] "Adding debug handlers to kubelet server"
Apr 16 16:47:47.228722 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.228701 2568 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 16 16:47:47.229222 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.229208 2568 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 16 16:47:47.229763 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:47.229742 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-130.ec2.internal\" not found"
Apr 16 16:47:47.230682 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.230659 2568 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 16 16:47:47.230682 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.230667 2568 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 16 16:47:47.230818 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.230693 2568 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 16 16:47:47.230818 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.230759 2568 reconstruct.go:97] "Volume reconstruction finished"
Apr 16 16:47:47.230818 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.230779 2568 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 16:47:47.231049 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:47.229759 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-130.ec2.internal.18a6e44026af1bc9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-130.ec2.internal,UID:ip-10-0-128-130.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-128-130.ec2.internal,},FirstTimestamp:2026-04-16 16:47:47.222289353 +0000 UTC m=+0.431061655,LastTimestamp:2026-04-16 16:47:47.222289353 +0000 UTC m=+0.431061655,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-130.ec2.internal,}"
Apr 16 16:47:47.231441 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.231419 2568 factory.go:55] Registering systemd factory
Apr 16 16:47:47.231583 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.231492 2568 factory.go:223] Registration of the systemd container factory successfully
Apr 16 16:47:47.231750 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.231721 2568 factory.go:153] Registering CRI-O factory
Apr 16 16:47:47.231823 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.231760 2568 factory.go:223] Registration of the crio container factory successfully
Apr 16 16:47:47.231823 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.231819 2568 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 16:47:47.231937 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.231843 2568 factory.go:103] Registering Raw factory
Apr 16 16:47:47.231937 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.231857 2568 manager.go:1196] Started watching for new ooms in manager
Apr 16 16:47:47.232356 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.232247 2568 manager.go:319] Starting recovery of all containers
Apr 16 16:47:47.237852 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:47.237823 2568 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-128-130.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 16 16:47:47.237951 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:47.237854 2568 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 16:47:47.244449 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.244433 2568 manager.go:324] Recovery completed
Apr 16 16:47:47.245070 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.245051 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-fccz2"
Apr 16 16:47:47.248264 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.248253 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:47:47.250304 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.250290 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-130.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:47:47.250382 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.250316 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-130.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:47:47.250382 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.250331 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-130.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:47:47.250514 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.250499 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-fccz2"
Apr 16 16:47:47.250828 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.250809 2568 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 16:47:47.250828 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.250827 2568 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 16:47:47.250932 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.250843 2568 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 16:47:47.252295 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:47.252234 2568 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-128-130.ec2.internal.18a6e440285a8ed1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-128-130.ec2.internal,UID:ip-10-0-128-130.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-128-130.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-128-130.ec2.internal,},FirstTimestamp:2026-04-16 16:47:47.250302673 +0000 UTC m=+0.459074975,LastTimestamp:2026-04-16 16:47:47.250302673 +0000 UTC m=+0.459074975,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-128-130.ec2.internal,}"
Apr 16 16:47:47.253559 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.253547 2568 policy_none.go:49] "None policy: Start"
Apr 16 16:47:47.253649 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.253564 2568 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 16:47:47.253649 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.253576 2568 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 16:47:47.297528 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.294124 2568 manager.go:341] "Starting Device Plugin manager"
Apr 16 16:47:47.297528 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:47.294276 2568 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 16:47:47.297528 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.294290 2568 server.go:85] "Starting device plugin registration server"
Apr 16 16:47:47.297528 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.294481 2568 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 16:47:47.297528 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.294492 2568 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 16:47:47.297528 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.294615 2568 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 16:47:47.297528 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.294715 2568 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 16:47:47.297528 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.294725 2568 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 16:47:47.297528 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:47.295195 2568 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 16:47:47.297528 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:47.295238 2568 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-128-130.ec2.internal\" not found"
Apr 16 16:47:47.351578 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.351552 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 16:47:47.352645 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.352628 2568 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 16:47:47.352747 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.352651 2568 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 16:47:47.352747 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.352666 2568 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 16:47:47.352747 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.352674 2568 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 16:47:47.352747 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:47.352701 2568 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 16:47:47.355157 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.355140 2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:47:47.395262 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.395226 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:47:47.395999 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.395985 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-130.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:47:47.396058 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.396012 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-130.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:47:47.396058 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.396023 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-130.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:47:47.396058 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.396051 2568 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-128-130.ec2.internal"
Apr 16 16:47:47.403759 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.403745 2568 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-128-130.ec2.internal"
Apr 16 16:47:47.403830 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:47.403764 2568 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-128-130.ec2.internal\": node \"ip-10-0-128-130.ec2.internal\" not found"
Apr 16 16:47:47.419131 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:47.419114 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-130.ec2.internal\" not found"
Apr 16 16:47:47.452774 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.452740 2568 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-128-130.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-130.ec2.internal"]
Apr 16 16:47:47.452852 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.452805 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:47:47.454533 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.454519 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-130.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:47:47.454611 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.454542 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-130.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:47:47.454611 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.454552 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-130.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:47:47.455868 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.455857 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:47:47.456012 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.456001 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-130.ec2.internal"
Apr 16 16:47:47.456052 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.456026 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:47:47.456494 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.456478 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-130.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:47:47.456494 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.456492 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-130.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:47:47.456657 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.456502 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-130.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:47:47.456657 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.456514 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-130.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:47:47.456657 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.456529 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-130.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:47:47.456657 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.456517 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-130.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:47:47.458059 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.458046 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-130.ec2.internal"
Apr 16 16:47:47.458123 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.458070 2568 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 16:47:47.458814 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.458800 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-130.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 16:47:47.458896 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.458826 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-130.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 16:47:47.458896 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.458838 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-130.ec2.internal" event="NodeHasSufficientPID"
Apr 16 16:47:47.483112 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:47.483095 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-130.ec2.internal\" not found" node="ip-10-0-128-130.ec2.internal"
Apr 16 16:47:47.487360 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:47.487347 2568 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-128-130.ec2.internal\" not found" node="ip-10-0-128-130.ec2.internal"
Apr 16 16:47:47.519941 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:47.519923 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-130.ec2.internal\" not found"
Apr 16 16:47:47.533526 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.533504 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e3a1a534b3b465bad97fd00e274467c0-config\") pod \"kube-apiserver-proxy-ip-10-0-128-130.ec2.internal\" (UID: \"e3a1a534b3b465bad97fd00e274467c0\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-130.ec2.internal"
Apr 16 16:47:47.533630 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.533535 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6eb09bba6fec8382bf35d52e8c351917-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-130.ec2.internal\" (UID: \"6eb09bba6fec8382bf35d52e8c351917\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-130.ec2.internal"
Apr 16 16:47:47.533630 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.533554 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6eb09bba6fec8382bf35d52e8c351917-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-130.ec2.internal\" (UID: \"6eb09bba6fec8382bf35d52e8c351917\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-130.ec2.internal"
Apr 16 16:47:47.620052 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:47.620035 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-130.ec2.internal\" not found"
Apr 16 16:47:47.633979 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.633954 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6eb09bba6fec8382bf35d52e8c351917-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-130.ec2.internal\" (UID: \"6eb09bba6fec8382bf35d52e8c351917\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-130.ec2.internal"
Apr 16 16:47:47.634048 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.633986 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6eb09bba6fec8382bf35d52e8c351917-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-130.ec2.internal\" (UID: \"6eb09bba6fec8382bf35d52e8c351917\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-130.ec2.internal"
Apr 16 16:47:47.634048 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.634002 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e3a1a534b3b465bad97fd00e274467c0-config\") pod \"kube-apiserver-proxy-ip-10-0-128-130.ec2.internal\" (UID: \"e3a1a534b3b465bad97fd00e274467c0\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-130.ec2.internal"
Apr 16 16:47:47.634048 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.634028 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/e3a1a534b3b465bad97fd00e274467c0-config\") pod \"kube-apiserver-proxy-ip-10-0-128-130.ec2.internal\" (UID: \"e3a1a534b3b465bad97fd00e274467c0\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-128-130.ec2.internal"
Apr 16 16:47:47.634048 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.634039 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/6eb09bba6fec8382bf35d52e8c351917-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-128-130.ec2.internal\" (UID: \"6eb09bba6fec8382bf35d52e8c351917\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-130.ec2.internal"
Apr 16 16:47:47.634186 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.634064 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6eb09bba6fec8382bf35d52e8c351917-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-128-130.ec2.internal\" (UID: \"6eb09bba6fec8382bf35d52e8c351917\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-130.ec2.internal"
Apr 16 16:47:47.720355 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:47.720307 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-130.ec2.internal\" not found"
Apr 16 16:47:47.786827 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.786813 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-130.ec2.internal"
Apr 16 16:47:47.790405 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:47.790388 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-130.ec2.internal"
Apr 16 16:47:47.820986 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:47.820959 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-130.ec2.internal\" not found"
Apr 16 16:47:47.921500 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:47.921478 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-130.ec2.internal\" not found"
Apr 16 16:47:48.022080 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:48.022034 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-130.ec2.internal\" not found"
Apr 16 16:47:48.122573 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:48.122555 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-130.ec2.internal\" not found"
Apr 16 16:47:48.142082 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:48.142063 2568 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 16:47:48.142190 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:48.142171 2568 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 16:47:48.222655 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:48.222620 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-130.ec2.internal\" not found"
Apr 16 16:47:48.228864 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:48.228843 2568 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 16:47:48.239807 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:48.239789 2568 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 16:47:48.252268 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:48.252245 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 16:42:47 +0000 UTC" deadline="2027-11-06 17:15:08.573787162 +0000 UTC"
Apr 16 16:47:48.252268 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:48.252267 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13656h27m20.321522686s"
Apr 16 16:47:48.263939 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:48.263921 2568 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-4z5f7"
Apr 16 16:47:48.269777 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:48.269759 2568 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-4z5f7"
Apr 16 16:47:48.276243 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:48.276199 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6eb09bba6fec8382bf35d52e8c351917.slice/crio-8a1c591ee5e4b7d5203013e161fc735f39cb9ed085c1138c1dd2e110cce9efea WatchSource:0}: Error finding container 8a1c591ee5e4b7d5203013e161fc735f39cb9ed085c1138c1dd2e110cce9efea: Status 404 returned error can't find the container with id 8a1c591ee5e4b7d5203013e161fc735f39cb9ed085c1138c1dd2e110cce9efea
Apr 16 16:47:48.276727 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:48.276710 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3a1a534b3b465bad97fd00e274467c0.slice/crio-50b54ed311bd888b730bbf26e249f3263bea21492e0a4b4d5b77eb1a89793c64 WatchSource:0}: Error finding container 50b54ed311bd888b730bbf26e249f3263bea21492e0a4b4d5b77eb1a89793c64: Status 404 returned error can't find the container with id 50b54ed311bd888b730bbf26e249f3263bea21492e0a4b4d5b77eb1a89793c64
Apr 16 16:47:48.280839 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:48.280826 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 16:47:48.323378 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:48.323354 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-130.ec2.internal\" not found"
Apr 16 16:47:48.355687 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:48.355649 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-130.ec2.internal" event={"ID":"6eb09bba6fec8382bf35d52e8c351917","Type":"ContainerStarted","Data":"8a1c591ee5e4b7d5203013e161fc735f39cb9ed085c1138c1dd2e110cce9efea"}
Apr 16 16:47:48.356534 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:48.356515 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-130.ec2.internal"
event={"ID":"e3a1a534b3b465bad97fd00e274467c0","Type":"ContainerStarted","Data":"50b54ed311bd888b730bbf26e249f3263bea21492e0a4b4d5b77eb1a89793c64"} Apr 16 16:47:48.392644 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:48.392624 2568 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:47:48.424287 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:48.424269 2568 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-128-130.ec2.internal\" not found" Apr 16 16:47:48.496201 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:48.496180 2568 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:47:48.530045 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:48.530003 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-130.ec2.internal" Apr 16 16:47:48.533821 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:48.533807 2568 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:47:48.537845 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:48.537831 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 16:47:48.539033 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:48.539021 2568 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-130.ec2.internal" Apr 16 16:47:48.548087 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:48.548071 2568 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 16:47:49.099700 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.099673 
2568 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:47:49.212047 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.212018 2568 apiserver.go:52] "Watching apiserver" Apr 16 16:47:49.218198 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.218175 2568 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 16:47:49.218578 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.218557 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-kznnf","openshift-multus/network-metrics-daemon-gd4q4","openshift-network-operator/iptables-alerter-hsvxm","kube-system/kube-apiserver-proxy-ip-10-0-128-130.ec2.internal","openshift-cluster-node-tuning-operator/tuned-fl64p","openshift-image-registry/node-ca-8slrg","openshift-multus/multus-additional-cni-plugins-hjnws","openshift-network-diagnostics/network-check-target-bwvhw","openshift-ovn-kubernetes/ovnkube-node-5pfjs","kube-system/konnectivity-agent-srdnj","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96","openshift-dns/node-resolver-v87xm","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-130.ec2.internal"] Apr 16 16:47:49.221996 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.221973 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-srdnj" Apr 16 16:47:49.223433 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.223404 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gd4q4" Apr 16 16:47:49.223545 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:49.223491 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gd4q4" podUID="545dd230-1d90-4e1a-8615-072dd9b2d2f5" Apr 16 16:47:49.224307 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.224284 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 16:47:49.224402 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.224374 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-88mzj\"" Apr 16 16:47:49.224466 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.224381 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 16:47:49.224671 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.224631 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-hsvxm" Apr 16 16:47:49.224760 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.224699 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-fl64p" Apr 16 16:47:49.225880 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.225860 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8slrg" Apr 16 16:47:49.227269 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.227240 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hjnws" Apr 16 16:47:49.227991 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.227952 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 16:47:49.228089 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.228044 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:47:49.228089 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.228072 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:47:49.228274 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.228254 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 16:47:49.228376 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.228302 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 16:47:49.228376 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.228305 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 16:47:49.228538 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.228519 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bwvhw" Apr 16 16:47:49.228893 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:49.228582 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bwvhw" podUID="02af3612-a84f-46c2-81c2-fe094b0b75f8" Apr 16 16:47:49.228893 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.228686 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-jx86s\"" Apr 16 16:47:49.228893 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.228723 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 16:47:49.228893 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.228871 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 16:47:49.229097 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.228897 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-b4xpn\"" Apr 16 16:47:49.229097 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.228933 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-fjtgd\"" Apr 16 16:47:49.229404 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.229387 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 16:47:49.229493 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.229442 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 16:47:49.229789 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.229496 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 16:47:49.230659 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.230636 2568 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.230866 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.230729 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 16:47:49.232362 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.231082 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-f8b9c\"" Apr 16 16:47:49.232460 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.232443 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 16:47:49.233893 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.233516 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 16:47:49.233893 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.233542 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 16:47:49.234307 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.234287 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 16:47:49.234407 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.234361 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.234492 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.234475 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-6nfnn\"" Apr 16 16:47:49.234588 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.234566 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 16:47:49.235502 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.235485 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96" Apr 16 16:47:49.235866 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.235845 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 16:47:49.235977 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.235965 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 16:47:49.236486 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.236465 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-657sj\"" Apr 16 16:47:49.236574 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.236470 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 16:47:49.236794 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.236778 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-v87xm" Apr 16 16:47:49.237358 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.237338 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 16:47:49.238185 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.237830 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 16:47:49.238185 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.237921 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 16:47:49.238185 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.238148 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-4shpx\"" Apr 16 16:47:49.239180 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.238914 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 16:47:49.239180 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.239008 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 16:47:49.239372 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.239354 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-l4c4x\"" Apr 16 16:47:49.242960 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.242939 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-node-log\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.243062 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.242971 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-cnibin\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.243062 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.242990 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-host-kubelet\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.243062 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243009 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-host-var-lib-cni-bin\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.243062 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243033 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-etc-systemd\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p" Apr 16 16:47:49.243062 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243056 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-run-ovn\") pod \"ovnkube-node-5pfjs\" (UID: 
\"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.243337 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243093 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d7c70c5-3d64-495a-a048-265fbd988013-ovnkube-config\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.243337 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243123 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g4wjj\" (UniqueName: \"kubernetes.io/projected/da105279-c3bd-4e13-9bfc-0331c0b3ebd0-kube-api-access-g4wjj\") pod \"multus-additional-cni-plugins-hjnws\" (UID: \"da105279-c3bd-4e13-9bfc-0331c0b3ebd0\") " pod="openshift-multus/multus-additional-cni-plugins-hjnws" Apr 16 16:47:49.243337 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243149 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-host-cni-netd\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.243337 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243173 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bd61ad6d-349b-4f2f-ba4c-bbe9aaf1fb28-host-slash\") pod \"iptables-alerter-hsvxm\" (UID: \"bd61ad6d-349b-4f2f-ba4c-bbe9aaf1fb28\") " pod="openshift-network-operator/iptables-alerter-hsvxm" Apr 16 16:47:49.243337 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243197 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6f46c58f-ae64-4611-b6f5-37bccf98d4af-cni-binary-copy\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.243337 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243219 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7fcf2d50-decf-4050-b3bc-a82043f228fe-tmp\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p" Apr 16 16:47:49.243337 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243240 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5d59ae73-e194-4f66-9f72-a091634a4c01-serviceca\") pod \"node-ca-8slrg\" (UID: \"5d59ae73-e194-4f66-9f72-a091634a4c01\") " pod="openshift-image-registry/node-ca-8slrg" Apr 16 16:47:49.243337 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243261 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-host-slash\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.243337 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243286 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.243337 ip-10-0-128-130 kubenswrapper[2568]: I0416 
16:47:49.243308 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-etc-sysconfig\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p" Apr 16 16:47:49.243337 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243328 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-lib-modules\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p" Apr 16 16:47:49.243721 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243356 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-host\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p" Apr 16 16:47:49.243721 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243380 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2v9rr\" (UniqueName: \"kubernetes.io/projected/5d59ae73-e194-4f66-9f72-a091634a4c01-kube-api-access-2v9rr\") pod \"node-ca-8slrg\" (UID: \"5d59ae73-e194-4f66-9f72-a091634a4c01\") " pod="openshift-image-registry/node-ca-8slrg" Apr 16 16:47:49.243721 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243403 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-var-lib-openvswitch\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.243721 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243436 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2skjb\" (UniqueName: \"kubernetes.io/projected/bd61ad6d-349b-4f2f-ba4c-bbe9aaf1fb28-kube-api-access-2skjb\") pod \"iptables-alerter-hsvxm\" (UID: \"bd61ad6d-349b-4f2f-ba4c-bbe9aaf1fb28\") " pod="openshift-network-operator/iptables-alerter-hsvxm" Apr 16 16:47:49.243721 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243473 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-run\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p" Apr 16 16:47:49.243721 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243494 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-sys\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p" Apr 16 16:47:49.243721 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243515 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-etc-openvswitch\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.243721 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243537 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-host-var-lib-kubelet\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.243721 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243559 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-multus-conf-dir\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.243721 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243581 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-etc-modprobe-d\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p" Apr 16 16:47:49.243721 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243635 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d7c70c5-3d64-495a-a048-265fbd988013-env-overrides\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.243721 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243665 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-multus-socket-dir-parent\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.243721 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243690 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/da105279-c3bd-4e13-9bfc-0331c0b3ebd0-system-cni-dir\") pod \"multus-additional-cni-plugins-hjnws\" (UID: \"da105279-c3bd-4e13-9bfc-0331c0b3ebd0\") " pod="openshift-multus/multus-additional-cni-plugins-hjnws" Apr 16 16:47:49.244300 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243735 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-run-openvswitch\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.244300 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243768 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d7c70c5-3d64-495a-a048-265fbd988013-ovn-node-metrics-cert\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.244300 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243802 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttlm4\" (UniqueName: \"kubernetes.io/projected/2d7c70c5-3d64-495a-a048-265fbd988013-kube-api-access-ttlm4\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.244300 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243847 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/da105279-c3bd-4e13-9bfc-0331c0b3ebd0-os-release\") pod \"multus-additional-cni-plugins-hjnws\" (UID: 
\"da105279-c3bd-4e13-9bfc-0331c0b3ebd0\") " pod="openshift-multus/multus-additional-cni-plugins-hjnws" Apr 16 16:47:49.244300 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243887 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-host-cni-bin\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.244300 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243927 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-multus-cni-dir\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.244300 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.243984 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-var-lib-kubelet\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p" Apr 16 16:47:49.244300 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244009 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7fcf2d50-decf-4050-b3bc-a82043f228fe-etc-tuned\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p" Apr 16 16:47:49.244300 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244035 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/5d59ae73-e194-4f66-9f72-a091634a4c01-host\") pod \"node-ca-8slrg\" (UID: \"5d59ae73-e194-4f66-9f72-a091634a4c01\") " pod="openshift-image-registry/node-ca-8slrg" Apr 16 16:47:49.244300 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244058 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-host-run-netns\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.244300 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244081 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-run-systemd\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.244300 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244106 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bd61ad6d-349b-4f2f-ba4c-bbe9aaf1fb28-iptables-alerter-script\") pod \"iptables-alerter-hsvxm\" (UID: \"bd61ad6d-349b-4f2f-ba4c-bbe9aaf1fb28\") " pod="openshift-network-operator/iptables-alerter-hsvxm" Apr 16 16:47:49.244300 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244135 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs\") pod \"network-metrics-daemon-gd4q4\" (UID: \"545dd230-1d90-4e1a-8615-072dd9b2d2f5\") " pod="openshift-multus/network-metrics-daemon-gd4q4" Apr 16 16:47:49.244300 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244161 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-host-run-ovn-kubernetes\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.244300 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244185 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2d7c70c5-3d64-495a-a048-265fbd988013-ovnkube-script-lib\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.244300 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244207 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/42ba454f-7aed-4795-a3c0-cbc87b83def8-agent-certs\") pod \"konnectivity-agent-srdnj\" (UID: \"42ba454f-7aed-4795-a3c0-cbc87b83def8\") " pod="kube-system/konnectivity-agent-srdnj" Apr 16 16:47:49.244900 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244242 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-system-cni-dir\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.244900 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244277 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-os-release\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 
16:47:49.244900 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244313 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsnrc\" (UniqueName: \"kubernetes.io/projected/7fcf2d50-decf-4050-b3bc-a82043f228fe-kube-api-access-gsnrc\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p" Apr 16 16:47:49.244900 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244364 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/da105279-c3bd-4e13-9bfc-0331c0b3ebd0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hjnws\" (UID: \"da105279-c3bd-4e13-9bfc-0331c0b3ebd0\") " pod="openshift-multus/multus-additional-cni-plugins-hjnws" Apr 16 16:47:49.244900 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244395 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/da105279-c3bd-4e13-9bfc-0331c0b3ebd0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hjnws\" (UID: \"da105279-c3bd-4e13-9bfc-0331c0b3ebd0\") " pod="openshift-multus/multus-additional-cni-plugins-hjnws" Apr 16 16:47:49.244900 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244421 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-systemd-units\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.244900 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244446 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" 
(UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-host-run-k8s-cni-cncf-io\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.244900 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244483 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-host-run-multus-certs\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.244900 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244513 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-etc-kubernetes\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.244900 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244533 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cvcd\" (UniqueName: \"kubernetes.io/projected/6f46c58f-ae64-4611-b6f5-37bccf98d4af-kube-api-access-7cvcd\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.244900 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244557 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-etc-kubernetes\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p" Apr 16 16:47:49.244900 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244609 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/da105279-c3bd-4e13-9bfc-0331c0b3ebd0-cni-binary-copy\") pod \"multus-additional-cni-plugins-hjnws\" (UID: \"da105279-c3bd-4e13-9bfc-0331c0b3ebd0\") " pod="openshift-multus/multus-additional-cni-plugins-hjnws" Apr 16 16:47:49.244900 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244647 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-host-run-netns\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.244900 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244668 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6f46c58f-ae64-4611-b6f5-37bccf98d4af-multus-daemon-config\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.244900 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244690 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-etc-sysctl-conf\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p" Apr 16 16:47:49.244900 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244716 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/da105279-c3bd-4e13-9bfc-0331c0b3ebd0-cnibin\") pod \"multus-additional-cni-plugins-hjnws\" (UID: \"da105279-c3bd-4e13-9bfc-0331c0b3ebd0\") " 
pod="openshift-multus/multus-additional-cni-plugins-hjnws" Apr 16 16:47:49.245526 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244738 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/42ba454f-7aed-4795-a3c0-cbc87b83def8-konnectivity-ca\") pod \"konnectivity-agent-srdnj\" (UID: \"42ba454f-7aed-4795-a3c0-cbc87b83def8\") " pod="kube-system/konnectivity-agent-srdnj" Apr 16 16:47:49.245526 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244760 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-etc-sysctl-d\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p" Apr 16 16:47:49.245526 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244782 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/da105279-c3bd-4e13-9bfc-0331c0b3ebd0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hjnws\" (UID: \"da105279-c3bd-4e13-9bfc-0331c0b3ebd0\") " pod="openshift-multus/multus-additional-cni-plugins-hjnws" Apr 16 16:47:49.245526 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244807 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-log-socket\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.245526 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244832 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sftq5\" (UniqueName: 
\"kubernetes.io/projected/02af3612-a84f-46c2-81c2-fe094b0b75f8-kube-api-access-sftq5\") pod \"network-check-target-bwvhw\" (UID: \"02af3612-a84f-46c2-81c2-fe094b0b75f8\") " pod="openshift-network-diagnostics/network-check-target-bwvhw" Apr 16 16:47:49.245526 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244857 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-host-var-lib-cni-multus\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.245526 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244899 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-hostroot\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.245526 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.244923 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjvtv\" (UniqueName: \"kubernetes.io/projected/545dd230-1d90-4e1a-8615-072dd9b2d2f5-kube-api-access-qjvtv\") pod \"network-metrics-daemon-gd4q4\" (UID: \"545dd230-1d90-4e1a-8615-072dd9b2d2f5\") " pod="openshift-multus/network-metrics-daemon-gd4q4" Apr 16 16:47:49.270467 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.270437 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 16:42:48 +0000 UTC" deadline="2027-10-23 09:10:25.231012823 +0000 UTC" Apr 16 16:47:49.270558 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.270467 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" 
sleep="13312h22m35.960549394s" Apr 16 16:47:49.332038 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.332011 2568 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 16:47:49.346304 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.346274 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sftq5\" (UniqueName: \"kubernetes.io/projected/02af3612-a84f-46c2-81c2-fe094b0b75f8-kube-api-access-sftq5\") pod \"network-check-target-bwvhw\" (UID: \"02af3612-a84f-46c2-81c2-fe094b0b75f8\") " pod="openshift-network-diagnostics/network-check-target-bwvhw" Apr 16 16:47:49.346429 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.346309 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-host-var-lib-cni-multus\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.346429 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.346333 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-hostroot\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.346429 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.346361 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjvtv\" (UniqueName: \"kubernetes.io/projected/545dd230-1d90-4e1a-8615-072dd9b2d2f5-kube-api-access-qjvtv\") pod \"network-metrics-daemon-gd4q4\" (UID: \"545dd230-1d90-4e1a-8615-072dd9b2d2f5\") " pod="openshift-multus/network-metrics-daemon-gd4q4" Apr 16 16:47:49.346429 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.346381 2568 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-node-log\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.346429 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.346400 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-cnibin\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.346665 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.346449 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-host-kubelet\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.346665 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.346514 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-host-var-lib-cni-bin\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.346665 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.346559 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-etc-systemd\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p" Apr 16 16:47:49.346665 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.346621 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" 
(UniqueName: \"kubernetes.io/empty-dir/b60bf4ff-6c52-4d90-9a63-32a829bfc83e-tmp-dir\") pod \"node-resolver-v87xm\" (UID: \"b60bf4ff-6c52-4d90-9a63-32a829bfc83e\") " pod="openshift-dns/node-resolver-v87xm" Apr 16 16:47:49.346665 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.346662 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-run-ovn\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.346906 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.346460 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-host-kubelet\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.346906 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.346700 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d7c70c5-3d64-495a-a048-265fbd988013-ovnkube-config\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.346906 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.346742 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-node-log\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.346906 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.346747 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-host-var-lib-cni-bin\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.346906 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.346754 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g4wjj\" (UniqueName: \"kubernetes.io/projected/da105279-c3bd-4e13-9bfc-0331c0b3ebd0-kube-api-access-g4wjj\") pod \"multus-additional-cni-plugins-hjnws\" (UID: \"da105279-c3bd-4e13-9bfc-0331c0b3ebd0\") " pod="openshift-multus/multus-additional-cni-plugins-hjnws" Apr 16 16:47:49.346906 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.346848 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-etc-systemd\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p" Apr 16 16:47:49.347230 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.346929 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-run-ovn\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.347230 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.346953 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-hostroot\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.347230 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.347125 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-host-cni-netd\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.347230 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.347165 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bd61ad6d-349b-4f2f-ba4c-bbe9aaf1fb28-host-slash\") pod \"iptables-alerter-hsvxm\" (UID: \"bd61ad6d-349b-4f2f-ba4c-bbe9aaf1fb28\") " pod="openshift-network-operator/iptables-alerter-hsvxm" Apr 16 16:47:49.347230 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.347194 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6f46c58f-ae64-4611-b6f5-37bccf98d4af-cni-binary-copy\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.347230 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.346765 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-host-var-lib-cni-multus\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.347230 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.347228 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7fcf2d50-decf-4050-b3bc-a82043f228fe-tmp\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p" Apr 16 16:47:49.347553 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.347266 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: 
\"kubernetes.io/host-path/635f4e42-2ddd-46d3-8a86-47f36f350728-etc-selinux\") pod \"aws-ebs-csi-driver-node-29c96\" (UID: \"635f4e42-2ddd-46d3-8a86-47f36f350728\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96" Apr 16 16:47:49.347553 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.347278 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-cnibin\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.347553 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.347284 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bd61ad6d-349b-4f2f-ba4c-bbe9aaf1fb28-host-slash\") pod \"iptables-alerter-hsvxm\" (UID: \"bd61ad6d-349b-4f2f-ba4c-bbe9aaf1fb28\") " pod="openshift-network-operator/iptables-alerter-hsvxm" Apr 16 16:47:49.347553 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.347301 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/635f4e42-2ddd-46d3-8a86-47f36f350728-sys-fs\") pod \"aws-ebs-csi-driver-node-29c96\" (UID: \"635f4e42-2ddd-46d3-8a86-47f36f350728\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96" Apr 16 16:47:49.347553 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.347337 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-host-cni-netd\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.347553 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.347339 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" 
(UniqueName: \"kubernetes.io/configmap/5d59ae73-e194-4f66-9f72-a091634a4c01-serviceca\") pod \"node-ca-8slrg\" (UID: \"5d59ae73-e194-4f66-9f72-a091634a4c01\") " pod="openshift-image-registry/node-ca-8slrg" Apr 16 16:47:49.347553 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.347375 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-host-slash\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.347553 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.347409 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.347553 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.347442 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-etc-sysconfig\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p" Apr 16 16:47:49.347553 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.347467 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-lib-modules\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p" Apr 16 16:47:49.347553 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.347498 2568 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-host\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p"
Apr 16 16:47:49.347553 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.347533 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2v9rr\" (UniqueName: \"kubernetes.io/projected/5d59ae73-e194-4f66-9f72-a091634a4c01-kube-api-access-2v9rr\") pod \"node-ca-8slrg\" (UID: \"5d59ae73-e194-4f66-9f72-a091634a4c01\") " pod="openshift-image-registry/node-ca-8slrg"
Apr 16 16:47:49.348164 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.347567 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-var-lib-openvswitch\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.348164 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.347610 2568 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 16:47:49.348164 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.347678 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-host\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p"
Apr 16 16:47:49.348164 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.347721 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-lib-modules\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p"
Apr 16 16:47:49.348164 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.347763 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5d59ae73-e194-4f66-9f72-a091634a4c01-serviceca\") pod \"node-ca-8slrg\" (UID: \"5d59ae73-e194-4f66-9f72-a091634a4c01\") " pod="openshift-image-registry/node-ca-8slrg"
Apr 16 16:47:49.348164 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.347845 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-var-lib-openvswitch\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.348164 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.347894 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.348164 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.347915 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/2d7c70c5-3d64-495a-a048-265fbd988013-ovnkube-config\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.348164 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.347939 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-host-slash\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.348164 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.347618 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2skjb\" (UniqueName: \"kubernetes.io/projected/bd61ad6d-349b-4f2f-ba4c-bbe9aaf1fb28-kube-api-access-2skjb\") pod \"iptables-alerter-hsvxm\" (UID: \"bd61ad6d-349b-4f2f-ba4c-bbe9aaf1fb28\") " pod="openshift-network-operator/iptables-alerter-hsvxm"
Apr 16 16:47:49.348164 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348034 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-run\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p"
Apr 16 16:47:49.348164 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348059 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-sys\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p"
Apr 16 16:47:49.348164 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348089 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-etc-openvswitch\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.348164 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348094 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-etc-sysconfig\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p"
Apr 16 16:47:49.348164 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348115 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-host-var-lib-kubelet\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf"
Apr 16 16:47:49.348164 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348170 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-host-var-lib-kubelet\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf"
Apr 16 16:47:49.348837 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348174 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-sys\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p"
Apr 16 16:47:49.348837 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348226 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-etc-openvswitch\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.348837 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348235 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-run\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p"
Apr 16 16:47:49.348837 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348274 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-multus-conf-dir\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf"
Apr 16 16:47:49.348837 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348314 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-etc-modprobe-d\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p"
Apr 16 16:47:49.348837 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348350 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/635f4e42-2ddd-46d3-8a86-47f36f350728-device-dir\") pod \"aws-ebs-csi-driver-node-29c96\" (UID: \"635f4e42-2ddd-46d3-8a86-47f36f350728\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96"
Apr 16 16:47:49.348837 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348388 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-multus-conf-dir\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf"
Apr 16 16:47:49.348837 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348391 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6f46c58f-ae64-4611-b6f5-37bccf98d4af-cni-binary-copy\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf"
Apr 16 16:47:49.348837 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348415 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d7c70c5-3d64-495a-a048-265fbd988013-env-overrides\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.348837 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348450 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-multus-socket-dir-parent\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf"
Apr 16 16:47:49.348837 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348469 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-etc-modprobe-d\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p"
Apr 16 16:47:49.348837 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348487 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/da105279-c3bd-4e13-9bfc-0331c0b3ebd0-system-cni-dir\") pod \"multus-additional-cni-plugins-hjnws\" (UID: \"da105279-c3bd-4e13-9bfc-0331c0b3ebd0\") " pod="openshift-multus/multus-additional-cni-plugins-hjnws"
Apr 16 16:47:49.348837 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348523 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-run-openvswitch\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.348837 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348531 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-multus-socket-dir-parent\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf"
Apr 16 16:47:49.348837 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348577 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d7c70c5-3d64-495a-a048-265fbd988013-ovn-node-metrics-cert\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.348837 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348622 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttlm4\" (UniqueName: \"kubernetes.io/projected/2d7c70c5-3d64-495a-a048-265fbd988013-kube-api-access-ttlm4\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.348837 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348710 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/da105279-c3bd-4e13-9bfc-0331c0b3ebd0-system-cni-dir\") pod \"multus-additional-cni-plugins-hjnws\" (UID: \"da105279-c3bd-4e13-9bfc-0331c0b3ebd0\") " pod="openshift-multus/multus-additional-cni-plugins-hjnws"
Apr 16 16:47:49.349563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348712 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-run-openvswitch\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.349563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348746 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/2d7c70c5-3d64-495a-a048-265fbd988013-env-overrides\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.349563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348860 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/da105279-c3bd-4e13-9bfc-0331c0b3ebd0-os-release\") pod \"multus-additional-cni-plugins-hjnws\" (UID: \"da105279-c3bd-4e13-9bfc-0331c0b3ebd0\") " pod="openshift-multus/multus-additional-cni-plugins-hjnws"
Apr 16 16:47:49.349563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348897 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-host-cni-bin\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.349563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348928 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-multus-cni-dir\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf"
Apr 16 16:47:49.349563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348957 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-var-lib-kubelet\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p"
Apr 16 16:47:49.349563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348984 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7fcf2d50-decf-4050-b3bc-a82043f228fe-etc-tuned\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p"
Apr 16 16:47:49.349563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.348989 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/da105279-c3bd-4e13-9bfc-0331c0b3ebd0-os-release\") pod \"multus-additional-cni-plugins-hjnws\" (UID: \"da105279-c3bd-4e13-9bfc-0331c0b3ebd0\") " pod="openshift-multus/multus-additional-cni-plugins-hjnws"
Apr 16 16:47:49.349563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.349021 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phnpq\" (UniqueName: \"kubernetes.io/projected/635f4e42-2ddd-46d3-8a86-47f36f350728-kube-api-access-phnpq\") pod \"aws-ebs-csi-driver-node-29c96\" (UID: \"635f4e42-2ddd-46d3-8a86-47f36f350728\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96"
Apr 16 16:47:49.349563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.349053 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cds7m\" (UniqueName: \"kubernetes.io/projected/b60bf4ff-6c52-4d90-9a63-32a829bfc83e-kube-api-access-cds7m\") pod \"node-resolver-v87xm\" (UID: \"b60bf4ff-6c52-4d90-9a63-32a829bfc83e\") " pod="openshift-dns/node-resolver-v87xm"
Apr 16 16:47:49.349563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.349083 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5d59ae73-e194-4f66-9f72-a091634a4c01-host\") pod \"node-ca-8slrg\" (UID: \"5d59ae73-e194-4f66-9f72-a091634a4c01\") " pod="openshift-image-registry/node-ca-8slrg"
Apr 16 16:47:49.349563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.349116 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-host-run-netns\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.349563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.349141 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-multus-cni-dir\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf"
Apr 16 16:47:49.349563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.349144 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-run-systemd\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.349563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.349203 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bd61ad6d-349b-4f2f-ba4c-bbe9aaf1fb28-iptables-alerter-script\") pod \"iptables-alerter-hsvxm\" (UID: \"bd61ad6d-349b-4f2f-ba4c-bbe9aaf1fb28\") " pod="openshift-network-operator/iptables-alerter-hsvxm"
Apr 16 16:47:49.349563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.349219 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-var-lib-kubelet\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p"
Apr 16 16:47:49.349563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.349206 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-run-systemd\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.350455 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.349262 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-host-cni-bin\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.350455 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.349272 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs\") pod \"network-metrics-daemon-gd4q4\" (UID: \"545dd230-1d90-4e1a-8615-072dd9b2d2f5\") " pod="openshift-multus/network-metrics-daemon-gd4q4"
Apr 16 16:47:49.350455 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.349299 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-host-run-netns\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.350455 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.349361 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5d59ae73-e194-4f66-9f72-a091634a4c01-host\") pod \"node-ca-8slrg\" (UID: \"5d59ae73-e194-4f66-9f72-a091634a4c01\") " pod="openshift-image-registry/node-ca-8slrg"
Apr 16 16:47:49.350455 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:49.349420 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:47:49.350455 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:49.349524 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs podName:545dd230-1d90-4e1a-8615-072dd9b2d2f5 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:49.849474494 +0000 UTC m=+3.058246782 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs") pod "network-metrics-daemon-gd4q4" (UID: "545dd230-1d90-4e1a-8615-072dd9b2d2f5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:47:49.350455 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.349768 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/635f4e42-2ddd-46d3-8a86-47f36f350728-socket-dir\") pod \"aws-ebs-csi-driver-node-29c96\" (UID: \"635f4e42-2ddd-46d3-8a86-47f36f350728\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96"
Apr 16 16:47:49.350455 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.349810 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b60bf4ff-6c52-4d90-9a63-32a829bfc83e-hosts-file\") pod \"node-resolver-v87xm\" (UID: \"b60bf4ff-6c52-4d90-9a63-32a829bfc83e\") " pod="openshift-dns/node-resolver-v87xm"
Apr 16 16:47:49.350455 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.349849 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-host-run-ovn-kubernetes\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.350455 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.349882 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2d7c70c5-3d64-495a-a048-265fbd988013-ovnkube-script-lib\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.350455 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.349910 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/42ba454f-7aed-4795-a3c0-cbc87b83def8-agent-certs\") pod \"konnectivity-agent-srdnj\" (UID: \"42ba454f-7aed-4795-a3c0-cbc87b83def8\") " pod="kube-system/konnectivity-agent-srdnj"
Apr 16 16:47:49.350455 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.349920 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bd61ad6d-349b-4f2f-ba4c-bbe9aaf1fb28-iptables-alerter-script\") pod \"iptables-alerter-hsvxm\" (UID: \"bd61ad6d-349b-4f2f-ba4c-bbe9aaf1fb28\") " pod="openshift-network-operator/iptables-alerter-hsvxm"
Apr 16 16:47:49.350455 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.349943 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-system-cni-dir\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf"
Apr 16 16:47:49.350455 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.349986 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-os-release\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf"
Apr 16 16:47:49.350455 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.350021 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gsnrc\" (UniqueName: \"kubernetes.io/projected/7fcf2d50-decf-4050-b3bc-a82043f228fe-kube-api-access-gsnrc\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p"
Apr 16 16:47:49.350455 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.350057 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/da105279-c3bd-4e13-9bfc-0331c0b3ebd0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hjnws\" (UID: \"da105279-c3bd-4e13-9bfc-0331c0b3ebd0\") " pod="openshift-multus/multus-additional-cni-plugins-hjnws"
Apr 16 16:47:49.350455 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.350087 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/da105279-c3bd-4e13-9bfc-0331c0b3ebd0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hjnws\" (UID: \"da105279-c3bd-4e13-9bfc-0331c0b3ebd0\") " pod="openshift-multus/multus-additional-cni-plugins-hjnws"
Apr 16 16:47:49.351254 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.350124 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-systemd-units\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.351254 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.350159 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-host-run-k8s-cni-cncf-io\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf"
Apr 16 16:47:49.351254 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.350192 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-host-run-multus-certs\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf"
Apr 16 16:47:49.351254 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.350226 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-etc-kubernetes\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf"
Apr 16 16:47:49.351254 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.350255 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7cvcd\" (UniqueName: \"kubernetes.io/projected/6f46c58f-ae64-4611-b6f5-37bccf98d4af-kube-api-access-7cvcd\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf"
Apr 16 16:47:49.351254 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.350289 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-etc-kubernetes\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p"
Apr 16 16:47:49.351254 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.350322 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/da105279-c3bd-4e13-9bfc-0331c0b3ebd0-cni-binary-copy\") pod \"multus-additional-cni-plugins-hjnws\" (UID: \"da105279-c3bd-4e13-9bfc-0331c0b3ebd0\") " pod="openshift-multus/multus-additional-cni-plugins-hjnws"
Apr 16 16:47:49.351254 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.350354 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-host-run-netns\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf"
Apr 16 16:47:49.351254 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.350389 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6f46c58f-ae64-4611-b6f5-37bccf98d4af-multus-daemon-config\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf"
Apr 16 16:47:49.351254 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.350415 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-etc-sysctl-conf\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p"
Apr 16 16:47:49.351254 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.350446 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/635f4e42-2ddd-46d3-8a86-47f36f350728-kubelet-dir\") pod \"aws-ebs-csi-driver-node-29c96\" (UID: \"635f4e42-2ddd-46d3-8a86-47f36f350728\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96"
Apr 16 16:47:49.351254 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.350478 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/635f4e42-2ddd-46d3-8a86-47f36f350728-registration-dir\") pod \"aws-ebs-csi-driver-node-29c96\" (UID: \"635f4e42-2ddd-46d3-8a86-47f36f350728\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96"
Apr 16 16:47:49.351254 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.350513 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/da105279-c3bd-4e13-9bfc-0331c0b3ebd0-cnibin\") pod \"multus-additional-cni-plugins-hjnws\" (UID: \"da105279-c3bd-4e13-9bfc-0331c0b3ebd0\") " pod="openshift-multus/multus-additional-cni-plugins-hjnws"
Apr 16 16:47:49.351254 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.350545 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/42ba454f-7aed-4795-a3c0-cbc87b83def8-konnectivity-ca\") pod \"konnectivity-agent-srdnj\" (UID: \"42ba454f-7aed-4795-a3c0-cbc87b83def8\") " pod="kube-system/konnectivity-agent-srdnj"
Apr 16 16:47:49.351254 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.350582 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-etc-sysctl-d\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p"
Apr 16 16:47:49.351254 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.350633 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/da105279-c3bd-4e13-9bfc-0331c0b3ebd0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hjnws\" (UID: \"da105279-c3bd-4e13-9bfc-0331c0b3ebd0\") " pod="openshift-multus/multus-additional-cni-plugins-hjnws"
Apr 16 16:47:49.351254 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.350665 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-log-socket\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.352097 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.350766 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-log-socket\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.352097 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.350831 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-host-run-ovn-kubernetes\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.352097 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.351365 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/2d7c70c5-3d64-495a-a048-265fbd988013-ovnkube-script-lib\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.352097 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.351447 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-etc-kubernetes\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p"
Apr 16 16:47:49.352097 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.351507 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/da105279-c3bd-4e13-9bfc-0331c0b3ebd0-cnibin\") pod \"multus-additional-cni-plugins-hjnws\" (UID: \"da105279-c3bd-4e13-9bfc-0331c0b3ebd0\") " pod="openshift-multus/multus-additional-cni-plugins-hjnws"
Apr 16 16:47:49.352097 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.351532 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-host-run-multus-certs\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf"
Apr 16 16:47:49.352097 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.351556 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-host-run-k8s-cni-cncf-io\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf"
Apr 16 16:47:49.352097 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.351833 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/2d7c70c5-3d64-495a-a048-265fbd988013-systemd-units\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.352097 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.351878 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-etc-kubernetes\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf"
Apr 16 16:47:49.352097 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.351971 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/da105279-c3bd-4e13-9bfc-0331c0b3ebd0-cni-binary-copy\") pod \"multus-additional-cni-plugins-hjnws\" (UID: \"da105279-c3bd-4e13-9bfc-0331c0b3ebd0\") " pod="openshift-multus/multus-additional-cni-plugins-hjnws"
Apr 16 16:47:49.352097 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.352059 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/da105279-c3bd-4e13-9bfc-0331c0b3ebd0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-hjnws\" (UID: \"da105279-c3bd-4e13-9bfc-0331c0b3ebd0\") " pod="openshift-multus/multus-additional-cni-plugins-hjnws"
Apr 16 16:47:49.352097 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.352092 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-host-run-netns\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf"
Apr 16 16:47:49.352708 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.352120 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/da105279-c3bd-4e13-9bfc-0331c0b3ebd0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-hjnws\" (UID: \"da105279-c3bd-4e13-9bfc-0331c0b3ebd0\") " pod="openshift-multus/multus-additional-cni-plugins-hjnws"
Apr 16 16:47:49.352708 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.352192 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/2d7c70c5-3d64-495a-a048-265fbd988013-ovn-node-metrics-cert\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:47:49.352708 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.352480 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/42ba454f-7aed-4795-a3c0-cbc87b83def8-konnectivity-ca\") pod \"konnectivity-agent-srdnj\" (UID: \"42ba454f-7aed-4795-a3c0-cbc87b83def8\") " pod="kube-system/konnectivity-agent-srdnj"
Apr 16 16:47:49.352708 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.352665 2568 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/da105279-c3bd-4e13-9bfc-0331c0b3ebd0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-hjnws\" (UID: \"da105279-c3bd-4e13-9bfc-0331c0b3ebd0\") " pod="openshift-multus/multus-additional-cni-plugins-hjnws" Apr 16 16:47:49.352908 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.352798 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-etc-sysctl-d\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p" Apr 16 16:47:49.352908 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.352874 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-system-cni-dir\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.353005 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.352967 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6f46c58f-ae64-4611-b6f5-37bccf98d4af-multus-daemon-config\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.353105 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.353072 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6f46c58f-ae64-4611-b6f5-37bccf98d4af-os-release\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.353373 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.353343 2568 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7fcf2d50-decf-4050-b3bc-a82043f228fe-etc-sysctl-conf\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p" Apr 16 16:47:49.354587 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.354529 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7fcf2d50-decf-4050-b3bc-a82043f228fe-tmp\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p" Apr 16 16:47:49.355713 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.355195 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7fcf2d50-decf-4050-b3bc-a82043f228fe-etc-tuned\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " pod="openshift-cluster-node-tuning-operator/tuned-fl64p" Apr 16 16:47:49.356851 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:49.356834 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:47:49.356981 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:49.356968 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:47:49.357102 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:49.357083 2568 projected.go:194] Error preparing data for projected volume kube-api-access-sftq5 for pod openshift-network-diagnostics/network-check-target-bwvhw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:49.358081 ip-10-0-128-130 kubenswrapper[2568]: E0416 
16:47:49.358058 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02af3612-a84f-46c2-81c2-fe094b0b75f8-kube-api-access-sftq5 podName:02af3612-a84f-46c2-81c2-fe094b0b75f8 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:49.858036605 +0000 UTC m=+3.066808897 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-sftq5" (UniqueName: "kubernetes.io/projected/02af3612-a84f-46c2-81c2-fe094b0b75f8-kube-api-access-sftq5") pod "network-check-target-bwvhw" (UID: "02af3612-a84f-46c2-81c2-fe094b0b75f8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:47:49.358231 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.357061 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/42ba454f-7aed-4795-a3c0-cbc87b83def8-agent-certs\") pod \"konnectivity-agent-srdnj\" (UID: \"42ba454f-7aed-4795-a3c0-cbc87b83def8\") " pod="kube-system/konnectivity-agent-srdnj" Apr 16 16:47:49.359085 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.359062 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjvtv\" (UniqueName: \"kubernetes.io/projected/545dd230-1d90-4e1a-8615-072dd9b2d2f5-kube-api-access-qjvtv\") pod \"network-metrics-daemon-gd4q4\" (UID: \"545dd230-1d90-4e1a-8615-072dd9b2d2f5\") " pod="openshift-multus/network-metrics-daemon-gd4q4" Apr 16 16:47:49.359389 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.359350 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2v9rr\" (UniqueName: \"kubernetes.io/projected/5d59ae73-e194-4f66-9f72-a091634a4c01-kube-api-access-2v9rr\") pod \"node-ca-8slrg\" (UID: \"5d59ae73-e194-4f66-9f72-a091634a4c01\") " pod="openshift-image-registry/node-ca-8slrg" Apr 16 16:47:49.360218 ip-10-0-128-130 
kubenswrapper[2568]: I0416 16:47:49.360196 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttlm4\" (UniqueName: \"kubernetes.io/projected/2d7c70c5-3d64-495a-a048-265fbd988013-kube-api-access-ttlm4\") pod \"ovnkube-node-5pfjs\" (UID: \"2d7c70c5-3d64-495a-a048-265fbd988013\") " pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.360518 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.360495 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g4wjj\" (UniqueName: \"kubernetes.io/projected/da105279-c3bd-4e13-9bfc-0331c0b3ebd0-kube-api-access-g4wjj\") pod \"multus-additional-cni-plugins-hjnws\" (UID: \"da105279-c3bd-4e13-9bfc-0331c0b3ebd0\") " pod="openshift-multus/multus-additional-cni-plugins-hjnws" Apr 16 16:47:49.360629 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.360531 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2skjb\" (UniqueName: \"kubernetes.io/projected/bd61ad6d-349b-4f2f-ba4c-bbe9aaf1fb28-kube-api-access-2skjb\") pod \"iptables-alerter-hsvxm\" (UID: \"bd61ad6d-349b-4f2f-ba4c-bbe9aaf1fb28\") " pod="openshift-network-operator/iptables-alerter-hsvxm" Apr 16 16:47:49.361725 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.361705 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cvcd\" (UniqueName: \"kubernetes.io/projected/6f46c58f-ae64-4611-b6f5-37bccf98d4af-kube-api-access-7cvcd\") pod \"multus-kznnf\" (UID: \"6f46c58f-ae64-4611-b6f5-37bccf98d4af\") " pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.363543 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.363524 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsnrc\" (UniqueName: \"kubernetes.io/projected/7fcf2d50-decf-4050-b3bc-a82043f228fe-kube-api-access-gsnrc\") pod \"tuned-fl64p\" (UID: \"7fcf2d50-decf-4050-b3bc-a82043f228fe\") " 
pod="openshift-cluster-node-tuning-operator/tuned-fl64p" Apr 16 16:47:49.451446 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.451420 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/635f4e42-2ddd-46d3-8a86-47f36f350728-device-dir\") pod \"aws-ebs-csi-driver-node-29c96\" (UID: \"635f4e42-2ddd-46d3-8a86-47f36f350728\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96" Apr 16 16:47:49.451633 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.451462 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-phnpq\" (UniqueName: \"kubernetes.io/projected/635f4e42-2ddd-46d3-8a86-47f36f350728-kube-api-access-phnpq\") pod \"aws-ebs-csi-driver-node-29c96\" (UID: \"635f4e42-2ddd-46d3-8a86-47f36f350728\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96" Apr 16 16:47:49.451633 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.451486 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cds7m\" (UniqueName: \"kubernetes.io/projected/b60bf4ff-6c52-4d90-9a63-32a829bfc83e-kube-api-access-cds7m\") pod \"node-resolver-v87xm\" (UID: \"b60bf4ff-6c52-4d90-9a63-32a829bfc83e\") " pod="openshift-dns/node-resolver-v87xm" Apr 16 16:47:49.451633 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.451539 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/635f4e42-2ddd-46d3-8a86-47f36f350728-device-dir\") pod \"aws-ebs-csi-driver-node-29c96\" (UID: \"635f4e42-2ddd-46d3-8a86-47f36f350728\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96" Apr 16 16:47:49.451633 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.451611 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/635f4e42-2ddd-46d3-8a86-47f36f350728-socket-dir\") pod \"aws-ebs-csi-driver-node-29c96\" (UID: \"635f4e42-2ddd-46d3-8a86-47f36f350728\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96" Apr 16 16:47:49.451865 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.451639 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b60bf4ff-6c52-4d90-9a63-32a829bfc83e-hosts-file\") pod \"node-resolver-v87xm\" (UID: \"b60bf4ff-6c52-4d90-9a63-32a829bfc83e\") " pod="openshift-dns/node-resolver-v87xm" Apr 16 16:47:49.451865 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.451672 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/635f4e42-2ddd-46d3-8a86-47f36f350728-kubelet-dir\") pod \"aws-ebs-csi-driver-node-29c96\" (UID: \"635f4e42-2ddd-46d3-8a86-47f36f350728\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96" Apr 16 16:47:49.451865 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.451693 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/635f4e42-2ddd-46d3-8a86-47f36f350728-registration-dir\") pod \"aws-ebs-csi-driver-node-29c96\" (UID: \"635f4e42-2ddd-46d3-8a86-47f36f350728\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96" Apr 16 16:47:49.451865 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.451735 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b60bf4ff-6c52-4d90-9a63-32a829bfc83e-hosts-file\") pod \"node-resolver-v87xm\" (UID: \"b60bf4ff-6c52-4d90-9a63-32a829bfc83e\") " pod="openshift-dns/node-resolver-v87xm" Apr 16 16:47:49.451865 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.451757 2568 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/635f4e42-2ddd-46d3-8a86-47f36f350728-socket-dir\") pod \"aws-ebs-csi-driver-node-29c96\" (UID: \"635f4e42-2ddd-46d3-8a86-47f36f350728\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96" Apr 16 16:47:49.451865 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.451766 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b60bf4ff-6c52-4d90-9a63-32a829bfc83e-tmp-dir\") pod \"node-resolver-v87xm\" (UID: \"b60bf4ff-6c52-4d90-9a63-32a829bfc83e\") " pod="openshift-dns/node-resolver-v87xm" Apr 16 16:47:49.451865 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.451763 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/635f4e42-2ddd-46d3-8a86-47f36f350728-registration-dir\") pod \"aws-ebs-csi-driver-node-29c96\" (UID: \"635f4e42-2ddd-46d3-8a86-47f36f350728\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96" Apr 16 16:47:49.451865 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.451768 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/635f4e42-2ddd-46d3-8a86-47f36f350728-kubelet-dir\") pod \"aws-ebs-csi-driver-node-29c96\" (UID: \"635f4e42-2ddd-46d3-8a86-47f36f350728\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96" Apr 16 16:47:49.451865 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.451819 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/635f4e42-2ddd-46d3-8a86-47f36f350728-etc-selinux\") pod \"aws-ebs-csi-driver-node-29c96\" (UID: \"635f4e42-2ddd-46d3-8a86-47f36f350728\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96" Apr 16 16:47:49.451865 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.451843 
2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/635f4e42-2ddd-46d3-8a86-47f36f350728-sys-fs\") pod \"aws-ebs-csi-driver-node-29c96\" (UID: \"635f4e42-2ddd-46d3-8a86-47f36f350728\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96" Apr 16 16:47:49.452223 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.451916 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/635f4e42-2ddd-46d3-8a86-47f36f350728-sys-fs\") pod \"aws-ebs-csi-driver-node-29c96\" (UID: \"635f4e42-2ddd-46d3-8a86-47f36f350728\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96" Apr 16 16:47:49.452223 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.451937 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/635f4e42-2ddd-46d3-8a86-47f36f350728-etc-selinux\") pod \"aws-ebs-csi-driver-node-29c96\" (UID: \"635f4e42-2ddd-46d3-8a86-47f36f350728\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96" Apr 16 16:47:49.452223 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.452053 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b60bf4ff-6c52-4d90-9a63-32a829bfc83e-tmp-dir\") pod \"node-resolver-v87xm\" (UID: \"b60bf4ff-6c52-4d90-9a63-32a829bfc83e\") " pod="openshift-dns/node-resolver-v87xm" Apr 16 16:47:49.459656 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.459635 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cds7m\" (UniqueName: \"kubernetes.io/projected/b60bf4ff-6c52-4d90-9a63-32a829bfc83e-kube-api-access-cds7m\") pod \"node-resolver-v87xm\" (UID: \"b60bf4ff-6c52-4d90-9a63-32a829bfc83e\") " pod="openshift-dns/node-resolver-v87xm" Apr 16 16:47:49.459774 ip-10-0-128-130 kubenswrapper[2568]: I0416 
16:47:49.459754 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-phnpq\" (UniqueName: \"kubernetes.io/projected/635f4e42-2ddd-46d3-8a86-47f36f350728-kube-api-access-phnpq\") pod \"aws-ebs-csi-driver-node-29c96\" (UID: \"635f4e42-2ddd-46d3-8a86-47f36f350728\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96" Apr 16 16:47:49.536753 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.536720 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-srdnj" Apr 16 16:47:49.543391 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.543369 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-hsvxm" Apr 16 16:47:49.551940 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.551916 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-fl64p" Apr 16 16:47:49.556453 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.556436 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-8slrg" Apr 16 16:47:49.562993 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.562970 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-hjnws" Apr 16 16:47:49.568529 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.568513 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:47:49.574087 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.574069 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-kznnf" Apr 16 16:47:49.579727 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.579710 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96" Apr 16 16:47:49.584240 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.584224 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-v87xm" Apr 16 16:47:49.838028 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:49.837998 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42ba454f_7aed_4795_a3c0_cbc87b83def8.slice/crio-497e3160980de6e28c8462ebc13361536a4cd2ee5cd674868f41f2cd53016bea WatchSource:0}: Error finding container 497e3160980de6e28c8462ebc13361536a4cd2ee5cd674868f41f2cd53016bea: Status 404 returned error can't find the container with id 497e3160980de6e28c8462ebc13361536a4cd2ee5cd674868f41f2cd53016bea Apr 16 16:47:49.840550 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:49.840446 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d59ae73_e194_4f66_9f72_a091634a4c01.slice/crio-07babb17d15eb20f5ebb8b141b9a898449d5b01c7dff751d627f7c7686308130 WatchSource:0}: Error finding container 07babb17d15eb20f5ebb8b141b9a898449d5b01c7dff751d627f7c7686308130: Status 404 returned error can't find the container with id 07babb17d15eb20f5ebb8b141b9a898449d5b01c7dff751d627f7c7686308130 Apr 16 16:47:49.844338 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:49.844310 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fcf2d50_decf_4050_b3bc_a82043f228fe.slice/crio-aaccb1077a61389c2913fac70b121b4a267b61e33c9ac91aef47aa5b491fb306 WatchSource:0}: Error finding container aaccb1077a61389c2913fac70b121b4a267b61e33c9ac91aef47aa5b491fb306: Status 404 returned error can't find the container with id aaccb1077a61389c2913fac70b121b4a267b61e33c9ac91aef47aa5b491fb306 Apr 16 16:47:49.845105 
ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:49.845079 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda105279_c3bd_4e13_9bfc_0331c0b3ebd0.slice/crio-a61ba2d7ec0fa5294f5b0d5c94c635200158f532807fd3cdb64f05f52231fedf WatchSource:0}: Error finding container a61ba2d7ec0fa5294f5b0d5c94c635200158f532807fd3cdb64f05f52231fedf: Status 404 returned error can't find the container with id a61ba2d7ec0fa5294f5b0d5c94c635200158f532807fd3cdb64f05f52231fedf Apr 16 16:47:49.845741 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:49.845717 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd61ad6d_349b_4f2f_ba4c_bbe9aaf1fb28.slice/crio-0ba37df5ce9f7666620ed44997f3ed47495ec3f2633b548530f3e7b7796c41eb WatchSource:0}: Error finding container 0ba37df5ce9f7666620ed44997f3ed47495ec3f2633b548530f3e7b7796c41eb: Status 404 returned error can't find the container with id 0ba37df5ce9f7666620ed44997f3ed47495ec3f2633b548530f3e7b7796c41eb Apr 16 16:47:49.847036 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:49.846716 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d7c70c5_3d64_495a_a048_265fbd988013.slice/crio-fe5636864b74d94a13435cbb9a3bfcaeaced1e9441ee20f5e9de7beec6c7d656 WatchSource:0}: Error finding container fe5636864b74d94a13435cbb9a3bfcaeaced1e9441ee20f5e9de7beec6c7d656: Status 404 returned error can't find the container with id fe5636864b74d94a13435cbb9a3bfcaeaced1e9441ee20f5e9de7beec6c7d656 Apr 16 16:47:49.847705 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:49.847685 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod635f4e42_2ddd_46d3_8a86_47f36f350728.slice/crio-9b1e1e92fdcb5ce71c3630b80c019d0a422d9a874937662fbc298827e7e7cf05 WatchSource:0}: 
Error finding container 9b1e1e92fdcb5ce71c3630b80c019d0a422d9a874937662fbc298827e7e7cf05: Status 404 returned error can't find the container with id 9b1e1e92fdcb5ce71c3630b80c019d0a422d9a874937662fbc298827e7e7cf05 Apr 16 16:47:49.848784 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:49.848747 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb60bf4ff_6c52_4d90_9a63_32a829bfc83e.slice/crio-75723c6a5fbc8095ec4ed4bb8dbad6241c840c68b3d7ef9272e6efad37d68dad WatchSource:0}: Error finding container 75723c6a5fbc8095ec4ed4bb8dbad6241c840c68b3d7ef9272e6efad37d68dad: Status 404 returned error can't find the container with id 75723c6a5fbc8095ec4ed4bb8dbad6241c840c68b3d7ef9272e6efad37d68dad Apr 16 16:47:49.849820 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:47:49.849766 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f46c58f_ae64_4611_b6f5_37bccf98d4af.slice/crio-f1267a59332f6bac6ca28f9b20f5024390120a39ee4701667c55a2de4f449630 WatchSource:0}: Error finding container f1267a59332f6bac6ca28f9b20f5024390120a39ee4701667c55a2de4f449630: Status 404 returned error can't find the container with id f1267a59332f6bac6ca28f9b20f5024390120a39ee4701667c55a2de4f449630 Apr 16 16:47:49.854967 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.854791 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs\") pod \"network-metrics-daemon-gd4q4\" (UID: \"545dd230-1d90-4e1a-8615-072dd9b2d2f5\") " pod="openshift-multus/network-metrics-daemon-gd4q4" Apr 16 16:47:49.854967 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:49.854943 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:49.855150 
ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:49.854997 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs podName:545dd230-1d90-4e1a-8615-072dd9b2d2f5 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:50.854980375 +0000 UTC m=+4.063752678 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs") pod "network-metrics-daemon-gd4q4" (UID: "545dd230-1d90-4e1a-8615-072dd9b2d2f5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:47:49.955909 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:49.955768 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sftq5\" (UniqueName: \"kubernetes.io/projected/02af3612-a84f-46c2-81c2-fe094b0b75f8-kube-api-access-sftq5\") pod \"network-check-target-bwvhw\" (UID: \"02af3612-a84f-46c2-81c2-fe094b0b75f8\") " pod="openshift-network-diagnostics/network-check-target-bwvhw" Apr 16 16:47:49.956007 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:49.955914 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:47:49.956007 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:49.955931 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:47:49.956007 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:49.955940 2568 projected.go:194] Error preparing data for projected volume kube-api-access-sftq5 for pod openshift-network-diagnostics/network-check-target-bwvhw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered]
Apr 16 16:47:49.956007 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:49.955986 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02af3612-a84f-46c2-81c2-fe094b0b75f8-kube-api-access-sftq5 podName:02af3612-a84f-46c2-81c2-fe094b0b75f8 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:50.955973332 +0000 UTC m=+4.164745620 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-sftq5" (UniqueName: "kubernetes.io/projected/02af3612-a84f-46c2-81c2-fe094b0b75f8-kube-api-access-sftq5") pod "network-check-target-bwvhw" (UID: "02af3612-a84f-46c2-81c2-fe094b0b75f8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:47:50.271695 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:50.271574 2568 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 16:42:48 +0000 UTC" deadline="2028-01-27 19:47:23.661639054 +0000 UTC"
Apr 16 16:47:50.271695 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:50.271625 2568 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15626h59m33.390020281s"
Apr 16 16:47:50.370047 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:50.369341 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-130.ec2.internal" event={"ID":"e3a1a534b3b465bad97fd00e274467c0","Type":"ContainerStarted","Data":"95f7745b6ed28206c16c57e8b954d6a1219185846bad08a343be385cb8cba4cc"}
Apr 16 16:47:50.393037 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:50.390936 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kznnf" event={"ID":"6f46c58f-ae64-4611-b6f5-37bccf98d4af","Type":"ContainerStarted","Data":"f1267a59332f6bac6ca28f9b20f5024390120a39ee4701667c55a2de4f449630"}
Apr 16 16:47:50.398016 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:50.397984 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96" event={"ID":"635f4e42-2ddd-46d3-8a86-47f36f350728","Type":"ContainerStarted","Data":"9b1e1e92fdcb5ce71c3630b80c019d0a422d9a874937662fbc298827e7e7cf05"}
Apr 16 16:47:50.402334 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:50.402308 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8slrg" event={"ID":"5d59ae73-e194-4f66-9f72-a091634a4c01","Type":"ContainerStarted","Data":"07babb17d15eb20f5ebb8b141b9a898449d5b01c7dff751d627f7c7686308130"}
Apr 16 16:47:50.407979 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:50.407957 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-v87xm" event={"ID":"b60bf4ff-6c52-4d90-9a63-32a829bfc83e","Type":"ContainerStarted","Data":"75723c6a5fbc8095ec4ed4bb8dbad6241c840c68b3d7ef9272e6efad37d68dad"}
Apr 16 16:47:50.422092 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:50.422065 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" event={"ID":"2d7c70c5-3d64-495a-a048-265fbd988013","Type":"ContainerStarted","Data":"fe5636864b74d94a13435cbb9a3bfcaeaced1e9441ee20f5e9de7beec6c7d656"}
Apr 16 16:47:50.426196 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:50.426170 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hsvxm" event={"ID":"bd61ad6d-349b-4f2f-ba4c-bbe9aaf1fb28","Type":"ContainerStarted","Data":"0ba37df5ce9f7666620ed44997f3ed47495ec3f2633b548530f3e7b7796c41eb"}
Apr 16 16:47:50.431909 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:50.431885 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjnws" event={"ID":"da105279-c3bd-4e13-9bfc-0331c0b3ebd0","Type":"ContainerStarted","Data":"a61ba2d7ec0fa5294f5b0d5c94c635200158f532807fd3cdb64f05f52231fedf"}
Apr 16 16:47:50.446156 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:50.446109 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-fl64p" event={"ID":"7fcf2d50-decf-4050-b3bc-a82043f228fe","Type":"ContainerStarted","Data":"aaccb1077a61389c2913fac70b121b4a267b61e33c9ac91aef47aa5b491fb306"}
Apr 16 16:47:50.454898 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:50.454873 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-srdnj" event={"ID":"42ba454f-7aed-4795-a3c0-cbc87b83def8","Type":"ContainerStarted","Data":"497e3160980de6e28c8462ebc13361536a4cd2ee5cd674868f41f2cd53016bea"}
Apr 16 16:47:50.865417 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:50.865336 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs\") pod \"network-metrics-daemon-gd4q4\" (UID: \"545dd230-1d90-4e1a-8615-072dd9b2d2f5\") " pod="openshift-multus/network-metrics-daemon-gd4q4"
Apr 16 16:47:50.865565 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:50.865479 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:47:50.865565 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:50.865541 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs podName:545dd230-1d90-4e1a-8615-072dd9b2d2f5 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:52.865521307 +0000 UTC m=+6.074293600 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs") pod "network-metrics-daemon-gd4q4" (UID: "545dd230-1d90-4e1a-8615-072dd9b2d2f5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:47:50.966416 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:50.965817 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sftq5\" (UniqueName: \"kubernetes.io/projected/02af3612-a84f-46c2-81c2-fe094b0b75f8-kube-api-access-sftq5\") pod \"network-check-target-bwvhw\" (UID: \"02af3612-a84f-46c2-81c2-fe094b0b75f8\") " pod="openshift-network-diagnostics/network-check-target-bwvhw"
Apr 16 16:47:50.966416 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:50.965992 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:47:50.966416 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:50.966012 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:47:50.966416 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:50.966024 2568 projected.go:194] Error preparing data for projected volume kube-api-access-sftq5 for pod openshift-network-diagnostics/network-check-target-bwvhw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:47:50.966416 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:50.966081 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02af3612-a84f-46c2-81c2-fe094b0b75f8-kube-api-access-sftq5 podName:02af3612-a84f-46c2-81c2-fe094b0b75f8 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:52.966062895 +0000 UTC m=+6.174835187 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-sftq5" (UniqueName: "kubernetes.io/projected/02af3612-a84f-46c2-81c2-fe094b0b75f8-kube-api-access-sftq5") pod "network-check-target-bwvhw" (UID: "02af3612-a84f-46c2-81c2-fe094b0b75f8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:47:51.353840 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:51.353767 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gd4q4"
Apr 16 16:47:51.354263 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:51.353901 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gd4q4" podUID="545dd230-1d90-4e1a-8615-072dd9b2d2f5"
Apr 16 16:47:51.354362 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:51.354345 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bwvhw"
Apr 16 16:47:51.354457 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:51.354437 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bwvhw" podUID="02af3612-a84f-46c2-81c2-fe094b0b75f8"
Apr 16 16:47:51.478735 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:51.477503 2568 generic.go:358] "Generic (PLEG): container finished" podID="6eb09bba6fec8382bf35d52e8c351917" containerID="02a0f9ccd4b0fa664a4d7628c0b46043ab581e521bb8a62647012867c7d8c2c7" exitCode=0
Apr 16 16:47:51.478735 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:51.478504 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-130.ec2.internal" event={"ID":"6eb09bba6fec8382bf35d52e8c351917","Type":"ContainerDied","Data":"02a0f9ccd4b0fa664a4d7628c0b46043ab581e521bb8a62647012867c7d8c2c7"}
Apr 16 16:47:51.493041 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:51.492019 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-128-130.ec2.internal" podStartSLOduration=3.492004958 podStartE2EDuration="3.492004958s" podCreationTimestamp="2026-04-16 16:47:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:47:50.389632677 +0000 UTC m=+3.598404990" watchObservedRunningTime="2026-04-16 16:47:51.492004958 +0000 UTC m=+4.700777271"
Apr 16 16:47:52.491877 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:52.491056 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-130.ec2.internal" event={"ID":"6eb09bba6fec8382bf35d52e8c351917","Type":"ContainerStarted","Data":"c3492212350efd1b1505ef726eb516b5582ceb090c6c2e4960a7ce527d50dc3a"}
Apr 16 16:47:52.506155 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:52.506108 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-128-130.ec2.internal" podStartSLOduration=4.506091432 podStartE2EDuration="4.506091432s" podCreationTimestamp="2026-04-16 16:47:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:47:52.505552616 +0000 UTC m=+5.714324927" watchObservedRunningTime="2026-04-16 16:47:52.506091432 +0000 UTC m=+5.714863745"
Apr 16 16:47:52.881868 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:52.881657 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs\") pod \"network-metrics-daemon-gd4q4\" (UID: \"545dd230-1d90-4e1a-8615-072dd9b2d2f5\") " pod="openshift-multus/network-metrics-daemon-gd4q4"
Apr 16 16:47:52.881868 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:52.881821 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:47:52.882082 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:52.881900 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs podName:545dd230-1d90-4e1a-8615-072dd9b2d2f5 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:56.881878495 +0000 UTC m=+10.090650798 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs") pod "network-metrics-daemon-gd4q4" (UID: "545dd230-1d90-4e1a-8615-072dd9b2d2f5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:47:52.982554 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:52.982517 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sftq5\" (UniqueName: \"kubernetes.io/projected/02af3612-a84f-46c2-81c2-fe094b0b75f8-kube-api-access-sftq5\") pod \"network-check-target-bwvhw\" (UID: \"02af3612-a84f-46c2-81c2-fe094b0b75f8\") " pod="openshift-network-diagnostics/network-check-target-bwvhw"
Apr 16 16:47:52.982719 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:52.982683 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:47:52.982719 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:52.982704 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:47:52.982719 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:52.982717 2568 projected.go:194] Error preparing data for projected volume kube-api-access-sftq5 for pod openshift-network-diagnostics/network-check-target-bwvhw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:47:52.982884 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:52.982777 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02af3612-a84f-46c2-81c2-fe094b0b75f8-kube-api-access-sftq5 podName:02af3612-a84f-46c2-81c2-fe094b0b75f8 nodeName:}" failed. No retries permitted until 2026-04-16 16:47:56.982759284 +0000 UTC m=+10.191531578 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-sftq5" (UniqueName: "kubernetes.io/projected/02af3612-a84f-46c2-81c2-fe094b0b75f8-kube-api-access-sftq5") pod "network-check-target-bwvhw" (UID: "02af3612-a84f-46c2-81c2-fe094b0b75f8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:47:53.354695 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:53.353676 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gd4q4"
Apr 16 16:47:53.354695 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:53.353815 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gd4q4" podUID="545dd230-1d90-4e1a-8615-072dd9b2d2f5"
Apr 16 16:47:53.354695 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:53.354188 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bwvhw"
Apr 16 16:47:53.354695 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:53.354293 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bwvhw" podUID="02af3612-a84f-46c2-81c2-fe094b0b75f8"
Apr 16 16:47:55.352924 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:55.352893 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bwvhw"
Apr 16 16:47:55.353378 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:55.352937 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gd4q4"
Apr 16 16:47:55.353378 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:55.353059 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bwvhw" podUID="02af3612-a84f-46c2-81c2-fe094b0b75f8"
Apr 16 16:47:55.353378 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:55.353145 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gd4q4" podUID="545dd230-1d90-4e1a-8615-072dd9b2d2f5"
Apr 16 16:47:56.915105 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:56.915073 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs\") pod \"network-metrics-daemon-gd4q4\" (UID: \"545dd230-1d90-4e1a-8615-072dd9b2d2f5\") " pod="openshift-multus/network-metrics-daemon-gd4q4"
Apr 16 16:47:56.915563 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:56.915213 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:47:56.915563 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:56.915281 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs podName:545dd230-1d90-4e1a-8615-072dd9b2d2f5 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:04.915261345 +0000 UTC m=+18.124033637 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs") pod "network-metrics-daemon-gd4q4" (UID: "545dd230-1d90-4e1a-8615-072dd9b2d2f5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:47:57.016678 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:57.016096 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sftq5\" (UniqueName: \"kubernetes.io/projected/02af3612-a84f-46c2-81c2-fe094b0b75f8-kube-api-access-sftq5\") pod \"network-check-target-bwvhw\" (UID: \"02af3612-a84f-46c2-81c2-fe094b0b75f8\") " pod="openshift-network-diagnostics/network-check-target-bwvhw"
Apr 16 16:47:57.016678 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:57.016244 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:47:57.016678 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:57.016262 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:47:57.016678 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:57.016275 2568 projected.go:194] Error preparing data for projected volume kube-api-access-sftq5 for pod openshift-network-diagnostics/network-check-target-bwvhw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:47:57.016678 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:57.016329 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02af3612-a84f-46c2-81c2-fe094b0b75f8-kube-api-access-sftq5 podName:02af3612-a84f-46c2-81c2-fe094b0b75f8 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:05.016311973 +0000 UTC m=+18.225084265 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-sftq5" (UniqueName: "kubernetes.io/projected/02af3612-a84f-46c2-81c2-fe094b0b75f8-kube-api-access-sftq5") pod "network-check-target-bwvhw" (UID: "02af3612-a84f-46c2-81c2-fe094b0b75f8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:47:57.353832 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:57.353753 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bwvhw"
Apr 16 16:47:57.353984 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:57.353865 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bwvhw" podUID="02af3612-a84f-46c2-81c2-fe094b0b75f8"
Apr 16 16:47:57.353984 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:57.353917 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gd4q4"
Apr 16 16:47:57.354081 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:57.354040 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gd4q4" podUID="545dd230-1d90-4e1a-8615-072dd9b2d2f5"
Apr 16 16:47:59.353068 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:59.353032 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bwvhw"
Apr 16 16:47:59.353520 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:59.353144 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bwvhw" podUID="02af3612-a84f-46c2-81c2-fe094b0b75f8"
Apr 16 16:47:59.353520 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:47:59.353430 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gd4q4"
Apr 16 16:47:59.353520 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:47:59.353501 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gd4q4" podUID="545dd230-1d90-4e1a-8615-072dd9b2d2f5"
Apr 16 16:48:01.353870 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:01.353840 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bwvhw"
Apr 16 16:48:01.354338 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:01.353955 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bwvhw" podUID="02af3612-a84f-46c2-81c2-fe094b0b75f8"
Apr 16 16:48:01.354338 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:01.354018 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gd4q4"
Apr 16 16:48:01.354338 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:01.354154 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gd4q4" podUID="545dd230-1d90-4e1a-8615-072dd9b2d2f5"
Apr 16 16:48:03.353279 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:03.353247 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bwvhw"
Apr 16 16:48:03.353739 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:03.353363 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bwvhw" podUID="02af3612-a84f-46c2-81c2-fe094b0b75f8"
Apr 16 16:48:03.353739 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:03.353391 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gd4q4"
Apr 16 16:48:03.353739 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:03.353520 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gd4q4" podUID="545dd230-1d90-4e1a-8615-072dd9b2d2f5"
Apr 16 16:48:04.975073 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:04.975040 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs\") pod \"network-metrics-daemon-gd4q4\" (UID: \"545dd230-1d90-4e1a-8615-072dd9b2d2f5\") " pod="openshift-multus/network-metrics-daemon-gd4q4"
Apr 16 16:48:04.975491 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:04.975178 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:48:04.975491 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:04.975242 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs podName:545dd230-1d90-4e1a-8615-072dd9b2d2f5 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:20.975225879 +0000 UTC m=+34.183998171 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs") pod "network-metrics-daemon-gd4q4" (UID: "545dd230-1d90-4e1a-8615-072dd9b2d2f5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:48:05.075434 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:05.075404 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sftq5\" (UniqueName: \"kubernetes.io/projected/02af3612-a84f-46c2-81c2-fe094b0b75f8-kube-api-access-sftq5\") pod \"network-check-target-bwvhw\" (UID: \"02af3612-a84f-46c2-81c2-fe094b0b75f8\") " pod="openshift-network-diagnostics/network-check-target-bwvhw"
Apr 16 16:48:05.075611 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:05.075546 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:48:05.075611 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:05.075561 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:48:05.075611 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:05.075569 2568 projected.go:194] Error preparing data for projected volume kube-api-access-sftq5 for pod openshift-network-diagnostics/network-check-target-bwvhw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:48:05.075729 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:05.075631 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02af3612-a84f-46c2-81c2-fe094b0b75f8-kube-api-access-sftq5 podName:02af3612-a84f-46c2-81c2-fe094b0b75f8 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:21.075614173 +0000 UTC m=+34.284386462 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-sftq5" (UniqueName: "kubernetes.io/projected/02af3612-a84f-46c2-81c2-fe094b0b75f8-kube-api-access-sftq5") pod "network-check-target-bwvhw" (UID: "02af3612-a84f-46c2-81c2-fe094b0b75f8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:48:05.352942 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:05.352854 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bwvhw"
Apr 16 16:48:05.353090 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:05.352977 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bwvhw" podUID="02af3612-a84f-46c2-81c2-fe094b0b75f8"
Apr 16 16:48:05.353090 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:05.353017 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gd4q4"
Apr 16 16:48:05.353206 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:05.353098 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gd4q4" podUID="545dd230-1d90-4e1a-8615-072dd9b2d2f5"
Apr 16 16:48:07.353707 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:07.353402 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gd4q4"
Apr 16 16:48:07.354236 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:07.353451 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bwvhw"
Apr 16 16:48:07.354236 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:07.353789 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gd4q4" podUID="545dd230-1d90-4e1a-8615-072dd9b2d2f5"
Apr 16 16:48:07.354236 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:07.353904 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bwvhw" podUID="02af3612-a84f-46c2-81c2-fe094b0b75f8"
Apr 16 16:48:07.517914 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:07.517883 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-v87xm" event={"ID":"b60bf4ff-6c52-4d90-9a63-32a829bfc83e","Type":"ContainerStarted","Data":"c7f9bbb150676e499845e134e6cabf48ff195cfb2aaaa2b0e6e5a1d268761576"}
Apr 16 16:48:07.521155 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:07.520982 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" event={"ID":"2d7c70c5-3d64-495a-a048-265fbd988013","Type":"ContainerStarted","Data":"ad67c48c56a26920b3c9fc4a270056dd5d0bb4c7d412468cf99130d8ef40e988"}
Apr 16 16:48:07.521155 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:07.521014 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" event={"ID":"2d7c70c5-3d64-495a-a048-265fbd988013","Type":"ContainerStarted","Data":"a22babba5b5ff9f156b7ea49261a55a68dec881545a9d3b00cdab1a24a217ee4"}
Apr 16 16:48:07.521155 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:07.521029 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" event={"ID":"2d7c70c5-3d64-495a-a048-265fbd988013","Type":"ContainerStarted","Data":"a95d210ea95fcd5662bd5c7dbf2a452f17562f9e6b36f61a9aef90c6365324f3"}
Apr 16 16:48:07.521155 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:07.521044 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" event={"ID":"2d7c70c5-3d64-495a-a048-265fbd988013","Type":"ContainerStarted","Data":"82e8140e18fc4fe4b960936b8f4d5912ec7c1ecbcbe967f32032ece402dd55cf"}
Apr 16 16:48:07.521155 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:07.521068 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" event={"ID":"2d7c70c5-3d64-495a-a048-265fbd988013","Type":"ContainerStarted","Data":"c3f12d4f04627a0a727868932b23252a3ebf07759852fb78e0e75dc210ce78bd"}
Apr 16 16:48:07.521155 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:07.521081 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" event={"ID":"2d7c70c5-3d64-495a-a048-265fbd988013","Type":"ContainerStarted","Data":"641058a370e03df4ae235ad388a73f7fb94e1d8b5cad760decdb09a459cf5ec0"}
Apr 16 16:48:07.523147 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:07.523125 2568 generic.go:358] "Generic (PLEG): container finished" podID="da105279-c3bd-4e13-9bfc-0331c0b3ebd0" containerID="0394fbbbc0939a2d42f213af33ffd8b8d42c5176b38b67bc01b9e9689f84e579" exitCode=0
Apr 16 16:48:07.523269 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:07.523189 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjnws" event={"ID":"da105279-c3bd-4e13-9bfc-0331c0b3ebd0","Type":"ContainerDied","Data":"0394fbbbc0939a2d42f213af33ffd8b8d42c5176b38b67bc01b9e9689f84e579"}
Apr 16 16:48:07.525293 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:07.525259 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-fl64p" event={"ID":"7fcf2d50-decf-4050-b3bc-a82043f228fe","Type":"ContainerStarted","Data":"600ae4ad813580d3ba089f0e4ac768bdd8e2b6f76e34b1852a86374c2fe1b82c"}
Apr 16 16:48:07.527707 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:07.527683 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-srdnj" event={"ID":"42ba454f-7aed-4795-a3c0-cbc87b83def8","Type":"ContainerStarted","Data":"de049a6fb16d04b5f1a2a90f3c106a363830b94c334fdb4de9714e120d27cb5e"}
Apr 16 16:48:07.530449 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:07.530425 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-kznnf" event={"ID":"6f46c58f-ae64-4611-b6f5-37bccf98d4af","Type":"ContainerStarted","Data":"c2442e088a1166f422d0f6d9092da611be8685a744fd1cc3962cf474f58e917d"}
Apr 16 16:48:07.532063 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:07.532044 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96" event={"ID":"635f4e42-2ddd-46d3-8a86-47f36f350728","Type":"ContainerStarted","Data":"dce85c16e05a8f0473d1bbabb89071206f3ac8246757dece22e1a803fd8f64f9"}
Apr 16 16:48:07.533639 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:07.533618 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-8slrg" event={"ID":"5d59ae73-e194-4f66-9f72-a091634a4c01","Type":"ContainerStarted","Data":"4aeffa78a49226301f5a7f54be965d7c9434ca673083a8b53cee3e93748b7ef7"}
Apr 16 16:48:07.535140 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:07.535056 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-v87xm" podStartSLOduration=3.523045258 podStartE2EDuration="20.535045881s" podCreationTimestamp="2026-04-16 16:47:47 +0000 UTC" firstStartedPulling="2026-04-16 16:47:49.850672223 +0000 UTC m=+3.059444514" lastFinishedPulling="2026-04-16 16:48:06.862672838 +0000 UTC m=+20.071445137" observedRunningTime="2026-04-16 16:48:07.534842224 +0000 UTC m=+20.743614535" watchObservedRunningTime="2026-04-16 16:48:07.535045881 +0000 UTC m=+20.743818192"
Apr 16 16:48:07.552751 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:07.552709 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-kznnf" podStartSLOduration=3.240394873 podStartE2EDuration="20.552696471s" podCreationTimestamp="2026-04-16 16:47:47 +0000 UTC" firstStartedPulling="2026-04-16 16:47:49.851451721 +0000 UTC m=+3.060224009" lastFinishedPulling="2026-04-16 16:48:07.163753312 +0000 UTC m=+20.372525607" observedRunningTime="2026-04-16 16:48:07.552179968 +0000 UTC m=+20.760952281" watchObservedRunningTime="2026-04-16 16:48:07.552696471 +0000 UTC m=+20.761468783"
Apr 16 16:48:07.566503 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:07.566469 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-srdnj" podStartSLOduration=3.545734107 podStartE2EDuration="20.566457445s" podCreationTimestamp="2026-04-16 16:47:47 +0000 UTC" firstStartedPulling="2026-04-16 16:47:49.841984906 +0000 UTC m=+3.050757196" lastFinishedPulling="2026-04-16 16:48:06.862708246 +0000 UTC m=+20.071480534" observedRunningTime="2026-04-16 16:48:07.566211582 +0000 UTC m=+20.774983895" watchObservedRunningTime="2026-04-16 16:48:07.566457445 +0000 UTC m=+20.775229757"
Apr 16 16:48:07.589347 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:07.589298 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-fl64p" podStartSLOduration=3.572597022 podStartE2EDuration="20.589281292s" podCreationTimestamp="2026-04-16 16:47:47 +0000 UTC" firstStartedPulling="2026-04-16 16:47:49.845989301 +0000 UTC m=+3.054761602" lastFinishedPulling="2026-04-16 16:48:06.862673568 +0000 UTC m=+20.071445872" observedRunningTime="2026-04-16 16:48:07.588392342 +0000 UTC m=+20.797164653" watchObservedRunningTime="2026-04-16 16:48:07.589281292 +0000 UTC m=+20.798053605"
Apr 16 16:48:07.625452 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:07.625412 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-8slrg" podStartSLOduration=3.597119801 podStartE2EDuration="20.625400345s" podCreationTimestamp="2026-04-16 16:47:47 +0000 UTC" firstStartedPulling="2026-04-16 16:47:49.843152892 +0000 UTC m=+3.051925183" lastFinishedPulling="2026-04-16 16:48:06.871433423 +0000 UTC m=+20.080205727" observedRunningTime="2026-04-16 16:48:07.625043705 +0000 UTC m=+20.833816018" watchObservedRunningTime="2026-04-16 16:48:07.625400345
+0000 UTC m=+20.834172656" Apr 16 16:48:07.936369 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:07.936344 2568 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 16:48:08.305695 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:08.305582 2568 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T16:48:07.93636669Z","UUID":"a21169c9-42dc-48c8-ba2c-907c23a8cb5a","Handler":null,"Name":"","Endpoint":""} Apr 16 16:48:08.307179 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:08.307160 2568 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 16:48:08.307179 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:08.307186 2568 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 16:48:08.537186 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:08.537115 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-hsvxm" event={"ID":"bd61ad6d-349b-4f2f-ba4c-bbe9aaf1fb28","Type":"ContainerStarted","Data":"fff22f173ba7a9e4b36b8c72f139e57c6ea9871e9c22254815073e81908c3711"} Apr 16 16:48:08.539547 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:08.539517 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96" event={"ID":"635f4e42-2ddd-46d3-8a86-47f36f350728","Type":"ContainerStarted","Data":"b744199670407209d85522fd68c710f0c6894d0664e6625e33a3b3ce98fbbbff"} Apr 16 16:48:08.550806 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:08.550761 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-network-operator/iptables-alerter-hsvxm" podStartSLOduration=4.522758333 podStartE2EDuration="21.550745378s" podCreationTimestamp="2026-04-16 16:47:47 +0000 UTC" firstStartedPulling="2026-04-16 16:47:49.84949047 +0000 UTC m=+3.058262760" lastFinishedPulling="2026-04-16 16:48:06.877477509 +0000 UTC m=+20.086249805" observedRunningTime="2026-04-16 16:48:08.550671035 +0000 UTC m=+21.759443347" watchObservedRunningTime="2026-04-16 16:48:08.550745378 +0000 UTC m=+21.759517691" Apr 16 16:48:09.353870 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:09.353787 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gd4q4" Apr 16 16:48:09.354093 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:09.353838 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bwvhw" Apr 16 16:48:09.354093 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:09.353965 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gd4q4" podUID="545dd230-1d90-4e1a-8615-072dd9b2d2f5" Apr 16 16:48:09.354093 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:09.354068 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bwvhw" podUID="02af3612-a84f-46c2-81c2-fe094b0b75f8" Apr 16 16:48:09.543993 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:09.543906 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96" event={"ID":"635f4e42-2ddd-46d3-8a86-47f36f350728","Type":"ContainerStarted","Data":"5dcbe566dd56125a2803b24a9697647bd63dfac36ba0959cbd61895d73ee2797"} Apr 16 16:48:09.561695 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:09.561648 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-29c96" podStartSLOduration=3.646698079 podStartE2EDuration="22.561629516s" podCreationTimestamp="2026-04-16 16:47:47 +0000 UTC" firstStartedPulling="2026-04-16 16:47:49.85046015 +0000 UTC m=+3.059232446" lastFinishedPulling="2026-04-16 16:48:08.765391581 +0000 UTC m=+21.974163883" observedRunningTime="2026-04-16 16:48:09.56126494 +0000 UTC m=+22.770037252" watchObservedRunningTime="2026-04-16 16:48:09.561629516 +0000 UTC m=+22.770401828" Apr 16 16:48:10.549097 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:10.549058 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" event={"ID":"2d7c70c5-3d64-495a-a048-265fbd988013","Type":"ContainerStarted","Data":"6e061c4a4af4d20fca65b02ce0a5b6b45bc102a04c30e788c03ae9ff7ec91255"} Apr 16 16:48:11.074996 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:11.074959 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-srdnj" Apr 16 16:48:11.075708 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:11.075686 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-srdnj" Apr 16 16:48:11.353187 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:11.353106 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bwvhw" Apr 16 16:48:11.353341 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:11.353109 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gd4q4" Apr 16 16:48:11.353341 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:11.353228 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bwvhw" podUID="02af3612-a84f-46c2-81c2-fe094b0b75f8" Apr 16 16:48:11.353341 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:11.353329 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gd4q4" podUID="545dd230-1d90-4e1a-8615-072dd9b2d2f5" Apr 16 16:48:11.550568 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:11.550540 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-srdnj" Apr 16 16:48:11.551072 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:11.551056 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-srdnj" Apr 16 16:48:12.554700 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:12.554514 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" event={"ID":"2d7c70c5-3d64-495a-a048-265fbd988013","Type":"ContainerStarted","Data":"a6122e2a7899a361ea8bb15d40e2996783d66400e13230bf30a3265b6b65dce9"} Apr 16 16:48:12.555438 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:12.554830 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:48:12.555438 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:12.554874 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:48:12.556192 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:12.556167 2568 generic.go:358] "Generic (PLEG): container finished" podID="da105279-c3bd-4e13-9bfc-0331c0b3ebd0" containerID="3a2b99b67d549b52d83c3fb5d06d75fac2754d4b1e3b815aa0385c763575fa4f" exitCode=0 Apr 16 16:48:12.556311 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:12.556240 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjnws" event={"ID":"da105279-c3bd-4e13-9bfc-0331c0b3ebd0","Type":"ContainerDied","Data":"3a2b99b67d549b52d83c3fb5d06d75fac2754d4b1e3b815aa0385c763575fa4f"} Apr 16 16:48:12.568451 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:12.568428 2568 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:48:12.579684 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:12.579651 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" podStartSLOduration=8.32659257 podStartE2EDuration="25.579640126s" podCreationTimestamp="2026-04-16 16:47:47 +0000 UTC" firstStartedPulling="2026-04-16 16:47:49.850276678 +0000 UTC m=+3.059048979" lastFinishedPulling="2026-04-16 16:48:07.103324239 +0000 UTC m=+20.312096535" observedRunningTime="2026-04-16 16:48:12.579081361 +0000 UTC m=+25.787853671" watchObservedRunningTime="2026-04-16 16:48:12.579640126 +0000 UTC m=+25.788412464" Apr 16 16:48:13.353251 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:13.353083 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bwvhw" Apr 16 16:48:13.353389 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:13.353115 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gd4q4" Apr 16 16:48:13.353389 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:13.353352 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bwvhw" podUID="02af3612-a84f-46c2-81c2-fe094b0b75f8" Apr 16 16:48:13.353502 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:13.353467 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gd4q4" podUID="545dd230-1d90-4e1a-8615-072dd9b2d2f5" Apr 16 16:48:13.560325 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:13.560298 2568 generic.go:358] "Generic (PLEG): container finished" podID="da105279-c3bd-4e13-9bfc-0331c0b3ebd0" containerID="49c88b5bf8215a73c7875a397dfa9cf93a830d8086330fb5c681b39a26039f52" exitCode=0 Apr 16 16:48:13.560678 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:13.560388 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjnws" event={"ID":"da105279-c3bd-4e13-9bfc-0331c0b3ebd0","Type":"ContainerDied","Data":"49c88b5bf8215a73c7875a397dfa9cf93a830d8086330fb5c681b39a26039f52"} Apr 16 16:48:13.561274 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:13.561257 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:48:13.574965 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:13.574941 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs" Apr 16 16:48:13.614916 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:13.614890 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bwvhw"] Apr 16 16:48:13.615005 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:13.614982 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bwvhw" Apr 16 16:48:13.615099 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:13.615078 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bwvhw" podUID="02af3612-a84f-46c2-81c2-fe094b0b75f8" Apr 16 16:48:13.618016 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:13.617995 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gd4q4"] Apr 16 16:48:13.618091 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:13.618062 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gd4q4" Apr 16 16:48:13.618145 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:13.618127 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gd4q4" podUID="545dd230-1d90-4e1a-8615-072dd9b2d2f5" Apr 16 16:48:14.564099 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:14.564066 2568 generic.go:358] "Generic (PLEG): container finished" podID="da105279-c3bd-4e13-9bfc-0331c0b3ebd0" containerID="f395483f6813724d0171299a423e89a331e0443028bb9ea98d6e0401895b2e17" exitCode=0 Apr 16 16:48:14.564467 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:14.564152 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjnws" event={"ID":"da105279-c3bd-4e13-9bfc-0331c0b3ebd0","Type":"ContainerDied","Data":"f395483f6813724d0171299a423e89a331e0443028bb9ea98d6e0401895b2e17"} Apr 16 16:48:15.353198 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:15.353165 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gd4q4" Apr 16 16:48:15.353198 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:15.353185 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bwvhw" Apr 16 16:48:15.353375 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:15.353282 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gd4q4" podUID="545dd230-1d90-4e1a-8615-072dd9b2d2f5" Apr 16 16:48:15.353434 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:15.353392 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bwvhw" podUID="02af3612-a84f-46c2-81c2-fe094b0b75f8" Apr 16 16:48:17.354109 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:17.354078 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bwvhw" Apr 16 16:48:17.354550 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:17.354178 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bwvhw" podUID="02af3612-a84f-46c2-81c2-fe094b0b75f8" Apr 16 16:48:17.354550 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:17.354237 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-gd4q4" Apr 16 16:48:17.354550 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:17.354355 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-gd4q4" podUID="545dd230-1d90-4e1a-8615-072dd9b2d2f5" Apr 16 16:48:19.353636 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.353588 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bwvhw" Apr 16 16:48:19.354070 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:19.353716 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bwvhw" podUID="02af3612-a84f-46c2-81c2-fe094b0b75f8" Apr 16 16:48:19.356957 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.354343 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gd4q4" Apr 16 16:48:19.356957 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:19.354538 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-gd4q4" podUID="545dd230-1d90-4e1a-8615-072dd9b2d2f5" Apr 16 16:48:19.592018 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.591851 2568 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-128-130.ec2.internal" event="NodeReady" Apr 16 16:48:19.592138 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.592125 2568 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 16:48:19.632311 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.632235 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-p4bb2"] Apr 16 16:48:19.660589 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.660557 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-dcxhn"] Apr 16 16:48:19.660752 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.660721 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p4bb2" Apr 16 16:48:19.663372 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.663338 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 16:48:19.663372 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.663363 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-z7whv\"" Apr 16 16:48:19.663561 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.663362 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 16:48:19.672715 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.672695 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p4bb2"] Apr 16 16:48:19.672837 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.672721 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dcxhn"] 
Apr 16 16:48:19.672837 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.672823 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dcxhn" Apr 16 16:48:19.675229 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.675193 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 16:48:19.675541 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.675205 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 16:48:19.675642 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.675267 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 16:48:19.675642 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.675275 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-s9vpx\"" Apr 16 16:48:19.790766 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.790738 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js4kd\" (UniqueName: \"kubernetes.io/projected/3e2cc599-5a01-4b24-a80c-87b34418e1b6-kube-api-access-js4kd\") pod \"dns-default-p4bb2\" (UID: \"3e2cc599-5a01-4b24-a80c-87b34418e1b6\") " pod="openshift-dns/dns-default-p4bb2" Apr 16 16:48:19.790913 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.790774 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e2cc599-5a01-4b24-a80c-87b34418e1b6-config-volume\") pod \"dns-default-p4bb2\" (UID: \"3e2cc599-5a01-4b24-a80c-87b34418e1b6\") " pod="openshift-dns/dns-default-p4bb2" Apr 16 16:48:19.790913 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.790808 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-cert\") pod \"ingress-canary-dcxhn\" (UID: \"2a9408e4-8f5b-4f8e-b756-2d1f084e06a8\") " pod="openshift-ingress-canary/ingress-canary-dcxhn" Apr 16 16:48:19.790913 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.790836 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3e2cc599-5a01-4b24-a80c-87b34418e1b6-metrics-tls\") pod \"dns-default-p4bb2\" (UID: \"3e2cc599-5a01-4b24-a80c-87b34418e1b6\") " pod="openshift-dns/dns-default-p4bb2" Apr 16 16:48:19.790913 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.790879 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2w78\" (UniqueName: \"kubernetes.io/projected/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-kube-api-access-l2w78\") pod \"ingress-canary-dcxhn\" (UID: \"2a9408e4-8f5b-4f8e-b756-2d1f084e06a8\") " pod="openshift-ingress-canary/ingress-canary-dcxhn" Apr 16 16:48:19.791078 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.790926 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3e2cc599-5a01-4b24-a80c-87b34418e1b6-tmp-dir\") pod \"dns-default-p4bb2\" (UID: \"3e2cc599-5a01-4b24-a80c-87b34418e1b6\") " pod="openshift-dns/dns-default-p4bb2" Apr 16 16:48:19.891469 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.891396 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-cert\") pod \"ingress-canary-dcxhn\" (UID: \"2a9408e4-8f5b-4f8e-b756-2d1f084e06a8\") " pod="openshift-ingress-canary/ingress-canary-dcxhn" Apr 16 16:48:19.891630 ip-10-0-128-130 kubenswrapper[2568]: 
I0416 16:48:19.891503 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3e2cc599-5a01-4b24-a80c-87b34418e1b6-metrics-tls\") pod \"dns-default-p4bb2\" (UID: \"3e2cc599-5a01-4b24-a80c-87b34418e1b6\") " pod="openshift-dns/dns-default-p4bb2" Apr 16 16:48:19.891630 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:19.891513 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 16:48:19.891630 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.891545 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l2w78\" (UniqueName: \"kubernetes.io/projected/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-kube-api-access-l2w78\") pod \"ingress-canary-dcxhn\" (UID: \"2a9408e4-8f5b-4f8e-b756-2d1f084e06a8\") " pod="openshift-ingress-canary/ingress-canary-dcxhn" Apr 16 16:48:19.891630 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:19.891590 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-cert podName:2a9408e4-8f5b-4f8e-b756-2d1f084e06a8 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:20.391570692 +0000 UTC m=+33.600342984 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-cert") pod "ingress-canary-dcxhn" (UID: "2a9408e4-8f5b-4f8e-b756-2d1f084e06a8") : secret "canary-serving-cert" not found Apr 16 16:48:19.891853 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.891650 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3e2cc599-5a01-4b24-a80c-87b34418e1b6-tmp-dir\") pod \"dns-default-p4bb2\" (UID: \"3e2cc599-5a01-4b24-a80c-87b34418e1b6\") " pod="openshift-dns/dns-default-p4bb2" Apr 16 16:48:19.891853 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:19.891660 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 16:48:19.891853 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.891703 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-js4kd\" (UniqueName: \"kubernetes.io/projected/3e2cc599-5a01-4b24-a80c-87b34418e1b6-kube-api-access-js4kd\") pod \"dns-default-p4bb2\" (UID: \"3e2cc599-5a01-4b24-a80c-87b34418e1b6\") " pod="openshift-dns/dns-default-p4bb2" Apr 16 16:48:19.891853 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:19.891725 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e2cc599-5a01-4b24-a80c-87b34418e1b6-metrics-tls podName:3e2cc599-5a01-4b24-a80c-87b34418e1b6 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:20.391709512 +0000 UTC m=+33.600481800 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3e2cc599-5a01-4b24-a80c-87b34418e1b6-metrics-tls") pod "dns-default-p4bb2" (UID: "3e2cc599-5a01-4b24-a80c-87b34418e1b6") : secret "dns-default-metrics-tls" not found
Apr 16 16:48:19.891853 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.891762 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e2cc599-5a01-4b24-a80c-87b34418e1b6-config-volume\") pod \"dns-default-p4bb2\" (UID: \"3e2cc599-5a01-4b24-a80c-87b34418e1b6\") " pod="openshift-dns/dns-default-p4bb2"
Apr 16 16:48:19.896864 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.896841 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3e2cc599-5a01-4b24-a80c-87b34418e1b6-tmp-dir\") pod \"dns-default-p4bb2\" (UID: \"3e2cc599-5a01-4b24-a80c-87b34418e1b6\") " pod="openshift-dns/dns-default-p4bb2"
Apr 16 16:48:19.897020 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.897004 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e2cc599-5a01-4b24-a80c-87b34418e1b6-config-volume\") pod \"dns-default-p4bb2\" (UID: \"3e2cc599-5a01-4b24-a80c-87b34418e1b6\") " pod="openshift-dns/dns-default-p4bb2"
Apr 16 16:48:19.901874 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.901849 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-js4kd\" (UniqueName: \"kubernetes.io/projected/3e2cc599-5a01-4b24-a80c-87b34418e1b6-kube-api-access-js4kd\") pod \"dns-default-p4bb2\" (UID: \"3e2cc599-5a01-4b24-a80c-87b34418e1b6\") " pod="openshift-dns/dns-default-p4bb2"
Apr 16 16:48:19.902009 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:19.901930 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2w78\" (UniqueName: \"kubernetes.io/projected/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-kube-api-access-l2w78\") pod \"ingress-canary-dcxhn\" (UID: \"2a9408e4-8f5b-4f8e-b756-2d1f084e06a8\") " pod="openshift-ingress-canary/ingress-canary-dcxhn"
Apr 16 16:48:20.394405 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:20.394375 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-cert\") pod \"ingress-canary-dcxhn\" (UID: \"2a9408e4-8f5b-4f8e-b756-2d1f084e06a8\") " pod="openshift-ingress-canary/ingress-canary-dcxhn"
Apr 16 16:48:20.394854 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:20.394423 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3e2cc599-5a01-4b24-a80c-87b34418e1b6-metrics-tls\") pod \"dns-default-p4bb2\" (UID: \"3e2cc599-5a01-4b24-a80c-87b34418e1b6\") " pod="openshift-dns/dns-default-p4bb2"
Apr 16 16:48:20.394854 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:20.394505 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:48:20.394854 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:20.394514 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:48:20.394854 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:20.394565 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-cert podName:2a9408e4-8f5b-4f8e-b756-2d1f084e06a8 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:21.394551446 +0000 UTC m=+34.603323735 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-cert") pod "ingress-canary-dcxhn" (UID: "2a9408e4-8f5b-4f8e-b756-2d1f084e06a8") : secret "canary-serving-cert" not found
Apr 16 16:48:20.394854 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:20.394643 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e2cc599-5a01-4b24-a80c-87b34418e1b6-metrics-tls podName:3e2cc599-5a01-4b24-a80c-87b34418e1b6 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:21.394621044 +0000 UTC m=+34.603393346 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3e2cc599-5a01-4b24-a80c-87b34418e1b6-metrics-tls") pod "dns-default-p4bb2" (UID: "3e2cc599-5a01-4b24-a80c-87b34418e1b6") : secret "dns-default-metrics-tls" not found
Apr 16 16:48:20.578142 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:20.578113 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjnws" event={"ID":"da105279-c3bd-4e13-9bfc-0331c0b3ebd0","Type":"ContainerStarted","Data":"d35fe272d95503dddcccc02ec507311cc9be0b9e5180af664ce7276885880881"}
Apr 16 16:48:20.997147 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:20.997119 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs\") pod \"network-metrics-daemon-gd4q4\" (UID: \"545dd230-1d90-4e1a-8615-072dd9b2d2f5\") " pod="openshift-multus/network-metrics-daemon-gd4q4"
Apr 16 16:48:20.997276 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:20.997229 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:48:20.997276 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:20.997275 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs podName:545dd230-1d90-4e1a-8615-072dd9b2d2f5 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:52.997263457 +0000 UTC m=+66.206035745 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs") pod "network-metrics-daemon-gd4q4" (UID: "545dd230-1d90-4e1a-8615-072dd9b2d2f5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:48:21.098298 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:21.098273 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sftq5\" (UniqueName: \"kubernetes.io/projected/02af3612-a84f-46c2-81c2-fe094b0b75f8-kube-api-access-sftq5\") pod \"network-check-target-bwvhw\" (UID: \"02af3612-a84f-46c2-81c2-fe094b0b75f8\") " pod="openshift-network-diagnostics/network-check-target-bwvhw"
Apr 16 16:48:21.098393 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:21.098386 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:48:21.098430 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:21.098401 2568 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:48:21.098430 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:21.098409 2568 projected.go:194] Error preparing data for projected volume kube-api-access-sftq5 for pod openshift-network-diagnostics/network-check-target-bwvhw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:48:21.098493 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:21.098446 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/02af3612-a84f-46c2-81c2-fe094b0b75f8-kube-api-access-sftq5 podName:02af3612-a84f-46c2-81c2-fe094b0b75f8 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:53.098435373 +0000 UTC m=+66.307207662 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "kube-api-access-sftq5" (UniqueName: "kubernetes.io/projected/02af3612-a84f-46c2-81c2-fe094b0b75f8-kube-api-access-sftq5") pod "network-check-target-bwvhw" (UID: "02af3612-a84f-46c2-81c2-fe094b0b75f8") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:48:21.353504 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:21.353437 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bwvhw"
Apr 16 16:48:21.353656 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:21.353437 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gd4q4"
Apr 16 16:48:21.357206 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:21.357187 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 16:48:21.357329 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:21.357183 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 16:48:21.357329 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:21.357247 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-km8f8\""
Apr 16 16:48:21.357329 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:21.357214 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 16:48:21.358365 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:21.358347 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-fz45n\""
Apr 16 16:48:21.400131 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:21.400112 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-cert\") pod \"ingress-canary-dcxhn\" (UID: \"2a9408e4-8f5b-4f8e-b756-2d1f084e06a8\") " pod="openshift-ingress-canary/ingress-canary-dcxhn"
Apr 16 16:48:21.400377 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:21.400139 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3e2cc599-5a01-4b24-a80c-87b34418e1b6-metrics-tls\") pod \"dns-default-p4bb2\" (UID: \"3e2cc599-5a01-4b24-a80c-87b34418e1b6\") " pod="openshift-dns/dns-default-p4bb2"
Apr 16 16:48:21.400377 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:21.400228 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:48:21.400377 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:21.400237 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:48:21.400377 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:21.400269 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e2cc599-5a01-4b24-a80c-87b34418e1b6-metrics-tls podName:3e2cc599-5a01-4b24-a80c-87b34418e1b6 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:23.400258034 +0000 UTC m=+36.609030322 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3e2cc599-5a01-4b24-a80c-87b34418e1b6-metrics-tls") pod "dns-default-p4bb2" (UID: "3e2cc599-5a01-4b24-a80c-87b34418e1b6") : secret "dns-default-metrics-tls" not found
Apr 16 16:48:21.400377 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:21.400281 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-cert podName:2a9408e4-8f5b-4f8e-b756-2d1f084e06a8 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:23.400275728 +0000 UTC m=+36.609048017 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-cert") pod "ingress-canary-dcxhn" (UID: "2a9408e4-8f5b-4f8e-b756-2d1f084e06a8") : secret "canary-serving-cert" not found
Apr 16 16:48:21.581591 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:21.581568 2568 generic.go:358] "Generic (PLEG): container finished" podID="da105279-c3bd-4e13-9bfc-0331c0b3ebd0" containerID="d35fe272d95503dddcccc02ec507311cc9be0b9e5180af664ce7276885880881" exitCode=0
Apr 16 16:48:21.581673 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:21.581637 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjnws" event={"ID":"da105279-c3bd-4e13-9bfc-0331c0b3ebd0","Type":"ContainerDied","Data":"d35fe272d95503dddcccc02ec507311cc9be0b9e5180af664ce7276885880881"}
Apr 16 16:48:22.585870 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:22.585841 2568 generic.go:358] "Generic (PLEG): container finished" podID="da105279-c3bd-4e13-9bfc-0331c0b3ebd0" containerID="3005c88c0db5318d5046e0ca2d71c5960d9309ec2bffdb5238ce5c668bd98b64" exitCode=0
Apr 16 16:48:22.586230 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:22.585886 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjnws" event={"ID":"da105279-c3bd-4e13-9bfc-0331c0b3ebd0","Type":"ContainerDied","Data":"3005c88c0db5318d5046e0ca2d71c5960d9309ec2bffdb5238ce5c668bd98b64"}
Apr 16 16:48:23.415315 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:23.415241 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-cert\") pod \"ingress-canary-dcxhn\" (UID: \"2a9408e4-8f5b-4f8e-b756-2d1f084e06a8\") " pod="openshift-ingress-canary/ingress-canary-dcxhn"
Apr 16 16:48:23.415315 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:23.415275 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3e2cc599-5a01-4b24-a80c-87b34418e1b6-metrics-tls\") pod \"dns-default-p4bb2\" (UID: \"3e2cc599-5a01-4b24-a80c-87b34418e1b6\") " pod="openshift-dns/dns-default-p4bb2"
Apr 16 16:48:23.415482 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:23.415370 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:48:23.415482 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:23.415373 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:48:23.415482 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:23.415418 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e2cc599-5a01-4b24-a80c-87b34418e1b6-metrics-tls podName:3e2cc599-5a01-4b24-a80c-87b34418e1b6 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:27.415405433 +0000 UTC m=+40.624177722 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3e2cc599-5a01-4b24-a80c-87b34418e1b6-metrics-tls") pod "dns-default-p4bb2" (UID: "3e2cc599-5a01-4b24-a80c-87b34418e1b6") : secret "dns-default-metrics-tls" not found
Apr 16 16:48:23.415482 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:23.415433 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-cert podName:2a9408e4-8f5b-4f8e-b756-2d1f084e06a8 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:27.415425893 +0000 UTC m=+40.624198189 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-cert") pod "ingress-canary-dcxhn" (UID: "2a9408e4-8f5b-4f8e-b756-2d1f084e06a8") : secret "canary-serving-cert" not found
Apr 16 16:48:23.590142 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:23.590108 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-hjnws" event={"ID":"da105279-c3bd-4e13-9bfc-0331c0b3ebd0","Type":"ContainerStarted","Data":"f017ae35ac0b264ee290231bad5a65810108526bd0dfbd3406171358143194c7"}
Apr 16 16:48:23.613317 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:23.613262 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-hjnws" podStartSLOduration=6.14862968 podStartE2EDuration="36.613227156s" podCreationTimestamp="2026-04-16 16:47:47 +0000 UTC" firstStartedPulling="2026-04-16 16:47:49.846922894 +0000 UTC m=+3.055695184" lastFinishedPulling="2026-04-16 16:48:20.311520367 +0000 UTC m=+33.520292660" observedRunningTime="2026-04-16 16:48:23.612559445 +0000 UTC m=+36.821331757" watchObservedRunningTime="2026-04-16 16:48:23.613227156 +0000 UTC m=+36.821999468"
Apr 16 16:48:27.441222 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:27.441188 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-cert\") pod \"ingress-canary-dcxhn\" (UID: \"2a9408e4-8f5b-4f8e-b756-2d1f084e06a8\") " pod="openshift-ingress-canary/ingress-canary-dcxhn"
Apr 16 16:48:27.441222 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:27.441223 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3e2cc599-5a01-4b24-a80c-87b34418e1b6-metrics-tls\") pod \"dns-default-p4bb2\" (UID: \"3e2cc599-5a01-4b24-a80c-87b34418e1b6\") " pod="openshift-dns/dns-default-p4bb2"
Apr 16 16:48:27.441639 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:27.441316 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:48:27.441639 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:27.441367 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-cert podName:2a9408e4-8f5b-4f8e-b756-2d1f084e06a8 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:35.441353384 +0000 UTC m=+48.650125678 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-cert") pod "ingress-canary-dcxhn" (UID: "2a9408e4-8f5b-4f8e-b756-2d1f084e06a8") : secret "canary-serving-cert" not found
Apr 16 16:48:27.441639 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:27.441322 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:48:27.441639 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:27.441462 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e2cc599-5a01-4b24-a80c-87b34418e1b6-metrics-tls podName:3e2cc599-5a01-4b24-a80c-87b34418e1b6 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:35.441450541 +0000 UTC m=+48.650222830 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3e2cc599-5a01-4b24-a80c-87b34418e1b6-metrics-tls") pod "dns-default-p4bb2" (UID: "3e2cc599-5a01-4b24-a80c-87b34418e1b6") : secret "dns-default-metrics-tls" not found
Apr 16 16:48:35.491694 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:35.491662 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-cert\") pod \"ingress-canary-dcxhn\" (UID: \"2a9408e4-8f5b-4f8e-b756-2d1f084e06a8\") " pod="openshift-ingress-canary/ingress-canary-dcxhn"
Apr 16 16:48:35.491694 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:35.491700 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3e2cc599-5a01-4b24-a80c-87b34418e1b6-metrics-tls\") pod \"dns-default-p4bb2\" (UID: \"3e2cc599-5a01-4b24-a80c-87b34418e1b6\") " pod="openshift-dns/dns-default-p4bb2"
Apr 16 16:48:35.492105 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:35.491810 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:48:35.492105 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:35.491819 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:48:35.492105 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:35.491870 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-cert podName:2a9408e4-8f5b-4f8e-b756-2d1f084e06a8 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:51.491854772 +0000 UTC m=+64.700627061 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-cert") pod "ingress-canary-dcxhn" (UID: "2a9408e4-8f5b-4f8e-b756-2d1f084e06a8") : secret "canary-serving-cert" not found
Apr 16 16:48:35.492105 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:35.491883 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e2cc599-5a01-4b24-a80c-87b34418e1b6-metrics-tls podName:3e2cc599-5a01-4b24-a80c-87b34418e1b6 nodeName:}" failed. No retries permitted until 2026-04-16 16:48:51.491877622 +0000 UTC m=+64.700649911 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3e2cc599-5a01-4b24-a80c-87b34418e1b6-metrics-tls") pod "dns-default-p4bb2" (UID: "3e2cc599-5a01-4b24-a80c-87b34418e1b6") : secret "dns-default-metrics-tls" not found
Apr 16 16:48:45.576337 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:45.576308 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5pfjs"
Apr 16 16:48:51.591991 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:51.591954 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-cert\") pod \"ingress-canary-dcxhn\" (UID: \"2a9408e4-8f5b-4f8e-b756-2d1f084e06a8\") " pod="openshift-ingress-canary/ingress-canary-dcxhn"
Apr 16 16:48:51.591991 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:51.591993 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3e2cc599-5a01-4b24-a80c-87b34418e1b6-metrics-tls\") pod \"dns-default-p4bb2\" (UID: \"3e2cc599-5a01-4b24-a80c-87b34418e1b6\") " pod="openshift-dns/dns-default-p4bb2"
Apr 16 16:48:51.592482 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:51.592084 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:48:51.592482 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:51.592101 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:48:51.592482 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:51.592141 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-cert podName:2a9408e4-8f5b-4f8e-b756-2d1f084e06a8 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:23.592126041 +0000 UTC m=+96.800898330 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-cert") pod "ingress-canary-dcxhn" (UID: "2a9408e4-8f5b-4f8e-b756-2d1f084e06a8") : secret "canary-serving-cert" not found
Apr 16 16:48:51.592482 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:51.592155 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e2cc599-5a01-4b24-a80c-87b34418e1b6-metrics-tls podName:3e2cc599-5a01-4b24-a80c-87b34418e1b6 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:23.592148637 +0000 UTC m=+96.800920925 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3e2cc599-5a01-4b24-a80c-87b34418e1b6-metrics-tls") pod "dns-default-p4bb2" (UID: "3e2cc599-5a01-4b24-a80c-87b34418e1b6") : secret "dns-default-metrics-tls" not found
Apr 16 16:48:53.001638 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:53.001585 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs\") pod \"network-metrics-daemon-gd4q4\" (UID: \"545dd230-1d90-4e1a-8615-072dd9b2d2f5\") " pod="openshift-multus/network-metrics-daemon-gd4q4"
Apr 16 16:48:53.004258 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:53.004242 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 16:48:53.011972 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:53.011957 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 16:48:53.012037 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:48:53.012009 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs podName:545dd230-1d90-4e1a-8615-072dd9b2d2f5 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:57.01199437 +0000 UTC m=+130.220766659 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs") pod "network-metrics-daemon-gd4q4" (UID: "545dd230-1d90-4e1a-8615-072dd9b2d2f5") : secret "metrics-daemon-secret" not found
Apr 16 16:48:53.102200 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:53.102164 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sftq5\" (UniqueName: \"kubernetes.io/projected/02af3612-a84f-46c2-81c2-fe094b0b75f8-kube-api-access-sftq5\") pod \"network-check-target-bwvhw\" (UID: \"02af3612-a84f-46c2-81c2-fe094b0b75f8\") " pod="openshift-network-diagnostics/network-check-target-bwvhw"
Apr 16 16:48:53.104730 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:53.104714 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 16:48:53.114825 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:53.114804 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 16:48:53.126170 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:53.126146 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sftq5\" (UniqueName: \"kubernetes.io/projected/02af3612-a84f-46c2-81c2-fe094b0b75f8-kube-api-access-sftq5\") pod \"network-check-target-bwvhw\" (UID: \"02af3612-a84f-46c2-81c2-fe094b0b75f8\") " pod="openshift-network-diagnostics/network-check-target-bwvhw"
Apr 16 16:48:53.166537 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:53.166513 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-km8f8\""
Apr 16 16:48:53.174667 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:53.174649 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bwvhw"
Apr 16 16:48:53.292140 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:53.292068 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bwvhw"]
Apr 16 16:48:53.295680 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:48:53.295652 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02af3612_a84f_46c2_81c2_fe094b0b75f8.slice/crio-a792b331ce1678ce77036a1b9333f49f91fcf835463805f8ca209ed050c5ab0e WatchSource:0}: Error finding container a792b331ce1678ce77036a1b9333f49f91fcf835463805f8ca209ed050c5ab0e: Status 404 returned error can't find the container with id a792b331ce1678ce77036a1b9333f49f91fcf835463805f8ca209ed050c5ab0e
Apr 16 16:48:53.640257 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:53.640176 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bwvhw" event={"ID":"02af3612-a84f-46c2-81c2-fe094b0b75f8","Type":"ContainerStarted","Data":"a792b331ce1678ce77036a1b9333f49f91fcf835463805f8ca209ed050c5ab0e"}
Apr 16 16:48:56.647200 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:56.647166 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bwvhw" event={"ID":"02af3612-a84f-46c2-81c2-fe094b0b75f8","Type":"ContainerStarted","Data":"b7d596bc68094e32b8a120a081cc0168455460e58472a691d7b73dfaf68cf088"}
Apr 16 16:48:56.647549 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:56.647318 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-bwvhw"
Apr 16 16:48:56.661407 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:48:56.661367 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-bwvhw" podStartSLOduration=66.995249341 podStartE2EDuration="1m9.661357841s" podCreationTimestamp="2026-04-16 16:47:47 +0000 UTC" firstStartedPulling="2026-04-16 16:48:53.297872921 +0000 UTC m=+66.506645210" lastFinishedPulling="2026-04-16 16:48:55.963981416 +0000 UTC m=+69.172753710" observedRunningTime="2026-04-16 16:48:56.6604484 +0000 UTC m=+69.869220708" watchObservedRunningTime="2026-04-16 16:48:56.661357841 +0000 UTC m=+69.870130152"
Apr 16 16:49:23.593511 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:23.593481 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-cert\") pod \"ingress-canary-dcxhn\" (UID: \"2a9408e4-8f5b-4f8e-b756-2d1f084e06a8\") " pod="openshift-ingress-canary/ingress-canary-dcxhn"
Apr 16 16:49:23.593894 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:23.593516 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3e2cc599-5a01-4b24-a80c-87b34418e1b6-metrics-tls\") pod \"dns-default-p4bb2\" (UID: \"3e2cc599-5a01-4b24-a80c-87b34418e1b6\") " pod="openshift-dns/dns-default-p4bb2"
Apr 16 16:49:23.593894 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:49:23.593639 2568 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 16:49:23.593894 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:49:23.593647 2568 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 16:49:23.593894 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:49:23.593693 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e2cc599-5a01-4b24-a80c-87b34418e1b6-metrics-tls podName:3e2cc599-5a01-4b24-a80c-87b34418e1b6 nodeName:}" failed. No retries permitted until 2026-04-16 16:50:27.593676961 +0000 UTC m=+160.802449249 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3e2cc599-5a01-4b24-a80c-87b34418e1b6-metrics-tls") pod "dns-default-p4bb2" (UID: "3e2cc599-5a01-4b24-a80c-87b34418e1b6") : secret "dns-default-metrics-tls" not found
Apr 16 16:49:23.593894 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:49:23.593716 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-cert podName:2a9408e4-8f5b-4f8e-b756-2d1f084e06a8 nodeName:}" failed. No retries permitted until 2026-04-16 16:50:27.59370239 +0000 UTC m=+160.802474679 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-cert") pod "ingress-canary-dcxhn" (UID: "2a9408e4-8f5b-4f8e-b756-2d1f084e06a8") : secret "canary-serving-cert" not found
Apr 16 16:49:27.650679 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:27.650643 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-bwvhw"
Apr 16 16:49:57.100731 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:57.100685 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs\") pod \"network-metrics-daemon-gd4q4\" (UID: \"545dd230-1d90-4e1a-8615-072dd9b2d2f5\") " pod="openshift-multus/network-metrics-daemon-gd4q4"
Apr 16 16:49:57.101200 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:49:57.100825 2568 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 16:49:57.101200 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:49:57.100889 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs podName:545dd230-1d90-4e1a-8615-072dd9b2d2f5 nodeName:}" failed. No retries permitted until 2026-04-16 16:51:59.100874686 +0000 UTC m=+252.309646974 (durationBeforeRetry 2m2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs") pod "network-metrics-daemon-gd4q4" (UID: "545dd230-1d90-4e1a-8615-072dd9b2d2f5") : secret "metrics-daemon-secret" not found
Apr 16 16:49:57.863126 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:57.863094 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-bj9hp"]
Apr 16 16:49:57.864829 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:57.864814 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-bj9hp"
Apr 16 16:49:57.867233 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:57.867215 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:49:57.868159 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:57.868132 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-2nst6\""
Apr 16 16:49:57.868258 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:57.868183 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 16 16:49:57.875199 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:57.875176 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-bj9hp"]
Apr 16 16:49:57.905813 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:57.905787 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxndf\" (UniqueName: \"kubernetes.io/projected/2d432fd0-ec22-4d57-ac82-a36eb0170cb7-kube-api-access-jxndf\") pod \"volume-data-source-validator-7d955d5dd4-bj9hp\" (UID: \"2d432fd0-ec22-4d57-ac82-a36eb0170cb7\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-bj9hp"
Apr 16 16:49:57.971160 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:57.971132 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-d5cmb"]
Apr 16 16:49:57.972884 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:57.972870 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb"
Apr 16 16:49:57.975138 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:57.975111 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-65d8b4d484-2r794"]
Apr 16 16:49:57.975516 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:57.975490 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 16 16:49:57.975621 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:57.975489 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 16 16:49:57.975844 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:57.975828 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 16 16:49:57.975960 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:57.975946 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-m594n\""
Apr 16 16:49:57.976796 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:57.976780 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-65d8b4d484-2r794"
Apr 16 16:49:57.979532 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:57.979512 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 16 16:49:57.979644 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:57.979569 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 16 16:49:57.980344 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:57.979848 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-nznpf\""
Apr 16 16:49:57.980344 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:57.979912 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 16:49:57.980344 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:57.980205 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 16 16:49:57.980344 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:57.980312 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 16 16:49:57.980555 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:57.980349 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 16 16:49:57.980693 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:57.980670 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 16:49:57.983750 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:57.983734 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap"
reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 16 16:49:57.986107 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:57.986088 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-d5cmb"] Apr 16 16:49:58.004630 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.004584 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-65d8b4d484-2r794"] Apr 16 16:49:58.006083 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.006061 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jxndf\" (UniqueName: \"kubernetes.io/projected/2d432fd0-ec22-4d57-ac82-a36eb0170cb7-kube-api-access-jxndf\") pod \"volume-data-source-validator-7d955d5dd4-bj9hp\" (UID: \"2d432fd0-ec22-4d57-ac82-a36eb0170cb7\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-bj9hp" Apr 16 16:49:58.006176 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.006095 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-service-ca-bundle\") pod \"router-default-65d8b4d484-2r794\" (UID: \"41054f43-1b5b-46d6-9aab-24d8b6ff5a23\") " pod="openshift-ingress/router-default-65d8b4d484-2r794" Apr 16 16:49:58.006176 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.006116 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-stats-auth\") pod \"router-default-65d8b4d484-2r794\" (UID: \"41054f43-1b5b-46d6-9aab-24d8b6ff5a23\") " pod="openshift-ingress/router-default-65d8b4d484-2r794" Apr 16 16:49:58.006176 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.006135 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/9738e546-8b4d-4c0f-952e-9a361c4b5f7a-config\") pod \"console-operator-d87b8d5fc-d5cmb\" (UID: \"9738e546-8b4d-4c0f-952e-9a361c4b5f7a\") " pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" Apr 16 16:49:58.006176 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.006158 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9738e546-8b4d-4c0f-952e-9a361c4b5f7a-trusted-ca\") pod \"console-operator-d87b8d5fc-d5cmb\" (UID: \"9738e546-8b4d-4c0f-952e-9a361c4b5f7a\") " pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" Apr 16 16:49:58.006367 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.006185 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9s8n\" (UniqueName: \"kubernetes.io/projected/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-kube-api-access-n9s8n\") pod \"router-default-65d8b4d484-2r794\" (UID: \"41054f43-1b5b-46d6-9aab-24d8b6ff5a23\") " pod="openshift-ingress/router-default-65d8b4d484-2r794" Apr 16 16:49:58.006367 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.006329 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9738e546-8b4d-4c0f-952e-9a361c4b5f7a-serving-cert\") pod \"console-operator-d87b8d5fc-d5cmb\" (UID: \"9738e546-8b4d-4c0f-952e-9a361c4b5f7a\") " pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" Apr 16 16:49:58.006470 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.006364 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-default-certificate\") pod \"router-default-65d8b4d484-2r794\" (UID: \"41054f43-1b5b-46d6-9aab-24d8b6ff5a23\") " 
pod="openshift-ingress/router-default-65d8b4d484-2r794" Apr 16 16:49:58.006470 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.006419 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-metrics-certs\") pod \"router-default-65d8b4d484-2r794\" (UID: \"41054f43-1b5b-46d6-9aab-24d8b6ff5a23\") " pod="openshift-ingress/router-default-65d8b4d484-2r794" Apr 16 16:49:58.006470 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.006445 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7crds\" (UniqueName: \"kubernetes.io/projected/9738e546-8b4d-4c0f-952e-9a361c4b5f7a-kube-api-access-7crds\") pod \"console-operator-d87b8d5fc-d5cmb\" (UID: \"9738e546-8b4d-4c0f-952e-9a361c4b5f7a\") " pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" Apr 16 16:49:58.016018 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.015998 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxndf\" (UniqueName: \"kubernetes.io/projected/2d432fd0-ec22-4d57-ac82-a36eb0170cb7-kube-api-access-jxndf\") pod \"volume-data-source-validator-7d955d5dd4-bj9hp\" (UID: \"2d432fd0-ec22-4d57-ac82-a36eb0170cb7\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-bj9hp" Apr 16 16:49:58.070868 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.070846 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-jjnfw"] Apr 16 16:49:58.072811 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.072797 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-jjnfw" Apr 16 16:49:58.075301 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.075276 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-nj8pt\"" Apr 16 16:49:58.075394 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.075319 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 16:49:58.075394 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.075344 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 16:49:58.075501 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.075347 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 16:49:58.075501 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.075276 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 16:49:58.080578 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.080560 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 16:49:58.082662 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.082641 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-jjnfw"] Apr 16 16:49:58.107120 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.107097 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-service-ca-bundle\") pod \"router-default-65d8b4d484-2r794\" (UID: \"41054f43-1b5b-46d6-9aab-24d8b6ff5a23\") " pod="openshift-ingress/router-default-65d8b4d484-2r794" Apr 16 
16:49:58.107397 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.107122 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-stats-auth\") pod \"router-default-65d8b4d484-2r794\" (UID: \"41054f43-1b5b-46d6-9aab-24d8b6ff5a23\") " pod="openshift-ingress/router-default-65d8b4d484-2r794" Apr 16 16:49:58.107397 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.107140 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9738e546-8b4d-4c0f-952e-9a361c4b5f7a-config\") pod \"console-operator-d87b8d5fc-d5cmb\" (UID: \"9738e546-8b4d-4c0f-952e-9a361c4b5f7a\") " pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" Apr 16 16:49:58.107397 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.107157 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/25b8c537-ca17-48e3-ab8a-0d08b4ff09b7-tmp\") pod \"insights-operator-5785d4fcdd-jjnfw\" (UID: \"25b8c537-ca17-48e3-ab8a-0d08b4ff09b7\") " pod="openshift-insights/insights-operator-5785d4fcdd-jjnfw" Apr 16 16:49:58.107397 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.107172 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/25b8c537-ca17-48e3-ab8a-0d08b4ff09b7-snapshots\") pod \"insights-operator-5785d4fcdd-jjnfw\" (UID: \"25b8c537-ca17-48e3-ab8a-0d08b4ff09b7\") " pod="openshift-insights/insights-operator-5785d4fcdd-jjnfw" Apr 16 16:49:58.107397 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.107188 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25b8c537-ca17-48e3-ab8a-0d08b4ff09b7-serving-cert\") pod 
\"insights-operator-5785d4fcdd-jjnfw\" (UID: \"25b8c537-ca17-48e3-ab8a-0d08b4ff09b7\") " pod="openshift-insights/insights-operator-5785d4fcdd-jjnfw" Apr 16 16:49:58.107619 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:49:58.107421 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-service-ca-bundle podName:41054f43-1b5b-46d6-9aab-24d8b6ff5a23 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:58.607404484 +0000 UTC m=+131.816176773 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-service-ca-bundle") pod "router-default-65d8b4d484-2r794" (UID: "41054f43-1b5b-46d6-9aab-24d8b6ff5a23") : configmap references non-existent config key: service-ca.crt Apr 16 16:49:58.107619 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.107455 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9738e546-8b4d-4c0f-952e-9a361c4b5f7a-trusted-ca\") pod \"console-operator-d87b8d5fc-d5cmb\" (UID: \"9738e546-8b4d-4c0f-952e-9a361c4b5f7a\") " pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" Apr 16 16:49:58.107619 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.107479 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9s8n\" (UniqueName: \"kubernetes.io/projected/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-kube-api-access-n9s8n\") pod \"router-default-65d8b4d484-2r794\" (UID: \"41054f43-1b5b-46d6-9aab-24d8b6ff5a23\") " pod="openshift-ingress/router-default-65d8b4d484-2r794" Apr 16 16:49:58.107619 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.107546 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khg5m\" (UniqueName: 
\"kubernetes.io/projected/25b8c537-ca17-48e3-ab8a-0d08b4ff09b7-kube-api-access-khg5m\") pod \"insights-operator-5785d4fcdd-jjnfw\" (UID: \"25b8c537-ca17-48e3-ab8a-0d08b4ff09b7\") " pod="openshift-insights/insights-operator-5785d4fcdd-jjnfw" Apr 16 16:49:58.107619 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.107577 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25b8c537-ca17-48e3-ab8a-0d08b4ff09b7-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-jjnfw\" (UID: \"25b8c537-ca17-48e3-ab8a-0d08b4ff09b7\") " pod="openshift-insights/insights-operator-5785d4fcdd-jjnfw" Apr 16 16:49:58.107917 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.107621 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9738e546-8b4d-4c0f-952e-9a361c4b5f7a-serving-cert\") pod \"console-operator-d87b8d5fc-d5cmb\" (UID: \"9738e546-8b4d-4c0f-952e-9a361c4b5f7a\") " pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" Apr 16 16:49:58.107917 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.107665 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-default-certificate\") pod \"router-default-65d8b4d484-2r794\" (UID: \"41054f43-1b5b-46d6-9aab-24d8b6ff5a23\") " pod="openshift-ingress/router-default-65d8b4d484-2r794" Apr 16 16:49:58.107917 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.107692 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25b8c537-ca17-48e3-ab8a-0d08b4ff09b7-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-jjnfw\" (UID: \"25b8c537-ca17-48e3-ab8a-0d08b4ff09b7\") " 
pod="openshift-insights/insights-operator-5785d4fcdd-jjnfw" Apr 16 16:49:58.107917 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.107802 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-metrics-certs\") pod \"router-default-65d8b4d484-2r794\" (UID: \"41054f43-1b5b-46d6-9aab-24d8b6ff5a23\") " pod="openshift-ingress/router-default-65d8b4d484-2r794" Apr 16 16:49:58.107917 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.107821 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7crds\" (UniqueName: \"kubernetes.io/projected/9738e546-8b4d-4c0f-952e-9a361c4b5f7a-kube-api-access-7crds\") pod \"console-operator-d87b8d5fc-d5cmb\" (UID: \"9738e546-8b4d-4c0f-952e-9a361c4b5f7a\") " pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" Apr 16 16:49:58.107917 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:49:58.107909 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 16 16:49:58.108220 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:49:58.107971 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-metrics-certs podName:41054f43-1b5b-46d6-9aab-24d8b6ff5a23 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:58.607954392 +0000 UTC m=+131.816726692 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-metrics-certs") pod "router-default-65d8b4d484-2r794" (UID: "41054f43-1b5b-46d6-9aab-24d8b6ff5a23") : secret "router-metrics-certs-default" not found Apr 16 16:49:58.108829 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.108808 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9738e546-8b4d-4c0f-952e-9a361c4b5f7a-config\") pod \"console-operator-d87b8d5fc-d5cmb\" (UID: \"9738e546-8b4d-4c0f-952e-9a361c4b5f7a\") " pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" Apr 16 16:49:58.109292 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.109266 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9738e546-8b4d-4c0f-952e-9a361c4b5f7a-trusted-ca\") pod \"console-operator-d87b8d5fc-d5cmb\" (UID: \"9738e546-8b4d-4c0f-952e-9a361c4b5f7a\") " pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" Apr 16 16:49:58.109895 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.109873 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-default-certificate\") pod \"router-default-65d8b4d484-2r794\" (UID: \"41054f43-1b5b-46d6-9aab-24d8b6ff5a23\") " pod="openshift-ingress/router-default-65d8b4d484-2r794" Apr 16 16:49:58.109983 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.109929 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-stats-auth\") pod \"router-default-65d8b4d484-2r794\" (UID: \"41054f43-1b5b-46d6-9aab-24d8b6ff5a23\") " pod="openshift-ingress/router-default-65d8b4d484-2r794" Apr 16 16:49:58.110040 ip-10-0-128-130 kubenswrapper[2568]: I0416 
16:49:58.110000 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9738e546-8b4d-4c0f-952e-9a361c4b5f7a-serving-cert\") pod \"console-operator-d87b8d5fc-d5cmb\" (UID: \"9738e546-8b4d-4c0f-952e-9a361c4b5f7a\") " pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" Apr 16 16:49:58.117920 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.117869 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7crds\" (UniqueName: \"kubernetes.io/projected/9738e546-8b4d-4c0f-952e-9a361c4b5f7a-kube-api-access-7crds\") pod \"console-operator-d87b8d5fc-d5cmb\" (UID: \"9738e546-8b4d-4c0f-952e-9a361c4b5f7a\") " pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" Apr 16 16:49:58.118179 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.118163 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9s8n\" (UniqueName: \"kubernetes.io/projected/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-kube-api-access-n9s8n\") pod \"router-default-65d8b4d484-2r794\" (UID: \"41054f43-1b5b-46d6-9aab-24d8b6ff5a23\") " pod="openshift-ingress/router-default-65d8b4d484-2r794" Apr 16 16:49:58.173078 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.173055 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-bj9hp" Apr 16 16:49:58.209065 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.209036 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/25b8c537-ca17-48e3-ab8a-0d08b4ff09b7-tmp\") pod \"insights-operator-5785d4fcdd-jjnfw\" (UID: \"25b8c537-ca17-48e3-ab8a-0d08b4ff09b7\") " pod="openshift-insights/insights-operator-5785d4fcdd-jjnfw" Apr 16 16:49:58.209211 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.209073 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/25b8c537-ca17-48e3-ab8a-0d08b4ff09b7-snapshots\") pod \"insights-operator-5785d4fcdd-jjnfw\" (UID: \"25b8c537-ca17-48e3-ab8a-0d08b4ff09b7\") " pod="openshift-insights/insights-operator-5785d4fcdd-jjnfw" Apr 16 16:49:58.209211 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.209099 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25b8c537-ca17-48e3-ab8a-0d08b4ff09b7-serving-cert\") pod \"insights-operator-5785d4fcdd-jjnfw\" (UID: \"25b8c537-ca17-48e3-ab8a-0d08b4ff09b7\") " pod="openshift-insights/insights-operator-5785d4fcdd-jjnfw" Apr 16 16:49:58.209211 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.209150 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-khg5m\" (UniqueName: \"kubernetes.io/projected/25b8c537-ca17-48e3-ab8a-0d08b4ff09b7-kube-api-access-khg5m\") pod \"insights-operator-5785d4fcdd-jjnfw\" (UID: \"25b8c537-ca17-48e3-ab8a-0d08b4ff09b7\") " pod="openshift-insights/insights-operator-5785d4fcdd-jjnfw" Apr 16 16:49:58.209211 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.209184 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/25b8c537-ca17-48e3-ab8a-0d08b4ff09b7-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-jjnfw\" (UID: \"25b8c537-ca17-48e3-ab8a-0d08b4ff09b7\") " pod="openshift-insights/insights-operator-5785d4fcdd-jjnfw" Apr 16 16:49:58.209211 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.209207 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25b8c537-ca17-48e3-ab8a-0d08b4ff09b7-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-jjnfw\" (UID: \"25b8c537-ca17-48e3-ab8a-0d08b4ff09b7\") " pod="openshift-insights/insights-operator-5785d4fcdd-jjnfw" Apr 16 16:49:58.209720 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.209696 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/25b8c537-ca17-48e3-ab8a-0d08b4ff09b7-tmp\") pod \"insights-operator-5785d4fcdd-jjnfw\" (UID: \"25b8c537-ca17-48e3-ab8a-0d08b4ff09b7\") " pod="openshift-insights/insights-operator-5785d4fcdd-jjnfw" Apr 16 16:49:58.209819 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.209788 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/25b8c537-ca17-48e3-ab8a-0d08b4ff09b7-snapshots\") pod \"insights-operator-5785d4fcdd-jjnfw\" (UID: \"25b8c537-ca17-48e3-ab8a-0d08b4ff09b7\") " pod="openshift-insights/insights-operator-5785d4fcdd-jjnfw" Apr 16 16:49:58.210465 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.210444 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25b8c537-ca17-48e3-ab8a-0d08b4ff09b7-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-jjnfw\" (UID: \"25b8c537-ca17-48e3-ab8a-0d08b4ff09b7\") " pod="openshift-insights/insights-operator-5785d4fcdd-jjnfw" Apr 16 16:49:58.210749 ip-10-0-128-130 kubenswrapper[2568]: I0416 
16:49:58.210730 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/25b8c537-ca17-48e3-ab8a-0d08b4ff09b7-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-jjnfw\" (UID: \"25b8c537-ca17-48e3-ab8a-0d08b4ff09b7\") " pod="openshift-insights/insights-operator-5785d4fcdd-jjnfw" Apr 16 16:49:58.211366 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.211352 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/25b8c537-ca17-48e3-ab8a-0d08b4ff09b7-serving-cert\") pod \"insights-operator-5785d4fcdd-jjnfw\" (UID: \"25b8c537-ca17-48e3-ab8a-0d08b4ff09b7\") " pod="openshift-insights/insights-operator-5785d4fcdd-jjnfw" Apr 16 16:49:58.217408 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.217359 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-khg5m\" (UniqueName: \"kubernetes.io/projected/25b8c537-ca17-48e3-ab8a-0d08b4ff09b7-kube-api-access-khg5m\") pod \"insights-operator-5785d4fcdd-jjnfw\" (UID: \"25b8c537-ca17-48e3-ab8a-0d08b4ff09b7\") " pod="openshift-insights/insights-operator-5785d4fcdd-jjnfw" Apr 16 16:49:58.280757 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.280728 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-bj9hp"] Apr 16 16:49:58.283448 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:49:58.283420 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d432fd0_ec22_4d57_ac82_a36eb0170cb7.slice/crio-788aa4c0567c381311e35fa7b4ba6a9bc11bf90f0818ea351df819aba08d855b WatchSource:0}: Error finding container 788aa4c0567c381311e35fa7b4ba6a9bc11bf90f0818ea351df819aba08d855b: Status 404 returned error can't find the container with id 788aa4c0567c381311e35fa7b4ba6a9bc11bf90f0818ea351df819aba08d855b Apr 16 
16:49:58.283548 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.283491 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" Apr 16 16:49:58.382377 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.382316 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-jjnfw" Apr 16 16:49:58.386975 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.386924 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-d87b8d5fc-d5cmb"] Apr 16 16:49:58.389194 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:49:58.389171 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9738e546_8b4d_4c0f_952e_9a361c4b5f7a.slice/crio-5f560d99cf55230b6dd8633f5305435958c39a9777e3ad855cfbe183e3ef3102 WatchSource:0}: Error finding container 5f560d99cf55230b6dd8633f5305435958c39a9777e3ad855cfbe183e3ef3102: Status 404 returned error can't find the container with id 5f560d99cf55230b6dd8633f5305435958c39a9777e3ad855cfbe183e3ef3102 Apr 16 16:49:58.494340 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.494311 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-jjnfw"] Apr 16 16:49:58.497436 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:49:58.497410 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25b8c537_ca17_48e3_ab8a_0d08b4ff09b7.slice/crio-6d831ad2f9e2011b2542079daf2f30d7acf23dbec53890050c30c0a4ce6a524d WatchSource:0}: Error finding container 6d831ad2f9e2011b2542079daf2f30d7acf23dbec53890050c30c0a4ce6a524d: Status 404 returned error can't find the container with id 6d831ad2f9e2011b2542079daf2f30d7acf23dbec53890050c30c0a4ce6a524d Apr 16 16:49:58.613114 ip-10-0-128-130 
kubenswrapper[2568]: I0416 16:49:58.613090 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-service-ca-bundle\") pod \"router-default-65d8b4d484-2r794\" (UID: \"41054f43-1b5b-46d6-9aab-24d8b6ff5a23\") " pod="openshift-ingress/router-default-65d8b4d484-2r794"
Apr 16 16:49:58.613206 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.613169 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-metrics-certs\") pod \"router-default-65d8b4d484-2r794\" (UID: \"41054f43-1b5b-46d6-9aab-24d8b6ff5a23\") " pod="openshift-ingress/router-default-65d8b4d484-2r794"
Apr 16 16:49:58.613245 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:49:58.613234 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-service-ca-bundle podName:41054f43-1b5b-46d6-9aab-24d8b6ff5a23 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:59.613218738 +0000 UTC m=+132.821991030 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-service-ca-bundle") pod "router-default-65d8b4d484-2r794" (UID: "41054f43-1b5b-46d6-9aab-24d8b6ff5a23") : configmap references non-existent config key: service-ca.crt
Apr 16 16:49:58.613285 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:49:58.613251 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 16:49:58.613321 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:49:58.613285 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-metrics-certs podName:41054f43-1b5b-46d6-9aab-24d8b6ff5a23 nodeName:}" failed. No retries permitted until 2026-04-16 16:49:59.613274733 +0000 UTC m=+132.822047022 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-metrics-certs") pod "router-default-65d8b4d484-2r794" (UID: "41054f43-1b5b-46d6-9aab-24d8b6ff5a23") : secret "router-metrics-certs-default" not found
Apr 16 16:49:58.753854 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.753821 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-jjnfw" event={"ID":"25b8c537-ca17-48e3-ab8a-0d08b4ff09b7","Type":"ContainerStarted","Data":"6d831ad2f9e2011b2542079daf2f30d7acf23dbec53890050c30c0a4ce6a524d"}
Apr 16 16:49:58.754860 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.754832 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" event={"ID":"9738e546-8b4d-4c0f-952e-9a361c4b5f7a","Type":"ContainerStarted","Data":"5f560d99cf55230b6dd8633f5305435958c39a9777e3ad855cfbe183e3ef3102"}
Apr 16 16:49:58.755853 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:58.755831 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-bj9hp" event={"ID":"2d432fd0-ec22-4d57-ac82-a36eb0170cb7","Type":"ContainerStarted","Data":"788aa4c0567c381311e35fa7b4ba6a9bc11bf90f0818ea351df819aba08d855b"}
Apr 16 16:49:59.621567 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:59.621529 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-metrics-certs\") pod \"router-default-65d8b4d484-2r794\" (UID: \"41054f43-1b5b-46d6-9aab-24d8b6ff5a23\") " pod="openshift-ingress/router-default-65d8b4d484-2r794"
Apr 16 16:49:59.622022 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:59.621584 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-service-ca-bundle\") pod \"router-default-65d8b4d484-2r794\" (UID: \"41054f43-1b5b-46d6-9aab-24d8b6ff5a23\") " pod="openshift-ingress/router-default-65d8b4d484-2r794"
Apr 16 16:49:59.622022 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:49:59.621695 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 16:49:59.622022 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:49:59.621772 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-metrics-certs podName:41054f43-1b5b-46d6-9aab-24d8b6ff5a23 nodeName:}" failed. No retries permitted until 2026-04-16 16:50:01.621750626 +0000 UTC m=+134.830522935 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-metrics-certs") pod "router-default-65d8b4d484-2r794" (UID: "41054f43-1b5b-46d6-9aab-24d8b6ff5a23") : secret "router-metrics-certs-default" not found
Apr 16 16:49:59.622022 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:49:59.621794 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-service-ca-bundle podName:41054f43-1b5b-46d6-9aab-24d8b6ff5a23 nodeName:}" failed. No retries permitted until 2026-04-16 16:50:01.62178198 +0000 UTC m=+134.830554271 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-service-ca-bundle") pod "router-default-65d8b4d484-2r794" (UID: "41054f43-1b5b-46d6-9aab-24d8b6ff5a23") : configmap references non-existent config key: service-ca.crt
Apr 16 16:49:59.759799 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:59.759719 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-bj9hp" event={"ID":"2d432fd0-ec22-4d57-ac82-a36eb0170cb7","Type":"ContainerStarted","Data":"618dd8e0dd6ebe82743b043fd1ce3eab27c245352c33d60a8a1ac8099e1e46da"}
Apr 16 16:49:59.773644 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:49:59.773584 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-bj9hp" podStartSLOduration=1.575808156 podStartE2EDuration="2.773567264s" podCreationTimestamp="2026-04-16 16:49:57 +0000 UTC" firstStartedPulling="2026-04-16 16:49:58.285076581 +0000 UTC m=+131.493848874" lastFinishedPulling="2026-04-16 16:49:59.482835677 +0000 UTC m=+132.691607982" observedRunningTime="2026-04-16 16:49:59.773034614 +0000 UTC m=+132.981806940" watchObservedRunningTime="2026-04-16 16:49:59.773567264 +0000 UTC m=+132.982339576"
Apr 16 16:50:00.372321 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:00.372280 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-zr45l"]
Apr 16 16:50:00.374289 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:00.374269 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-zr45l"
Apr 16 16:50:00.376806 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:00.376782 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-tv4z7\""
Apr 16 16:50:00.384406 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:00.384388 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-zr45l"]
Apr 16 16:50:00.430778 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:00.430746 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crhd5\" (UniqueName: \"kubernetes.io/projected/6ab091fc-21fb-49e0-b30a-43ff44f2e808-kube-api-access-crhd5\") pod \"network-check-source-7b678d77c7-zr45l\" (UID: \"6ab091fc-21fb-49e0-b30a-43ff44f2e808\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-zr45l"
Apr 16 16:50:00.531432 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:00.531400 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-crhd5\" (UniqueName: \"kubernetes.io/projected/6ab091fc-21fb-49e0-b30a-43ff44f2e808-kube-api-access-crhd5\") pod \"network-check-source-7b678d77c7-zr45l\" (UID: \"6ab091fc-21fb-49e0-b30a-43ff44f2e808\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-zr45l"
Apr 16 16:50:00.540724 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:00.540699 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-crhd5\" (UniqueName: \"kubernetes.io/projected/6ab091fc-21fb-49e0-b30a-43ff44f2e808-kube-api-access-crhd5\") pod \"network-check-source-7b678d77c7-zr45l\" (UID: \"6ab091fc-21fb-49e0-b30a-43ff44f2e808\") " pod="openshift-network-diagnostics/network-check-source-7b678d77c7-zr45l"
Apr 16 16:50:00.684095 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:00.684070 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-zr45l"
Apr 16 16:50:00.764969 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:00.764933 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-jjnfw" event={"ID":"25b8c537-ca17-48e3-ab8a-0d08b4ff09b7","Type":"ContainerStarted","Data":"6b39d4c9bc69aa57cdb2336c3bb11462bfc47cb60bd497364a33969200e144c4"}
Apr 16 16:50:00.767344 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:00.767301 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-d5cmb_9738e546-8b4d-4c0f-952e-9a361c4b5f7a/console-operator/0.log"
Apr 16 16:50:00.767575 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:00.767531 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" event={"ID":"9738e546-8b4d-4c0f-952e-9a361c4b5f7a","Type":"ContainerStarted","Data":"97ac4b9c9b614bd04023b70cc4666270a8d498f14df52c5a4f1d8865847d24d4"}
Apr 16 16:50:00.767803 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:00.767788 2568 scope.go:117] "RemoveContainer" containerID="97ac4b9c9b614bd04023b70cc4666270a8d498f14df52c5a4f1d8865847d24d4"
Apr 16 16:50:00.789742 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:00.789691 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-jjnfw" podStartSLOduration=0.648594767 podStartE2EDuration="2.789674157s" podCreationTimestamp="2026-04-16 16:49:58 +0000 UTC" firstStartedPulling="2026-04-16 16:49:58.499328293 +0000 UTC m=+131.708100582" lastFinishedPulling="2026-04-16 16:50:00.640407667 +0000 UTC m=+133.849179972" observedRunningTime="2026-04-16 16:50:00.788618803 +0000 UTC m=+133.997391117" watchObservedRunningTime="2026-04-16 16:50:00.789674157 +0000 UTC m=+133.998446470"
Apr 16 16:50:00.808269 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:00.808246 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7b678d77c7-zr45l"]
Apr 16 16:50:00.825677 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:50:00.825649 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6ab091fc_21fb_49e0_b30a_43ff44f2e808.slice/crio-0c6b812318a70369a74d02d2a36464a0db44b22661717707f0d33c714b2a7aeb WatchSource:0}: Error finding container 0c6b812318a70369a74d02d2a36464a0db44b22661717707f0d33c714b2a7aeb: Status 404 returned error can't find the container with id 0c6b812318a70369a74d02d2a36464a0db44b22661717707f0d33c714b2a7aeb
Apr 16 16:50:01.642207 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:01.642174 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-metrics-certs\") pod \"router-default-65d8b4d484-2r794\" (UID: \"41054f43-1b5b-46d6-9aab-24d8b6ff5a23\") " pod="openshift-ingress/router-default-65d8b4d484-2r794"
Apr 16 16:50:01.642358 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:01.642213 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-service-ca-bundle\") pod \"router-default-65d8b4d484-2r794\" (UID: \"41054f43-1b5b-46d6-9aab-24d8b6ff5a23\") " pod="openshift-ingress/router-default-65d8b4d484-2r794"
Apr 16 16:50:01.642358 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:50:01.642335 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 16:50:01.642456 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:50:01.642383 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-service-ca-bundle podName:41054f43-1b5b-46d6-9aab-24d8b6ff5a23 nodeName:}" failed. No retries permitted until 2026-04-16 16:50:05.642369353 +0000 UTC m=+138.851141929 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-service-ca-bundle") pod "router-default-65d8b4d484-2r794" (UID: "41054f43-1b5b-46d6-9aab-24d8b6ff5a23") : configmap references non-existent config key: service-ca.crt
Apr 16 16:50:01.642456 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:50:01.642399 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-metrics-certs podName:41054f43-1b5b-46d6-9aab-24d8b6ff5a23 nodeName:}" failed. No retries permitted until 2026-04-16 16:50:05.642393273 +0000 UTC m=+138.851165561 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-metrics-certs") pod "router-default-65d8b4d484-2r794" (UID: "41054f43-1b5b-46d6-9aab-24d8b6ff5a23") : secret "router-metrics-certs-default" not found
Apr 16 16:50:01.771760 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:01.771734 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-d5cmb_9738e546-8b4d-4c0f-952e-9a361c4b5f7a/console-operator/1.log"
Apr 16 16:50:01.772149 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:01.772132 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-d5cmb_9738e546-8b4d-4c0f-952e-9a361c4b5f7a/console-operator/0.log"
Apr 16 16:50:01.772363 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:01.772175 2568 generic.go:358] "Generic (PLEG): container finished" podID="9738e546-8b4d-4c0f-952e-9a361c4b5f7a" containerID="97ac4b9c9b614bd04023b70cc4666270a8d498f14df52c5a4f1d8865847d24d4" exitCode=255
Apr 16 16:50:01.772363 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:01.772190 2568 generic.go:358] "Generic (PLEG): container finished" podID="9738e546-8b4d-4c0f-952e-9a361c4b5f7a" containerID="cbedb00cd911967909b6b8e6954ded86ddf25efda4138574f762eafb495330ac" exitCode=255
Apr 16 16:50:01.772363 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:01.772263 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" event={"ID":"9738e546-8b4d-4c0f-952e-9a361c4b5f7a","Type":"ContainerDied","Data":"97ac4b9c9b614bd04023b70cc4666270a8d498f14df52c5a4f1d8865847d24d4"}
Apr 16 16:50:01.772363 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:01.772300 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" event={"ID":"9738e546-8b4d-4c0f-952e-9a361c4b5f7a","Type":"ContainerDied","Data":"cbedb00cd911967909b6b8e6954ded86ddf25efda4138574f762eafb495330ac"}
Apr 16 16:50:01.772363 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:01.772321 2568 scope.go:117] "RemoveContainer" containerID="97ac4b9c9b614bd04023b70cc4666270a8d498f14df52c5a4f1d8865847d24d4"
Apr 16 16:50:01.772630 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:01.772472 2568 scope.go:117] "RemoveContainer" containerID="cbedb00cd911967909b6b8e6954ded86ddf25efda4138574f762eafb495330ac"
Apr 16 16:50:01.772688 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:50:01.772665 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-d5cmb_openshift-console-operator(9738e546-8b4d-4c0f-952e-9a361c4b5f7a)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" podUID="9738e546-8b4d-4c0f-952e-9a361c4b5f7a"
Apr 16 16:50:01.773776 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:01.773752 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-zr45l" event={"ID":"6ab091fc-21fb-49e0-b30a-43ff44f2e808","Type":"ContainerStarted","Data":"025556b59880aea28b18408db2666d20208ac3d69f7c54d7280be3bd8e2c3e70"}
Apr 16 16:50:01.773861 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:01.773789 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-zr45l" event={"ID":"6ab091fc-21fb-49e0-b30a-43ff44f2e808","Type":"ContainerStarted","Data":"0c6b812318a70369a74d02d2a36464a0db44b22661717707f0d33c714b2a7aeb"}
Apr 16 16:50:01.781576 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:01.781553 2568 scope.go:117] "RemoveContainer" containerID="97ac4b9c9b614bd04023b70cc4666270a8d498f14df52c5a4f1d8865847d24d4"
Apr 16 16:50:01.781869 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:50:01.781849 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"97ac4b9c9b614bd04023b70cc4666270a8d498f14df52c5a4f1d8865847d24d4\": container with ID starting with 97ac4b9c9b614bd04023b70cc4666270a8d498f14df52c5a4f1d8865847d24d4 not found: ID does not exist" containerID="97ac4b9c9b614bd04023b70cc4666270a8d498f14df52c5a4f1d8865847d24d4"
Apr 16 16:50:01.782020 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:01.781976 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"97ac4b9c9b614bd04023b70cc4666270a8d498f14df52c5a4f1d8865847d24d4"} err="failed to get container status \"97ac4b9c9b614bd04023b70cc4666270a8d498f14df52c5a4f1d8865847d24d4\": rpc error: code = NotFound desc = could not find container \"97ac4b9c9b614bd04023b70cc4666270a8d498f14df52c5a4f1d8865847d24d4\": container with ID starting with 97ac4b9c9b614bd04023b70cc4666270a8d498f14df52c5a4f1d8865847d24d4 not found: ID does not exist"
Apr 16 16:50:01.809656 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:01.809553 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7b678d77c7-zr45l" podStartSLOduration=1.809542345 podStartE2EDuration="1.809542345s" podCreationTimestamp="2026-04-16 16:50:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:50:01.809130984 +0000 UTC m=+135.017903297" watchObservedRunningTime="2026-04-16 16:50:01.809542345 +0000 UTC m=+135.018314656"
Apr 16 16:50:02.777110 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:02.777087 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-d5cmb_9738e546-8b4d-4c0f-952e-9a361c4b5f7a/console-operator/1.log"
Apr 16 16:50:02.777548 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:02.777534 2568 scope.go:117] "RemoveContainer" containerID="cbedb00cd911967909b6b8e6954ded86ddf25efda4138574f762eafb495330ac"
Apr 16 16:50:02.777716 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:50:02.777699 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-d5cmb_openshift-console-operator(9738e546-8b4d-4c0f-952e-9a361c4b5f7a)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" podUID="9738e546-8b4d-4c0f-952e-9a361c4b5f7a"
Apr 16 16:50:05.164134 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:05.164103 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-v87xm_b60bf4ff-6c52-4d90-9a63-32a829bfc83e/dns-node-resolver/0.log"
Apr 16 16:50:05.670793 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:05.670751 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-metrics-certs\") pod \"router-default-65d8b4d484-2r794\" (UID: \"41054f43-1b5b-46d6-9aab-24d8b6ff5a23\") " pod="openshift-ingress/router-default-65d8b4d484-2r794"
Apr 16 16:50:05.670793 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:05.670799 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-service-ca-bundle\") pod \"router-default-65d8b4d484-2r794\" (UID: \"41054f43-1b5b-46d6-9aab-24d8b6ff5a23\") " pod="openshift-ingress/router-default-65d8b4d484-2r794"
Apr 16 16:50:05.671008 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:50:05.670895 2568 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 16 16:50:05.671008 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:50:05.670937 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-service-ca-bundle podName:41054f43-1b5b-46d6-9aab-24d8b6ff5a23 nodeName:}" failed. No retries permitted until 2026-04-16 16:50:13.670920367 +0000 UTC m=+146.879692671 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-service-ca-bundle") pod "router-default-65d8b4d484-2r794" (UID: "41054f43-1b5b-46d6-9aab-24d8b6ff5a23") : configmap references non-existent config key: service-ca.crt
Apr 16 16:50:05.671008 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:50:05.670953 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-metrics-certs podName:41054f43-1b5b-46d6-9aab-24d8b6ff5a23 nodeName:}" failed. No retries permitted until 2026-04-16 16:50:13.670946686 +0000 UTC m=+146.879718975 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-metrics-certs") pod "router-default-65d8b4d484-2r794" (UID: "41054f43-1b5b-46d6-9aab-24d8b6ff5a23") : secret "router-metrics-certs-default" not found
Apr 16 16:50:05.763881 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:05.763858 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8slrg_5d59ae73-e194-4f66-9f72-a091634a4c01/node-ca/0.log"
Apr 16 16:50:08.284379 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:08.284351 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb"
Apr 16 16:50:08.284379 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:08.284385 2568 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb"
Apr 16 16:50:08.284800 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:08.284713 2568 scope.go:117] "RemoveContainer" containerID="cbedb00cd911967909b6b8e6954ded86ddf25efda4138574f762eafb495330ac"
Apr 16 16:50:08.284880 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:50:08.284862 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-d87b8d5fc-d5cmb_openshift-console-operator(9738e546-8b4d-4c0f-952e-9a361c4b5f7a)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" podUID="9738e546-8b4d-4c0f-952e-9a361c4b5f7a"
Apr 16 16:50:13.727199 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:13.727162 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-metrics-certs\") pod \"router-default-65d8b4d484-2r794\" (UID: \"41054f43-1b5b-46d6-9aab-24d8b6ff5a23\") " pod="openshift-ingress/router-default-65d8b4d484-2r794"
Apr 16 16:50:13.727672 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:13.727203 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-service-ca-bundle\") pod \"router-default-65d8b4d484-2r794\" (UID: \"41054f43-1b5b-46d6-9aab-24d8b6ff5a23\") " pod="openshift-ingress/router-default-65d8b4d484-2r794"
Apr 16 16:50:13.727761 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:13.727744 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-service-ca-bundle\") pod \"router-default-65d8b4d484-2r794\" (UID: \"41054f43-1b5b-46d6-9aab-24d8b6ff5a23\") " pod="openshift-ingress/router-default-65d8b4d484-2r794"
Apr 16 16:50:13.729469 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:13.729441 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/41054f43-1b5b-46d6-9aab-24d8b6ff5a23-metrics-certs\") pod \"router-default-65d8b4d484-2r794\" (UID: \"41054f43-1b5b-46d6-9aab-24d8b6ff5a23\") " pod="openshift-ingress/router-default-65d8b4d484-2r794"
Apr 16 16:50:13.889897 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:13.889869 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-65d8b4d484-2r794"
Apr 16 16:50:14.002277 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:14.002203 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-65d8b4d484-2r794"]
Apr 16 16:50:14.005498 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:50:14.005468 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41054f43_1b5b_46d6_9aab_24d8b6ff5a23.slice/crio-2b82576666641eb26b30e1f8a8bbf4d2156ef89b6d6947bfa1d2b4747b9982d3 WatchSource:0}: Error finding container 2b82576666641eb26b30e1f8a8bbf4d2156ef89b6d6947bfa1d2b4747b9982d3: Status 404 returned error can't find the container with id 2b82576666641eb26b30e1f8a8bbf4d2156ef89b6d6947bfa1d2b4747b9982d3
Apr 16 16:50:14.800516 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:14.800481 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-65d8b4d484-2r794" event={"ID":"41054f43-1b5b-46d6-9aab-24d8b6ff5a23","Type":"ContainerStarted","Data":"6b9a4cf35163004cbc96c4ebbdedfab8abbcb1ef344c83feca2422604d24b4a2"}
Apr 16 16:50:14.800516 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:14.800517 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-65d8b4d484-2r794" event={"ID":"41054f43-1b5b-46d6-9aab-24d8b6ff5a23","Type":"ContainerStarted","Data":"2b82576666641eb26b30e1f8a8bbf4d2156ef89b6d6947bfa1d2b4747b9982d3"}
Apr 16 16:50:14.820095 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:14.820054 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-65d8b4d484-2r794" podStartSLOduration=17.820040556 podStartE2EDuration="17.820040556s" podCreationTimestamp="2026-04-16 16:49:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:50:14.818271422 +0000 UTC m=+148.027043733" watchObservedRunningTime="2026-04-16 16:50:14.820040556 +0000 UTC m=+148.028812867"
Apr 16 16:50:14.890742 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:14.890713 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-65d8b4d484-2r794"
Apr 16 16:50:14.893188 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:14.893167 2568 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-65d8b4d484-2r794"
Apr 16 16:50:15.802909 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:15.802869 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-65d8b4d484-2r794"
Apr 16 16:50:15.804071 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:15.804049 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-65d8b4d484-2r794"
Apr 16 16:50:22.672319 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:50:22.672275 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-p4bb2" podUID="3e2cc599-5a01-4b24-a80c-87b34418e1b6"
Apr 16 16:50:22.688468 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:50:22.688448 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-dcxhn" podUID="2a9408e4-8f5b-4f8e-b756-2d1f084e06a8"
Apr 16 16:50:22.818130 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:22.818105 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p4bb2"
Apr 16 16:50:23.353819 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:23.353794 2568 scope.go:117] "RemoveContainer" containerID="cbedb00cd911967909b6b8e6954ded86ddf25efda4138574f762eafb495330ac"
Apr 16 16:50:23.822338 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:23.822313 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-d5cmb_9738e546-8b4d-4c0f-952e-9a361c4b5f7a/console-operator/2.log"
Apr 16 16:50:23.822736 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:23.822672 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-d5cmb_9738e546-8b4d-4c0f-952e-9a361c4b5f7a/console-operator/1.log"
Apr 16 16:50:23.822736 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:23.822700 2568 generic.go:358] "Generic (PLEG): container finished" podID="9738e546-8b4d-4c0f-952e-9a361c4b5f7a" containerID="4a4354562f469dfec2ec2893541b3645c8eff22750614455bb9d65cb2c6bea54" exitCode=255
Apr 16 16:50:23.822736 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:23.822728 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" event={"ID":"9738e546-8b4d-4c0f-952e-9a361c4b5f7a","Type":"ContainerDied","Data":"4a4354562f469dfec2ec2893541b3645c8eff22750614455bb9d65cb2c6bea54"}
Apr 16 16:50:23.822859 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:23.822754 2568 scope.go:117] "RemoveContainer" containerID="cbedb00cd911967909b6b8e6954ded86ddf25efda4138574f762eafb495330ac"
Apr 16 16:50:23.823057 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:23.823037 2568 scope.go:117] "RemoveContainer" containerID="4a4354562f469dfec2ec2893541b3645c8eff22750614455bb9d65cb2c6bea54"
Apr 16 16:50:23.823225 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:50:23.823201 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-d5cmb_openshift-console-operator(9738e546-8b4d-4c0f-952e-9a361c4b5f7a)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" podUID="9738e546-8b4d-4c0f-952e-9a361c4b5f7a"
Apr 16 16:50:24.368967 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:50:24.368923 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-gd4q4" podUID="545dd230-1d90-4e1a-8615-072dd9b2d2f5"
Apr 16 16:50:24.825509 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:24.825485 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-d5cmb_9738e546-8b4d-4c0f-952e-9a361c4b5f7a/console-operator/2.log"
Apr 16 16:50:25.429126 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:25.429096 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-tjmzf"]
Apr 16 16:50:25.433641 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:25.433618 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-tjmzf"
Apr 16 16:50:25.437218 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:25.437196 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 16:50:25.437508 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:25.437242 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-w2v8f\""
Apr 16 16:50:25.437508 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:25.437282 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 16:50:25.448316 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:25.448294 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-tjmzf"]
Apr 16 16:50:25.503769 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:25.503747 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/406fd5cc-e864-49b6-a250-066e0a57b355-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tjmzf\" (UID: \"406fd5cc-e864-49b6-a250-066e0a57b355\") " pod="openshift-insights/insights-runtime-extractor-tjmzf"
Apr 16 16:50:25.503858 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:25.503775 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sp4zl\" (UniqueName: \"kubernetes.io/projected/406fd5cc-e864-49b6-a250-066e0a57b355-kube-api-access-sp4zl\") pod \"insights-runtime-extractor-tjmzf\" (UID: \"406fd5cc-e864-49b6-a250-066e0a57b355\") " pod="openshift-insights/insights-runtime-extractor-tjmzf"
Apr 16 16:50:25.503858 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:25.503805 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/406fd5cc-e864-49b6-a250-066e0a57b355-crio-socket\") pod \"insights-runtime-extractor-tjmzf\" (UID: \"406fd5cc-e864-49b6-a250-066e0a57b355\") " pod="openshift-insights/insights-runtime-extractor-tjmzf"
Apr 16 16:50:25.503936 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:25.503894 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/406fd5cc-e864-49b6-a250-066e0a57b355-data-volume\") pod \"insights-runtime-extractor-tjmzf\" (UID: \"406fd5cc-e864-49b6-a250-066e0a57b355\") " pod="openshift-insights/insights-runtime-extractor-tjmzf"
Apr 16 16:50:25.503936 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:25.503921 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/406fd5cc-e864-49b6-a250-066e0a57b355-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tjmzf\" (UID: \"406fd5cc-e864-49b6-a250-066e0a57b355\") " pod="openshift-insights/insights-runtime-extractor-tjmzf"
Apr 16 16:50:25.604539 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:25.604514 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/406fd5cc-e864-49b6-a250-066e0a57b355-data-volume\") pod \"insights-runtime-extractor-tjmzf\" (UID: \"406fd5cc-e864-49b6-a250-066e0a57b355\") " pod="openshift-insights/insights-runtime-extractor-tjmzf"
Apr 16 16:50:25.604661 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:25.604545 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/406fd5cc-e864-49b6-a250-066e0a57b355-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tjmzf\" (UID: \"406fd5cc-e864-49b6-a250-066e0a57b355\") " pod="openshift-insights/insights-runtime-extractor-tjmzf"
Apr 16 16:50:25.604661 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:25.604640 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/406fd5cc-e864-49b6-a250-066e0a57b355-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tjmzf\" (UID: \"406fd5cc-e864-49b6-a250-066e0a57b355\") " pod="openshift-insights/insights-runtime-extractor-tjmzf"
Apr 16 16:50:25.604767 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:25.604665 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sp4zl\" (UniqueName: \"kubernetes.io/projected/406fd5cc-e864-49b6-a250-066e0a57b355-kube-api-access-sp4zl\") pod \"insights-runtime-extractor-tjmzf\" (UID: \"406fd5cc-e864-49b6-a250-066e0a57b355\") " pod="openshift-insights/insights-runtime-extractor-tjmzf"
Apr 16 16:50:25.604767 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:25.604722 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/406fd5cc-e864-49b6-a250-066e0a57b355-crio-socket\") pod \"insights-runtime-extractor-tjmzf\" (UID: \"406fd5cc-e864-49b6-a250-066e0a57b355\") " pod="openshift-insights/insights-runtime-extractor-tjmzf"
Apr 16 16:50:25.604862 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:25.604821 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/406fd5cc-e864-49b6-a250-066e0a57b355-crio-socket\") pod \"insights-runtime-extractor-tjmzf\" (UID: \"406fd5cc-e864-49b6-a250-066e0a57b355\") " pod="openshift-insights/insights-runtime-extractor-tjmzf"
Apr 16 16:50:25.604901 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:25.604888 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\"
(UniqueName: \"kubernetes.io/empty-dir/406fd5cc-e864-49b6-a250-066e0a57b355-data-volume\") pod \"insights-runtime-extractor-tjmzf\" (UID: \"406fd5cc-e864-49b6-a250-066e0a57b355\") " pod="openshift-insights/insights-runtime-extractor-tjmzf" Apr 16 16:50:25.605086 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:25.605070 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/406fd5cc-e864-49b6-a250-066e0a57b355-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tjmzf\" (UID: \"406fd5cc-e864-49b6-a250-066e0a57b355\") " pod="openshift-insights/insights-runtime-extractor-tjmzf" Apr 16 16:50:25.606884 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:25.606865 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/406fd5cc-e864-49b6-a250-066e0a57b355-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tjmzf\" (UID: \"406fd5cc-e864-49b6-a250-066e0a57b355\") " pod="openshift-insights/insights-runtime-extractor-tjmzf" Apr 16 16:50:25.616159 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:25.616132 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sp4zl\" (UniqueName: \"kubernetes.io/projected/406fd5cc-e864-49b6-a250-066e0a57b355-kube-api-access-sp4zl\") pod \"insights-runtime-extractor-tjmzf\" (UID: \"406fd5cc-e864-49b6-a250-066e0a57b355\") " pod="openshift-insights/insights-runtime-extractor-tjmzf" Apr 16 16:50:25.743681 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:25.743662 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-tjmzf" Apr 16 16:50:25.858147 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:25.858120 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-tjmzf"] Apr 16 16:50:25.861141 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:50:25.861113 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod406fd5cc_e864_49b6_a250_066e0a57b355.slice/crio-f48634c78a8330c560c11ad9d0362275debe7ce314508bc2694aa7dc3cce3239 WatchSource:0}: Error finding container f48634c78a8330c560c11ad9d0362275debe7ce314508bc2694aa7dc3cce3239: Status 404 returned error can't find the container with id f48634c78a8330c560c11ad9d0362275debe7ce314508bc2694aa7dc3cce3239 Apr 16 16:50:26.831785 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:26.831747 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tjmzf" event={"ID":"406fd5cc-e864-49b6-a250-066e0a57b355","Type":"ContainerStarted","Data":"9b75cdcbe15a5985e4a4a0278d5004f1325258e0daf4a3dd89da5d70a6487926"} Apr 16 16:50:26.831922 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:26.831791 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tjmzf" event={"ID":"406fd5cc-e864-49b6-a250-066e0a57b355","Type":"ContainerStarted","Data":"d09f5fe561640a56c563c61b9464e999a4236c22904a6b68ac57830cd55620b6"} Apr 16 16:50:26.831922 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:26.831810 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tjmzf" event={"ID":"406fd5cc-e864-49b6-a250-066e0a57b355","Type":"ContainerStarted","Data":"f48634c78a8330c560c11ad9d0362275debe7ce314508bc2694aa7dc3cce3239"} Apr 16 16:50:27.620462 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:27.620428 2568 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-cert\") pod \"ingress-canary-dcxhn\" (UID: \"2a9408e4-8f5b-4f8e-b756-2d1f084e06a8\") " pod="openshift-ingress-canary/ingress-canary-dcxhn" Apr 16 16:50:27.620462 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:27.620464 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3e2cc599-5a01-4b24-a80c-87b34418e1b6-metrics-tls\") pod \"dns-default-p4bb2\" (UID: \"3e2cc599-5a01-4b24-a80c-87b34418e1b6\") " pod="openshift-dns/dns-default-p4bb2" Apr 16 16:50:27.622566 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:27.622542 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3e2cc599-5a01-4b24-a80c-87b34418e1b6-metrics-tls\") pod \"dns-default-p4bb2\" (UID: \"3e2cc599-5a01-4b24-a80c-87b34418e1b6\") " pod="openshift-dns/dns-default-p4bb2" Apr 16 16:50:27.622806 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:27.622788 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2a9408e4-8f5b-4f8e-b756-2d1f084e06a8-cert\") pod \"ingress-canary-dcxhn\" (UID: \"2a9408e4-8f5b-4f8e-b756-2d1f084e06a8\") " pod="openshift-ingress-canary/ingress-canary-dcxhn" Apr 16 16:50:27.836017 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:27.835988 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tjmzf" event={"ID":"406fd5cc-e864-49b6-a250-066e0a57b355","Type":"ContainerStarted","Data":"14bcdff9b3efc68550b7227a652d92851b28549b90936c8943c21088aa377f39"} Apr 16 16:50:27.857587 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:27.857546 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-tjmzf" podStartSLOduration=1.113459995 
podStartE2EDuration="2.857533017s" podCreationTimestamp="2026-04-16 16:50:25 +0000 UTC" firstStartedPulling="2026-04-16 16:50:25.911583062 +0000 UTC m=+159.120355352" lastFinishedPulling="2026-04-16 16:50:27.655656085 +0000 UTC m=+160.864428374" observedRunningTime="2026-04-16 16:50:27.856861671 +0000 UTC m=+161.065633995" watchObservedRunningTime="2026-04-16 16:50:27.857533017 +0000 UTC m=+161.066305329" Apr 16 16:50:27.924642 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:27.924581 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-z7whv\"" Apr 16 16:50:27.929585 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:27.929570 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-p4bb2" Apr 16 16:50:28.037395 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:28.037366 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-p4bb2"] Apr 16 16:50:28.040558 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:50:28.040532 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e2cc599_5a01_4b24_a80c_87b34418e1b6.slice/crio-dba289c648eb46bda90b3aad0a9f4d1b09b008673b36140763611f27d6bf7aac WatchSource:0}: Error finding container dba289c648eb46bda90b3aad0a9f4d1b09b008673b36140763611f27d6bf7aac: Status 404 returned error can't find the container with id dba289c648eb46bda90b3aad0a9f4d1b09b008673b36140763611f27d6bf7aac Apr 16 16:50:28.283787 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:28.283761 2568 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" Apr 16 16:50:28.283900 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:28.283795 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" Apr 16 
16:50:28.284128 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:28.284113 2568 scope.go:117] "RemoveContainer" containerID="4a4354562f469dfec2ec2893541b3645c8eff22750614455bb9d65cb2c6bea54" Apr 16 16:50:28.284313 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:50:28.284286 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-d5cmb_openshift-console-operator(9738e546-8b4d-4c0f-952e-9a361c4b5f7a)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" podUID="9738e546-8b4d-4c0f-952e-9a361c4b5f7a" Apr 16 16:50:28.840978 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:28.840943 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p4bb2" event={"ID":"3e2cc599-5a01-4b24-a80c-87b34418e1b6","Type":"ContainerStarted","Data":"dba289c648eb46bda90b3aad0a9f4d1b09b008673b36140763611f27d6bf7aac"} Apr 16 16:50:29.844977 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:29.844941 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p4bb2" event={"ID":"3e2cc599-5a01-4b24-a80c-87b34418e1b6","Type":"ContainerStarted","Data":"9ec78c8fcb94a377b513c36b0ee4d8aa3f3b5b7d52e07e58161762b52ac50164"} Apr 16 16:50:29.844977 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:29.844983 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-p4bb2" event={"ID":"3e2cc599-5a01-4b24-a80c-87b34418e1b6","Type":"ContainerStarted","Data":"0a326035157ab7abf08872fc92164f1a7945af87c954062ffb393e6d3ffc2da3"} Apr 16 16:50:29.845360 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:29.845073 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-p4bb2" Apr 16 16:50:29.861908 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:29.861864 2568 pod_startup_latency_tracker.go:104] "Observed 
pod startup duration" pod="openshift-dns/dns-default-p4bb2" podStartSLOduration=129.748502556 podStartE2EDuration="2m10.861852085s" podCreationTimestamp="2026-04-16 16:48:19 +0000 UTC" firstStartedPulling="2026-04-16 16:50:28.042274963 +0000 UTC m=+161.251047253" lastFinishedPulling="2026-04-16 16:50:29.155624488 +0000 UTC m=+162.364396782" observedRunningTime="2026-04-16 16:50:29.860254615 +0000 UTC m=+163.069026926" watchObservedRunningTime="2026-04-16 16:50:29.861852085 +0000 UTC m=+163.070624391" Apr 16 16:50:34.192337 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:34.192308 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-dd9tc"] Apr 16 16:50:34.195504 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:34.195490 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-dd9tc" Apr 16 16:50:34.197946 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:34.197924 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 16 16:50:34.198946 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:34.198926 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 16:50:34.199059 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:34.198926 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 16 16:50:34.199059 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:34.198990 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 16:50:34.199059 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:34.198997 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 16:50:34.199224 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:34.198952 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-fc2md\"" Apr 16 16:50:34.203747 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:34.203726 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-dd9tc"] Apr 16 16:50:34.265567 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:34.265545 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7c06b334-8334-4e0d-bece-cf0e4c09fd87-metrics-client-ca\") pod \"prometheus-operator-78f957474d-dd9tc\" (UID: \"7c06b334-8334-4e0d-bece-cf0e4c09fd87\") " pod="openshift-monitoring/prometheus-operator-78f957474d-dd9tc" Apr 16 16:50:34.265669 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:34.265572 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7c06b334-8334-4e0d-bece-cf0e4c09fd87-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-dd9tc\" (UID: \"7c06b334-8334-4e0d-bece-cf0e4c09fd87\") " pod="openshift-monitoring/prometheus-operator-78f957474d-dd9tc" Apr 16 16:50:34.265669 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:34.265613 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jdj5\" (UniqueName: \"kubernetes.io/projected/7c06b334-8334-4e0d-bece-cf0e4c09fd87-kube-api-access-5jdj5\") pod \"prometheus-operator-78f957474d-dd9tc\" (UID: \"7c06b334-8334-4e0d-bece-cf0e4c09fd87\") " pod="openshift-monitoring/prometheus-operator-78f957474d-dd9tc" Apr 16 16:50:34.265742 ip-10-0-128-130 kubenswrapper[2568]: I0416 
16:50:34.265680 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c06b334-8334-4e0d-bece-cf0e4c09fd87-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-dd9tc\" (UID: \"7c06b334-8334-4e0d-bece-cf0e4c09fd87\") " pod="openshift-monitoring/prometheus-operator-78f957474d-dd9tc" Apr 16 16:50:34.366579 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:34.366557 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c06b334-8334-4e0d-bece-cf0e4c09fd87-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-dd9tc\" (UID: \"7c06b334-8334-4e0d-bece-cf0e4c09fd87\") " pod="openshift-monitoring/prometheus-operator-78f957474d-dd9tc" Apr 16 16:50:34.366677 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:34.366590 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7c06b334-8334-4e0d-bece-cf0e4c09fd87-metrics-client-ca\") pod \"prometheus-operator-78f957474d-dd9tc\" (UID: \"7c06b334-8334-4e0d-bece-cf0e4c09fd87\") " pod="openshift-monitoring/prometheus-operator-78f957474d-dd9tc" Apr 16 16:50:34.366677 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:34.366634 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7c06b334-8334-4e0d-bece-cf0e4c09fd87-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-dd9tc\" (UID: \"7c06b334-8334-4e0d-bece-cf0e4c09fd87\") " pod="openshift-monitoring/prometheus-operator-78f957474d-dd9tc" Apr 16 16:50:34.366677 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:34.366660 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jdj5\" (UniqueName: 
\"kubernetes.io/projected/7c06b334-8334-4e0d-bece-cf0e4c09fd87-kube-api-access-5jdj5\") pod \"prometheus-operator-78f957474d-dd9tc\" (UID: \"7c06b334-8334-4e0d-bece-cf0e4c09fd87\") " pod="openshift-monitoring/prometheus-operator-78f957474d-dd9tc" Apr 16 16:50:34.367272 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:34.367253 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7c06b334-8334-4e0d-bece-cf0e4c09fd87-metrics-client-ca\") pod \"prometheus-operator-78f957474d-dd9tc\" (UID: \"7c06b334-8334-4e0d-bece-cf0e4c09fd87\") " pod="openshift-monitoring/prometheus-operator-78f957474d-dd9tc" Apr 16 16:50:34.368969 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:34.368938 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7c06b334-8334-4e0d-bece-cf0e4c09fd87-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-dd9tc\" (UID: \"7c06b334-8334-4e0d-bece-cf0e4c09fd87\") " pod="openshift-monitoring/prometheus-operator-78f957474d-dd9tc" Apr 16 16:50:34.369042 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:34.368996 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7c06b334-8334-4e0d-bece-cf0e4c09fd87-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-dd9tc\" (UID: \"7c06b334-8334-4e0d-bece-cf0e4c09fd87\") " pod="openshift-monitoring/prometheus-operator-78f957474d-dd9tc" Apr 16 16:50:34.375080 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:34.375062 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jdj5\" (UniqueName: \"kubernetes.io/projected/7c06b334-8334-4e0d-bece-cf0e4c09fd87-kube-api-access-5jdj5\") pod \"prometheus-operator-78f957474d-dd9tc\" (UID: \"7c06b334-8334-4e0d-bece-cf0e4c09fd87\") " 
pod="openshift-monitoring/prometheus-operator-78f957474d-dd9tc" Apr 16 16:50:34.504373 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:34.504350 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-dd9tc" Apr 16 16:50:34.613915 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:34.613887 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-dd9tc"] Apr 16 16:50:34.616581 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:50:34.616555 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c06b334_8334_4e0d_bece_cf0e4c09fd87.slice/crio-51fed467db0d2823f6e522e45f72fecd90c9bedf3486ed382d521dbec5b8e2fe WatchSource:0}: Error finding container 51fed467db0d2823f6e522e45f72fecd90c9bedf3486ed382d521dbec5b8e2fe: Status 404 returned error can't find the container with id 51fed467db0d2823f6e522e45f72fecd90c9bedf3486ed382d521dbec5b8e2fe Apr 16 16:50:34.857203 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:34.857129 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-dd9tc" event={"ID":"7c06b334-8334-4e0d-bece-cf0e4c09fd87","Type":"ContainerStarted","Data":"51fed467db0d2823f6e522e45f72fecd90c9bedf3486ed382d521dbec5b8e2fe"} Apr 16 16:50:35.353751 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:35.353712 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dcxhn" Apr 16 16:50:35.358610 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:35.356956 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-s9vpx\"" Apr 16 16:50:35.364056 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:35.364030 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-dcxhn" Apr 16 16:50:35.771161 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:35.771135 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-dcxhn"] Apr 16 16:50:35.774521 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:50:35.774492 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a9408e4_8f5b_4f8e_b756_2d1f084e06a8.slice/crio-1253996fe5ad77e106dff256f8622f9388c9cbe6a7d2aefdec9004dd56798045 WatchSource:0}: Error finding container 1253996fe5ad77e106dff256f8622f9388c9cbe6a7d2aefdec9004dd56798045: Status 404 returned error can't find the container with id 1253996fe5ad77e106dff256f8622f9388c9cbe6a7d2aefdec9004dd56798045 Apr 16 16:50:35.860839 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:35.860812 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dcxhn" event={"ID":"2a9408e4-8f5b-4f8e-b756-2d1f084e06a8","Type":"ContainerStarted","Data":"1253996fe5ad77e106dff256f8622f9388c9cbe6a7d2aefdec9004dd56798045"} Apr 16 16:50:35.862344 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:35.862320 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-dd9tc" event={"ID":"7c06b334-8334-4e0d-bece-cf0e4c09fd87","Type":"ContainerStarted","Data":"872ad35d39e0472a58210cb8698186de2e2b835d8b018a04a7ac66c3fb826267"} Apr 16 16:50:35.862449 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:35.862347 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-dd9tc" event={"ID":"7c06b334-8334-4e0d-bece-cf0e4c09fd87","Type":"ContainerStarted","Data":"db9879ed3d837434fa7fcfdd5280dc54d3e46ccc804abf369d0e1a51c9a234a6"} Apr 16 16:50:35.880412 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:35.880372 2568 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-dd9tc" podStartSLOduration=0.806034644 podStartE2EDuration="1.880359576s" podCreationTimestamp="2026-04-16 16:50:34 +0000 UTC" firstStartedPulling="2026-04-16 16:50:34.61831826 +0000 UTC m=+167.827090549" lastFinishedPulling="2026-04-16 16:50:35.692643178 +0000 UTC m=+168.901415481" observedRunningTime="2026-04-16 16:50:35.879435145 +0000 UTC m=+169.088207455" watchObservedRunningTime="2026-04-16 16:50:35.880359576 +0000 UTC m=+169.089131904" Apr 16 16:50:37.630440 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.630410 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-2gdfq"] Apr 16 16:50:37.633378 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.633362 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-xmbwz"] Apr 16 16:50:37.633534 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.633515 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-2gdfq" Apr 16 16:50:37.636408 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.636388 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.637868 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.637850 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 16:50:37.637951 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.637851 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-v9dwt\"" Apr 16 16:50:37.637951 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.637890 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 16:50:37.638160 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.638148 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 16:50:37.638765 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.638742 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 16:50:37.638765 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.638757 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 16:50:37.639006 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.638993 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-hp9nk\"" Apr 16 16:50:37.639504 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.639491 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 16:50:37.649991 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.649971 2568 kubelet.go:2544] "SyncLoop UPDATE" 
source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-2gdfq"] Apr 16 16:50:37.794019 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.793981 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f8fc4c1b-8a9b-474a-ab6d-391214553bf2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xmbwz\" (UID: \"f8fc4c1b-8a9b-474a-ab6d-391214553bf2\") " pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.794019 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.794018 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f8fc4c1b-8a9b-474a-ab6d-391214553bf2-node-exporter-textfile\") pod \"node-exporter-xmbwz\" (UID: \"f8fc4c1b-8a9b-474a-ab6d-391214553bf2\") " pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.794271 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.794044 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e647148-eee0-4e1c-a69d-27b49175a4fb-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-2gdfq\" (UID: \"7e647148-eee0-4e1c-a69d-27b49175a4fb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2gdfq" Apr 16 16:50:37.794271 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.794110 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7e647148-eee0-4e1c-a69d-27b49175a4fb-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-2gdfq\" (UID: \"7e647148-eee0-4e1c-a69d-27b49175a4fb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2gdfq" Apr 16 16:50:37.794271 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.794126 2568 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f8fc4c1b-8a9b-474a-ab6d-391214553bf2-node-exporter-wtmp\") pod \"node-exporter-xmbwz\" (UID: \"f8fc4c1b-8a9b-474a-ab6d-391214553bf2\") " pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.794271 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.794144 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f8fc4c1b-8a9b-474a-ab6d-391214553bf2-root\") pod \"node-exporter-xmbwz\" (UID: \"f8fc4c1b-8a9b-474a-ab6d-391214553bf2\") " pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.794271 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.794208 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7nn9\" (UniqueName: \"kubernetes.io/projected/f8fc4c1b-8a9b-474a-ab6d-391214553bf2-kube-api-access-r7nn9\") pod \"node-exporter-xmbwz\" (UID: \"f8fc4c1b-8a9b-474a-ab6d-391214553bf2\") " pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.794271 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.794259 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f8fc4c1b-8a9b-474a-ab6d-391214553bf2-metrics-client-ca\") pod \"node-exporter-xmbwz\" (UID: \"f8fc4c1b-8a9b-474a-ab6d-391214553bf2\") " pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.794563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.794310 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4d69\" (UniqueName: \"kubernetes.io/projected/7e647148-eee0-4e1c-a69d-27b49175a4fb-kube-api-access-t4d69\") pod \"kube-state-metrics-7479c89684-2gdfq\" (UID: 
\"7e647148-eee0-4e1c-a69d-27b49175a4fb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2gdfq" Apr 16 16:50:37.794563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.794352 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f8fc4c1b-8a9b-474a-ab6d-391214553bf2-node-exporter-tls\") pod \"node-exporter-xmbwz\" (UID: \"f8fc4c1b-8a9b-474a-ab6d-391214553bf2\") " pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.794563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.794408 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f8fc4c1b-8a9b-474a-ab6d-391214553bf2-node-exporter-accelerators-collector-config\") pod \"node-exporter-xmbwz\" (UID: \"f8fc4c1b-8a9b-474a-ab6d-391214553bf2\") " pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.794563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.794440 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7e647148-eee0-4e1c-a69d-27b49175a4fb-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-2gdfq\" (UID: \"7e647148-eee0-4e1c-a69d-27b49175a4fb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2gdfq" Apr 16 16:50:37.794563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.794468 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7e647148-eee0-4e1c-a69d-27b49175a4fb-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-2gdfq\" (UID: \"7e647148-eee0-4e1c-a69d-27b49175a4fb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2gdfq" 
Apr 16 16:50:37.794563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.794495 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f8fc4c1b-8a9b-474a-ab6d-391214553bf2-sys\") pod \"node-exporter-xmbwz\" (UID: \"f8fc4c1b-8a9b-474a-ab6d-391214553bf2\") " pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.794563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.794535 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7e647148-eee0-4e1c-a69d-27b49175a4fb-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-2gdfq\" (UID: \"7e647148-eee0-4e1c-a69d-27b49175a4fb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2gdfq" Apr 16 16:50:37.868982 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.868953 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-dcxhn" event={"ID":"2a9408e4-8f5b-4f8e-b756-2d1f084e06a8","Type":"ContainerStarted","Data":"6f9a8575e457298b8d6118cc1ec0a454855ffc29f60a5279bd88b065ff530cf4"} Apr 16 16:50:37.885112 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.885070 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-dcxhn" podStartSLOduration=137.245589091 podStartE2EDuration="2m18.885058547s" podCreationTimestamp="2026-04-16 16:48:19 +0000 UTC" firstStartedPulling="2026-04-16 16:50:35.776523616 +0000 UTC m=+168.985295912" lastFinishedPulling="2026-04-16 16:50:37.415993072 +0000 UTC m=+170.624765368" observedRunningTime="2026-04-16 16:50:37.884245513 +0000 UTC m=+171.093017814" watchObservedRunningTime="2026-04-16 16:50:37.885058547 +0000 UTC m=+171.093830857" Apr 16 16:50:37.895002 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.894981 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f8fc4c1b-8a9b-474a-ab6d-391214553bf2-metrics-client-ca\") pod \"node-exporter-xmbwz\" (UID: \"f8fc4c1b-8a9b-474a-ab6d-391214553bf2\") " pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.895112 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.895020 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t4d69\" (UniqueName: \"kubernetes.io/projected/7e647148-eee0-4e1c-a69d-27b49175a4fb-kube-api-access-t4d69\") pod \"kube-state-metrics-7479c89684-2gdfq\" (UID: \"7e647148-eee0-4e1c-a69d-27b49175a4fb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2gdfq" Apr 16 16:50:37.895112 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.895041 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f8fc4c1b-8a9b-474a-ab6d-391214553bf2-node-exporter-tls\") pod \"node-exporter-xmbwz\" (UID: \"f8fc4c1b-8a9b-474a-ab6d-391214553bf2\") " pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.895223 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.895143 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f8fc4c1b-8a9b-474a-ab6d-391214553bf2-node-exporter-accelerators-collector-config\") pod \"node-exporter-xmbwz\" (UID: \"f8fc4c1b-8a9b-474a-ab6d-391214553bf2\") " pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.895223 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.895177 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7e647148-eee0-4e1c-a69d-27b49175a4fb-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-2gdfq\" (UID: 
\"7e647148-eee0-4e1c-a69d-27b49175a4fb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2gdfq" Apr 16 16:50:37.895223 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.895205 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7e647148-eee0-4e1c-a69d-27b49175a4fb-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-2gdfq\" (UID: \"7e647148-eee0-4e1c-a69d-27b49175a4fb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2gdfq" Apr 16 16:50:37.895370 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.895234 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f8fc4c1b-8a9b-474a-ab6d-391214553bf2-sys\") pod \"node-exporter-xmbwz\" (UID: \"f8fc4c1b-8a9b-474a-ab6d-391214553bf2\") " pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.895370 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.895295 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f8fc4c1b-8a9b-474a-ab6d-391214553bf2-sys\") pod \"node-exporter-xmbwz\" (UID: \"f8fc4c1b-8a9b-474a-ab6d-391214553bf2\") " pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.895370 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:50:37.895323 2568 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 16 16:50:37.895504 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:50:37.895393 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e647148-eee0-4e1c-a69d-27b49175a4fb-kube-state-metrics-tls podName:7e647148-eee0-4e1c-a69d-27b49175a4fb nodeName:}" failed. No retries permitted until 2026-04-16 16:50:38.395374332 +0000 UTC m=+171.604146623 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/7e647148-eee0-4e1c-a69d-27b49175a4fb-kube-state-metrics-tls") pod "kube-state-metrics-7479c89684-2gdfq" (UID: "7e647148-eee0-4e1c-a69d-27b49175a4fb") : secret "kube-state-metrics-tls" not found Apr 16 16:50:37.895504 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.895419 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7e647148-eee0-4e1c-a69d-27b49175a4fb-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-2gdfq\" (UID: \"7e647148-eee0-4e1c-a69d-27b49175a4fb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2gdfq" Apr 16 16:50:37.903283 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.896132 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f8fc4c1b-8a9b-474a-ab6d-391214553bf2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xmbwz\" (UID: \"f8fc4c1b-8a9b-474a-ab6d-391214553bf2\") " pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.903283 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.896191 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f8fc4c1b-8a9b-474a-ab6d-391214553bf2-node-exporter-textfile\") pod \"node-exporter-xmbwz\" (UID: \"f8fc4c1b-8a9b-474a-ab6d-391214553bf2\") " pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.903283 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.896231 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e647148-eee0-4e1c-a69d-27b49175a4fb-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-2gdfq\" (UID: 
\"7e647148-eee0-4e1c-a69d-27b49175a4fb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2gdfq" Apr 16 16:50:37.903283 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.896312 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7e647148-eee0-4e1c-a69d-27b49175a4fb-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-2gdfq\" (UID: \"7e647148-eee0-4e1c-a69d-27b49175a4fb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2gdfq" Apr 16 16:50:37.903283 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.896399 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f8fc4c1b-8a9b-474a-ab6d-391214553bf2-node-exporter-wtmp\") pod \"node-exporter-xmbwz\" (UID: \"f8fc4c1b-8a9b-474a-ab6d-391214553bf2\") " pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.903283 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.896438 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f8fc4c1b-8a9b-474a-ab6d-391214553bf2-root\") pod \"node-exporter-xmbwz\" (UID: \"f8fc4c1b-8a9b-474a-ab6d-391214553bf2\") " pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.903283 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.896553 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/f8fc4c1b-8a9b-474a-ab6d-391214553bf2-node-exporter-accelerators-collector-config\") pod \"node-exporter-xmbwz\" (UID: \"f8fc4c1b-8a9b-474a-ab6d-391214553bf2\") " pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.903283 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.896682 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/f8fc4c1b-8a9b-474a-ab6d-391214553bf2-metrics-client-ca\") pod \"node-exporter-xmbwz\" (UID: \"f8fc4c1b-8a9b-474a-ab6d-391214553bf2\") " pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.903283 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.896826 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/f8fc4c1b-8a9b-474a-ab6d-391214553bf2-node-exporter-textfile\") pod \"node-exporter-xmbwz\" (UID: \"f8fc4c1b-8a9b-474a-ab6d-391214553bf2\") " pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.903283 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.896938 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7e647148-eee0-4e1c-a69d-27b49175a4fb-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-2gdfq\" (UID: \"7e647148-eee0-4e1c-a69d-27b49175a4fb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2gdfq" Apr 16 16:50:37.903283 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.897086 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/f8fc4c1b-8a9b-474a-ab6d-391214553bf2-node-exporter-wtmp\") pod \"node-exporter-xmbwz\" (UID: \"f8fc4c1b-8a9b-474a-ab6d-391214553bf2\") " pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.903283 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.897162 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/f8fc4c1b-8a9b-474a-ab6d-391214553bf2-root\") pod \"node-exporter-xmbwz\" (UID: \"f8fc4c1b-8a9b-474a-ab6d-391214553bf2\") " pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.903283 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.897431 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7nn9\" (UniqueName: \"kubernetes.io/projected/f8fc4c1b-8a9b-474a-ab6d-391214553bf2-kube-api-access-r7nn9\") pod \"node-exporter-xmbwz\" (UID: \"f8fc4c1b-8a9b-474a-ab6d-391214553bf2\") " pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.903283 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.897945 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7e647148-eee0-4e1c-a69d-27b49175a4fb-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-2gdfq\" (UID: \"7e647148-eee0-4e1c-a69d-27b49175a4fb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2gdfq" Apr 16 16:50:37.903283 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.899943 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/f8fc4c1b-8a9b-474a-ab6d-391214553bf2-node-exporter-tls\") pod \"node-exporter-xmbwz\" (UID: \"f8fc4c1b-8a9b-474a-ab6d-391214553bf2\") " pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.903283 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.903057 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7e647148-eee0-4e1c-a69d-27b49175a4fb-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-2gdfq\" (UID: \"7e647148-eee0-4e1c-a69d-27b49175a4fb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2gdfq" Apr 16 16:50:37.904895 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.904876 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f8fc4c1b-8a9b-474a-ab6d-391214553bf2-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-xmbwz\" (UID: \"f8fc4c1b-8a9b-474a-ab6d-391214553bf2\") " 
pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.905303 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.905282 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7e647148-eee0-4e1c-a69d-27b49175a4fb-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-2gdfq\" (UID: \"7e647148-eee0-4e1c-a69d-27b49175a4fb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2gdfq" Apr 16 16:50:37.910122 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.910101 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4d69\" (UniqueName: \"kubernetes.io/projected/7e647148-eee0-4e1c-a69d-27b49175a4fb-kube-api-access-t4d69\") pod \"kube-state-metrics-7479c89684-2gdfq\" (UID: \"7e647148-eee0-4e1c-a69d-27b49175a4fb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2gdfq" Apr 16 16:50:37.910694 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.910674 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7nn9\" (UniqueName: \"kubernetes.io/projected/f8fc4c1b-8a9b-474a-ab6d-391214553bf2-kube-api-access-r7nn9\") pod \"node-exporter-xmbwz\" (UID: \"f8fc4c1b-8a9b-474a-ab6d-391214553bf2\") " pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.948543 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:37.948521 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-xmbwz" Apr 16 16:50:37.955867 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:50:37.955841 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8fc4c1b_8a9b_474a_ab6d_391214553bf2.slice/crio-5673518df90fe27f8acecb30e53dc1782dc8ad895fcfe7c7a836876563f2f96b WatchSource:0}: Error finding container 5673518df90fe27f8acecb30e53dc1782dc8ad895fcfe7c7a836876563f2f96b: Status 404 returned error can't find the container with id 5673518df90fe27f8acecb30e53dc1782dc8ad895fcfe7c7a836876563f2f96b Apr 16 16:50:38.402155 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.402117 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7e647148-eee0-4e1c-a69d-27b49175a4fb-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-2gdfq\" (UID: \"7e647148-eee0-4e1c-a69d-27b49175a4fb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2gdfq" Apr 16 16:50:38.404949 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.404924 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7e647148-eee0-4e1c-a69d-27b49175a4fb-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-2gdfq\" (UID: \"7e647148-eee0-4e1c-a69d-27b49175a4fb\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-2gdfq" Apr 16 16:50:38.543701 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.543671 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-2gdfq" Apr 16 16:50:38.639930 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.639899 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:50:38.644256 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.644221 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:38.646835 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.646811 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 16:50:38.647643 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.647621 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 16:50:38.647980 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.647961 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-fpfk2\"" Apr 16 16:50:38.648587 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.648153 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 16:50:38.648587 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.648164 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 16:50:38.648587 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.648206 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 16:50:38.648587 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.648164 2568 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 16:50:38.648587 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.648402 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 16:50:38.648587 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.648468 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 16:50:38.648587 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.648522 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 16:50:38.657751 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.657733 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:50:38.718354 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.718332 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-2gdfq"] Apr 16 16:50:38.720929 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:50:38.720902 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e647148_eee0_4e1c_a69d_27b49175a4fb.slice/crio-f8241bafd0a1ba6575c499543d22ad3f6ce1af56fb54cda074fbc816bce55105 WatchSource:0}: Error finding container f8241bafd0a1ba6575c499543d22ad3f6ce1af56fb54cda074fbc816bce55105: Status 404 returned error can't find the container with id f8241bafd0a1ba6575c499543d22ad3f6ce1af56fb54cda074fbc816bce55105 Apr 16 16:50:38.804952 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.804924 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f8e9a400-4b51-4992-a52c-107b311a987d-config-out\") pod 
\"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:38.805092 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.804959 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:38.805092 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.804978 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f8e9a400-4b51-4992-a52c-107b311a987d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:38.805092 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.805025 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f8e9a400-4b51-4992-a52c-107b311a987d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:38.805242 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.805154 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:38.805242 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.805186 2568 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-web-config\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:38.805346 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.805260 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:38.805346 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.805290 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-config-volume\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:38.805346 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.805317 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8e9a400-4b51-4992-a52c-107b311a987d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:38.805488 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.805347 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9zp62\" (UniqueName: \"kubernetes.io/projected/f8e9a400-4b51-4992-a52c-107b311a987d-kube-api-access-9zp62\") pod \"alertmanager-main-0\" (UID: 
\"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:38.805488 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.805416 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f8e9a400-4b51-4992-a52c-107b311a987d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:38.805488 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.805459 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:38.805626 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.805493 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:50:38.872922 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.872887 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-2gdfq" event={"ID":"7e647148-eee0-4e1c-a69d-27b49175a4fb","Type":"ContainerStarted","Data":"f8241bafd0a1ba6575c499543d22ad3f6ce1af56fb54cda074fbc816bce55105"} Apr 16 16:50:38.874242 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.874217 2568 generic.go:358] "Generic (PLEG): container finished" podID="f8fc4c1b-8a9b-474a-ab6d-391214553bf2" 
containerID="cb006b34bfafce92768a35485be8e454ecea124a9198e57080ffa40c199deaf0" exitCode=0
Apr 16 16:50:38.874351 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.874291 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xmbwz" event={"ID":"f8fc4c1b-8a9b-474a-ab6d-391214553bf2","Type":"ContainerDied","Data":"cb006b34bfafce92768a35485be8e454ecea124a9198e57080ffa40c199deaf0"}
Apr 16 16:50:38.874351 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.874324 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xmbwz" event={"ID":"f8fc4c1b-8a9b-474a-ab6d-391214553bf2","Type":"ContainerStarted","Data":"5673518df90fe27f8acecb30e53dc1782dc8ad895fcfe7c7a836876563f2f96b"}
Apr 16 16:50:38.906563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.906506 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:50:38.906563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.906540 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:50:38.906563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.906560 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f8e9a400-4b51-4992-a52c-107b311a987d-config-out\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:50:38.906735 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.906703 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:50:38.906773 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.906740 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f8e9a400-4b51-4992-a52c-107b311a987d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:50:38.906810 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.906770 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f8e9a400-4b51-4992-a52c-107b311a987d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:50:38.906858 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.906809 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:50:38.906858 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.906836 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-web-config\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:50:38.906944 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.906892 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:50:38.906944 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.906933 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-config-volume\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:50:38.907041 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.906957 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8e9a400-4b51-4992-a52c-107b311a987d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:50:38.907041 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.906975 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9zp62\" (UniqueName: \"kubernetes.io/projected/f8e9a400-4b51-4992-a52c-107b311a987d-kube-api-access-9zp62\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:50:38.907041 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.907007 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f8e9a400-4b51-4992-a52c-107b311a987d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:50:38.907189 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.907076 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f8e9a400-4b51-4992-a52c-107b311a987d-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:50:38.908035 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.907740 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f8e9a400-4b51-4992-a52c-107b311a987d-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:50:38.908311 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.908261 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8e9a400-4b51-4992-a52c-107b311a987d-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:50:38.909438 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.909413 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:50:38.909527 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.909478 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f8e9a400-4b51-4992-a52c-107b311a987d-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:50:38.909903 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.909875 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-config-volume\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:50:38.910031 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.910005 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-web-config\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:50:38.910102 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.910084 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f8e9a400-4b51-4992-a52c-107b311a987d-config-out\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:50:38.910454 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.910434 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:50:38.910514 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.910495 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:50:38.910660 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.910644 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:50:38.911499 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.911484 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:50:38.915818 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.915791 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zp62\" (UniqueName: \"kubernetes.io/projected/f8e9a400-4b51-4992-a52c-107b311a987d-kube-api-access-9zp62\") pod \"alertmanager-main-0\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:50:38.957028 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:38.957008 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:50:39.081672 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:39.081636 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 16:50:39.084433 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:50:39.084407 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8e9a400_4b51_4992_a52c_107b311a987d.slice/crio-34bf4d0b2c59de23a9075cf4748cad9bc612d97509b8920768ee9381a05aaa67 WatchSource:0}: Error finding container 34bf4d0b2c59de23a9075cf4748cad9bc612d97509b8920768ee9381a05aaa67: Status 404 returned error can't find the container with id 34bf4d0b2c59de23a9075cf4748cad9bc612d97509b8920768ee9381a05aaa67
Apr 16 16:50:39.353228 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:39.353193 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gd4q4"
Apr 16 16:50:39.850740 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:39.850711 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-p4bb2"
Apr 16 16:50:39.879817 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:39.879782 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xmbwz" event={"ID":"f8fc4c1b-8a9b-474a-ab6d-391214553bf2","Type":"ContainerStarted","Data":"23f38a65ca7773d5d4b3209ae090ba3775c6d6222a837e02bac4421c96be2a7f"}
Apr 16 16:50:39.879986 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:39.879828 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-xmbwz" event={"ID":"f8fc4c1b-8a9b-474a-ab6d-391214553bf2","Type":"ContainerStarted","Data":"20fc84b32d1f655106bf76196d113b6648ab0c907c9f5333ae534073eb54d579"}
Apr 16 16:50:39.881363 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:39.881322 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f8e9a400-4b51-4992-a52c-107b311a987d","Type":"ContainerStarted","Data":"34bf4d0b2c59de23a9075cf4748cad9bc612d97509b8920768ee9381a05aaa67"}
Apr 16 16:50:39.900095 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:39.900006 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-xmbwz" podStartSLOduration=2.212874194 podStartE2EDuration="2.899985346s" podCreationTimestamp="2026-04-16 16:50:37 +0000 UTC" firstStartedPulling="2026-04-16 16:50:37.957471966 +0000 UTC m=+171.166244255" lastFinishedPulling="2026-04-16 16:50:38.644583103 +0000 UTC m=+171.853355407" observedRunningTime="2026-04-16 16:50:39.89966243 +0000 UTC m=+173.108434743" watchObservedRunningTime="2026-04-16 16:50:39.899985346 +0000 UTC m=+173.108757658"
Apr 16 16:50:40.885747 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:40.885708 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-2gdfq" event={"ID":"7e647148-eee0-4e1c-a69d-27b49175a4fb","Type":"ContainerStarted","Data":"c507428baca43d5228683a2b439e3b253c9200b662c71c6df9322e4442601df4"}
Apr 16 16:50:40.886155 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:40.885753 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-2gdfq" event={"ID":"7e647148-eee0-4e1c-a69d-27b49175a4fb","Type":"ContainerStarted","Data":"cafc4ee3406cb28536b08be7a1a063f8a39cbf518470911a9c212ce820d167c7"}
Apr 16 16:50:40.886155 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:40.885770 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-2gdfq" event={"ID":"7e647148-eee0-4e1c-a69d-27b49175a4fb","Type":"ContainerStarted","Data":"15c5a95832c423ce6ee2de94074207ef0f0cba2c4b039056aa2feae96572d996"}
Apr 16 16:50:40.887025 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:40.887002 2568 generic.go:358] "Generic (PLEG): container finished" podID="f8e9a400-4b51-4992-a52c-107b311a987d" containerID="a79649e707850b6441302bb538189eb358f0dc8dd3f7e417a8cfcd38595a1994" exitCode=0
Apr 16 16:50:40.887109 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:40.887090 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f8e9a400-4b51-4992-a52c-107b311a987d","Type":"ContainerDied","Data":"a79649e707850b6441302bb538189eb358f0dc8dd3f7e417a8cfcd38595a1994"}
Apr 16 16:50:40.906468 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:40.906427 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-2gdfq" podStartSLOduration=2.558395406 podStartE2EDuration="3.906414377s" podCreationTimestamp="2026-04-16 16:50:37 +0000 UTC" firstStartedPulling="2026-04-16 16:50:38.722724385 +0000 UTC m=+171.931496675" lastFinishedPulling="2026-04-16 16:50:40.070743354 +0000 UTC m=+173.279515646" observedRunningTime="2026-04-16 16:50:40.905544415 +0000 UTC m=+174.114316726" watchObservedRunningTime="2026-04-16 16:50:40.906414377 +0000 UTC m=+174.115186688"
Apr 16 16:50:42.353780 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:42.353763 2568 scope.go:117] "RemoveContainer" containerID="4a4354562f469dfec2ec2893541b3645c8eff22750614455bb9d65cb2c6bea54"
Apr 16 16:50:42.354062 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:50:42.353924 2568 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-d87b8d5fc-d5cmb_openshift-console-operator(9738e546-8b4d-4c0f-952e-9a361c4b5f7a)\"" pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" podUID="9738e546-8b4d-4c0f-952e-9a361c4b5f7a"
Apr 16 16:50:42.896330 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:42.896296 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f8e9a400-4b51-4992-a52c-107b311a987d","Type":"ContainerStarted","Data":"66fbd33765e554af3f02aac27dcc6635a22b75495416fb5504e69162f958caa1"}
Apr 16 16:50:42.896330 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:42.896338 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f8e9a400-4b51-4992-a52c-107b311a987d","Type":"ContainerStarted","Data":"00d110d14e584d678bb8dbbef771a6fd89881b984838580ec56f798c11e3b737"}
Apr 16 16:50:42.896565 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:42.896351 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f8e9a400-4b51-4992-a52c-107b311a987d","Type":"ContainerStarted","Data":"99988cc4114ab97743da23ee22c02491779a3117fd42b8d4937057b4dd0dc7e8"}
Apr 16 16:50:42.896565 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:42.896362 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f8e9a400-4b51-4992-a52c-107b311a987d","Type":"ContainerStarted","Data":"dc5a40604fa919a7b38d6618b21afe9449d023a0b7f60d47d3d6959371f0f138"}
Apr 16 16:50:42.896565 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:42.896375 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f8e9a400-4b51-4992-a52c-107b311a987d","Type":"ContainerStarted","Data":"ec9d225586d6ffaae72322ebbd6f16d0e36821fd6e75f3b09cfe9e72fed70e57"}
Apr 16 16:50:43.901335 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:43.901297 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f8e9a400-4b51-4992-a52c-107b311a987d","Type":"ContainerStarted","Data":"78905c47fc9becc9636d67c14b401192a61f214559351f05c3a6cffeedf08428"}
Apr 16 16:50:43.936270 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:43.936203 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.0625781930000002 podStartE2EDuration="5.936184852s" podCreationTimestamp="2026-04-16 16:50:38 +0000 UTC" firstStartedPulling="2026-04-16 16:50:39.0862556 +0000 UTC m=+172.295027889" lastFinishedPulling="2026-04-16 16:50:42.959862256 +0000 UTC m=+176.168634548" observedRunningTime="2026-04-16 16:50:43.934321097 +0000 UTC m=+177.143093412" watchObservedRunningTime="2026-04-16 16:50:43.936184852 +0000 UTC m=+177.144957163"
Apr 16 16:50:56.353170 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:56.353143 2568 scope.go:117] "RemoveContainer" containerID="4a4354562f469dfec2ec2893541b3645c8eff22750614455bb9d65cb2c6bea54"
Apr 16 16:50:56.934980 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:56.934956 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-d5cmb_9738e546-8b4d-4c0f-952e-9a361c4b5f7a/console-operator/2.log"
Apr 16 16:50:56.935143 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:56.935010 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" event={"ID":"9738e546-8b4d-4c0f-952e-9a361c4b5f7a","Type":"ContainerStarted","Data":"682a1e82f7af36845111225cc513dd81c49a3cabe2a6f9383676d226e14a9b18"}
Apr 16 16:50:56.935285 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:56.935266 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb"
Apr 16 16:50:56.939960 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:56.939938 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb"
Apr 16 16:50:56.955297 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:50:56.955249 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-d87b8d5fc-d5cmb" podStartSLOduration=57.706927167 podStartE2EDuration="59.955237249s" podCreationTimestamp="2026-04-16 16:49:57 +0000 UTC" firstStartedPulling="2026-04-16 16:49:58.39101612 +0000 UTC m=+131.599788413" lastFinishedPulling="2026-04-16 16:50:00.639326203 +0000 UTC m=+133.848098495" observedRunningTime="2026-04-16 16:50:56.954420064 +0000 UTC m=+190.163192376" watchObservedRunningTime="2026-04-16 16:50:56.955237249 +0000 UTC m=+190.164009559"
Apr 16 16:51:06.962251 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:06.962218 2568 generic.go:358] "Generic (PLEG): container finished" podID="25b8c537-ca17-48e3-ab8a-0d08b4ff09b7" containerID="6b39d4c9bc69aa57cdb2336c3bb11462bfc47cb60bd497364a33969200e144c4" exitCode=0
Apr 16 16:51:06.962654 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:06.962298 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-jjnfw" event={"ID":"25b8c537-ca17-48e3-ab8a-0d08b4ff09b7","Type":"ContainerDied","Data":"6b39d4c9bc69aa57cdb2336c3bb11462bfc47cb60bd497364a33969200e144c4"}
Apr 16 16:51:06.962654 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:06.962626 2568 scope.go:117] "RemoveContainer" containerID="6b39d4c9bc69aa57cdb2336c3bb11462bfc47cb60bd497364a33969200e144c4"
Apr 16 16:51:07.342729 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:07.342645 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-65d8b4d484-2r794_41054f43-1b5b-46d6-9aab-24d8b6ff5a23/router/0.log"
Apr 16 16:51:07.521748 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:07.521722 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-dcxhn_2a9408e4-8f5b-4f8e-b756-2d1f084e06a8/serve-healthcheck-canary/0.log"
Apr 16 16:51:07.966824 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:07.966795 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-jjnfw" event={"ID":"25b8c537-ca17-48e3-ab8a-0d08b4ff09b7","Type":"ContainerStarted","Data":"3c45dc2f54fb5f13c605b6de1122dacc003caebb89fb0dcd00e3d6e9455c6977"}
Apr 16 16:51:57.815657 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:57.815624 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 16 16:51:57.816128 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:57.816031 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f8e9a400-4b51-4992-a52c-107b311a987d" containerName="alertmanager" containerID="cri-o://ec9d225586d6ffaae72322ebbd6f16d0e36821fd6e75f3b09cfe9e72fed70e57" gracePeriod=120
Apr 16 16:51:57.816128 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:57.816099 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f8e9a400-4b51-4992-a52c-107b311a987d" containerName="prom-label-proxy" containerID="cri-o://78905c47fc9becc9636d67c14b401192a61f214559351f05c3a6cffeedf08428" gracePeriod=120
Apr 16 16:51:57.816128 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:57.816084 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f8e9a400-4b51-4992-a52c-107b311a987d" containerName="kube-rbac-proxy-metric" containerID="cri-o://66fbd33765e554af3f02aac27dcc6635a22b75495416fb5504e69162f958caa1" gracePeriod=120
Apr 16 16:51:57.816290 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:57.816111 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f8e9a400-4b51-4992-a52c-107b311a987d" containerName="kube-rbac-proxy-web" containerID="cri-o://99988cc4114ab97743da23ee22c02491779a3117fd42b8d4937057b4dd0dc7e8" gracePeriod=120
Apr 16 16:51:57.816290 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:57.816202 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f8e9a400-4b51-4992-a52c-107b311a987d" containerName="config-reloader" containerID="cri-o://dc5a40604fa919a7b38d6618b21afe9449d023a0b7f60d47d3d6959371f0f138" gracePeriod=120
Apr 16 16:51:57.816290 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:57.816216 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="f8e9a400-4b51-4992-a52c-107b311a987d" containerName="kube-rbac-proxy" containerID="cri-o://00d110d14e584d678bb8dbbef771a6fd89881b984838580ec56f798c11e3b737" gracePeriod=120
Apr 16 16:51:58.106120 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:58.106044 2568 generic.go:358] "Generic (PLEG): container finished" podID="f8e9a400-4b51-4992-a52c-107b311a987d" containerID="78905c47fc9becc9636d67c14b401192a61f214559351f05c3a6cffeedf08428" exitCode=0
Apr 16 16:51:58.106120 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:58.106066 2568 generic.go:358] "Generic (PLEG): container finished" podID="f8e9a400-4b51-4992-a52c-107b311a987d" containerID="00d110d14e584d678bb8dbbef771a6fd89881b984838580ec56f798c11e3b737" exitCode=0
Apr 16 16:51:58.106120 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:58.106074 2568 generic.go:358] "Generic (PLEG): container finished" podID="f8e9a400-4b51-4992-a52c-107b311a987d" containerID="dc5a40604fa919a7b38d6618b21afe9449d023a0b7f60d47d3d6959371f0f138" exitCode=0
Apr 16 16:51:58.106120 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:58.106083 2568 generic.go:358] "Generic (PLEG): container finished" podID="f8e9a400-4b51-4992-a52c-107b311a987d" containerID="ec9d225586d6ffaae72322ebbd6f16d0e36821fd6e75f3b09cfe9e72fed70e57" exitCode=0
Apr 16 16:51:58.106120 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:58.106112 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f8e9a400-4b51-4992-a52c-107b311a987d","Type":"ContainerDied","Data":"78905c47fc9becc9636d67c14b401192a61f214559351f05c3a6cffeedf08428"}
Apr 16 16:51:58.106362 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:58.106145 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f8e9a400-4b51-4992-a52c-107b311a987d","Type":"ContainerDied","Data":"00d110d14e584d678bb8dbbef771a6fd89881b984838580ec56f798c11e3b737"}
Apr 16 16:51:58.106362 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:58.106157 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f8e9a400-4b51-4992-a52c-107b311a987d","Type":"ContainerDied","Data":"dc5a40604fa919a7b38d6618b21afe9449d023a0b7f60d47d3d6959371f0f138"}
Apr 16 16:51:58.106362 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:58.106167 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f8e9a400-4b51-4992-a52c-107b311a987d","Type":"ContainerDied","Data":"ec9d225586d6ffaae72322ebbd6f16d0e36821fd6e75f3b09cfe9e72fed70e57"}
Apr 16 16:51:59.105089 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.105057 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs\") pod \"network-metrics-daemon-gd4q4\" (UID: \"545dd230-1d90-4e1a-8615-072dd9b2d2f5\") " pod="openshift-multus/network-metrics-daemon-gd4q4"
Apr 16 16:51:59.107381 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.107360 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/545dd230-1d90-4e1a-8615-072dd9b2d2f5-metrics-certs\") pod \"network-metrics-daemon-gd4q4\" (UID: \"545dd230-1d90-4e1a-8615-072dd9b2d2f5\") " pod="openshift-multus/network-metrics-daemon-gd4q4"
Apr 16 16:51:59.117631 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.117559 2568 generic.go:358] "Generic (PLEG): container finished" podID="f8e9a400-4b51-4992-a52c-107b311a987d" containerID="66fbd33765e554af3f02aac27dcc6635a22b75495416fb5504e69162f958caa1" exitCode=0
Apr 16 16:51:59.117631 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.117589 2568 generic.go:358] "Generic (PLEG): container finished" podID="f8e9a400-4b51-4992-a52c-107b311a987d" containerID="99988cc4114ab97743da23ee22c02491779a3117fd42b8d4937057b4dd0dc7e8" exitCode=0
Apr 16 16:51:59.117806 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.117630 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f8e9a400-4b51-4992-a52c-107b311a987d","Type":"ContainerDied","Data":"66fbd33765e554af3f02aac27dcc6635a22b75495416fb5504e69162f958caa1"}
Apr 16 16:51:59.117806 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.117659 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f8e9a400-4b51-4992-a52c-107b311a987d","Type":"ContainerDied","Data":"99988cc4114ab97743da23ee22c02491779a3117fd42b8d4937057b4dd0dc7e8"}
Apr 16 16:51:59.156068 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.156051 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 16 16:51:59.156183 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.156169 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-fz45n\""
Apr 16 16:51:59.164536 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.164520 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-gd4q4"
Apr 16 16:51:59.282306 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.282281 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-gd4q4"]
Apr 16 16:51:59.284754 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:51:59.284730 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod545dd230_1d90_4e1a_8615_072dd9b2d2f5.slice/crio-6e9f51e57b8eb05be8cccab0614aff2b529da517b5d8bd39076afdfe9d40021a WatchSource:0}: Error finding container 6e9f51e57b8eb05be8cccab0614aff2b529da517b5d8bd39076afdfe9d40021a: Status 404 returned error can't find the container with id 6e9f51e57b8eb05be8cccab0614aff2b529da517b5d8bd39076afdfe9d40021a
Apr 16 16:51:59.306910 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.306884 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f8e9a400-4b51-4992-a52c-107b311a987d-alertmanager-main-db\") pod \"f8e9a400-4b51-4992-a52c-107b311a987d\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") "
Apr 16 16:51:59.307042 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.306921 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-secret-alertmanager-main-tls\") pod \"f8e9a400-4b51-4992-a52c-107b311a987d\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") "
Apr 16 16:51:59.307042 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.306939 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-secret-alertmanager-kube-rbac-proxy\") pod \"f8e9a400-4b51-4992-a52c-107b311a987d\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") "
Apr 16 16:51:59.307042 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.306970 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-secret-alertmanager-kube-rbac-proxy-web\") pod \"f8e9a400-4b51-4992-a52c-107b311a987d\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") "
Apr 16 16:51:59.307209 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.307112 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f8e9a400-4b51-4992-a52c-107b311a987d-metrics-client-ca\") pod \"f8e9a400-4b51-4992-a52c-107b311a987d\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") "
Apr 16 16:51:59.307209 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.307168 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f8e9a400-4b51-4992-a52c-107b311a987d-config-out\") pod \"f8e9a400-4b51-4992-a52c-107b311a987d\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") "
Apr 16 16:51:59.307209 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.307196 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8e9a400-4b51-4992-a52c-107b311a987d-alertmanager-trusted-ca-bundle\") pod \"f8e9a400-4b51-4992-a52c-107b311a987d\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") "
Apr 16 16:51:59.307351 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.307226 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9zp62\" (UniqueName: \"kubernetes.io/projected/f8e9a400-4b51-4992-a52c-107b311a987d-kube-api-access-9zp62\") pod \"f8e9a400-4b51-4992-a52c-107b311a987d\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") "
Apr 16 16:51:59.307351 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.307248 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8e9a400-4b51-4992-a52c-107b311a987d-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "f8e9a400-4b51-4992-a52c-107b311a987d" (UID: "f8e9a400-4b51-4992-a52c-107b311a987d"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 16:51:59.307351 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.307259 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-cluster-tls-config\") pod \"f8e9a400-4b51-4992-a52c-107b311a987d\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") "
Apr 16 16:51:59.307351 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.307290 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f8e9a400-4b51-4992-a52c-107b311a987d-tls-assets\") pod \"f8e9a400-4b51-4992-a52c-107b311a987d\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") "
Apr 16 16:51:59.307351 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.307330 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-web-config\") pod \"f8e9a400-4b51-4992-a52c-107b311a987d\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") "
Apr 16 16:51:59.307627 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.307369 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-config-volume\") pod \"f8e9a400-4b51-4992-a52c-107b311a987d\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") "
Apr 16 16:51:59.307627 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.307397 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-secret-alertmanager-kube-rbac-proxy-metric\") pod \"f8e9a400-4b51-4992-a52c-107b311a987d\" (UID: \"f8e9a400-4b51-4992-a52c-107b311a987d\") "
Apr 16 16:51:59.307627 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.307553 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8e9a400-4b51-4992-a52c-107b311a987d-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "f8e9a400-4b51-4992-a52c-107b311a987d" (UID: "f8e9a400-4b51-4992-a52c-107b311a987d"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:51:59.307790 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.307648 2568 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f8e9a400-4b51-4992-a52c-107b311a987d-alertmanager-main-db\") on node \"ip-10-0-128-130.ec2.internal\" DevicePath \"\""
Apr 16 16:51:59.307790 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.307666 2568 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f8e9a400-4b51-4992-a52c-107b311a987d-metrics-client-ca\") on node \"ip-10-0-128-130.ec2.internal\" DevicePath \"\""
Apr 16 16:51:59.307790 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.307677 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8e9a400-4b51-4992-a52c-107b311a987d-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "f8e9a400-4b51-4992-a52c-107b311a987d" (UID: "f8e9a400-4b51-4992-a52c-107b311a987d"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 16:51:59.309826 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.309782 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "f8e9a400-4b51-4992-a52c-107b311a987d" (UID: "f8e9a400-4b51-4992-a52c-107b311a987d"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:51:59.310073 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.310040 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "f8e9a400-4b51-4992-a52c-107b311a987d" (UID: "f8e9a400-4b51-4992-a52c-107b311a987d"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 16:51:59.310169 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.310142 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "f8e9a400-4b51-4992-a52c-107b311a987d" (UID: "f8e9a400-4b51-4992-a52c-107b311a987d"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:51:59.310317 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.310295 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "f8e9a400-4b51-4992-a52c-107b311a987d" (UID: "f8e9a400-4b51-4992-a52c-107b311a987d"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:51:59.310476 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.310454 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8e9a400-4b51-4992-a52c-107b311a987d-kube-api-access-9zp62" (OuterVolumeSpecName: "kube-api-access-9zp62") pod "f8e9a400-4b51-4992-a52c-107b311a987d" (UID: "f8e9a400-4b51-4992-a52c-107b311a987d"). InnerVolumeSpecName "kube-api-access-9zp62". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:51:59.310670 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.310654 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8e9a400-4b51-4992-a52c-107b311a987d-config-out" (OuterVolumeSpecName: "config-out") pod "f8e9a400-4b51-4992-a52c-107b311a987d" (UID: "f8e9a400-4b51-4992-a52c-107b311a987d"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:51:59.310809 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.310788 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8e9a400-4b51-4992-a52c-107b311a987d-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f8e9a400-4b51-4992-a52c-107b311a987d" (UID: "f8e9a400-4b51-4992-a52c-107b311a987d"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:51:59.311295 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.311283 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-config-volume" (OuterVolumeSpecName: "config-volume") pod "f8e9a400-4b51-4992-a52c-107b311a987d" (UID: "f8e9a400-4b51-4992-a52c-107b311a987d"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:51:59.313861 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.313832 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "f8e9a400-4b51-4992-a52c-107b311a987d" (UID: "f8e9a400-4b51-4992-a52c-107b311a987d"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:51:59.319536 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.319517 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-web-config" (OuterVolumeSpecName: "web-config") pod "f8e9a400-4b51-4992-a52c-107b311a987d" (UID: "f8e9a400-4b51-4992-a52c-107b311a987d"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:51:59.408749 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.408690 2568 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f8e9a400-4b51-4992-a52c-107b311a987d-config-out\") on node \"ip-10-0-128-130.ec2.internal\" DevicePath \"\"" Apr 16 16:51:59.408749 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.408712 2568 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8e9a400-4b51-4992-a52c-107b311a987d-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-128-130.ec2.internal\" DevicePath \"\"" Apr 16 16:51:59.408749 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.408725 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9zp62\" (UniqueName: \"kubernetes.io/projected/f8e9a400-4b51-4992-a52c-107b311a987d-kube-api-access-9zp62\") on node \"ip-10-0-128-130.ec2.internal\" DevicePath \"\"" Apr 16 16:51:59.408749 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.408734 2568 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-cluster-tls-config\") on node \"ip-10-0-128-130.ec2.internal\" DevicePath \"\"" Apr 16 16:51:59.408749 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.408743 2568 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f8e9a400-4b51-4992-a52c-107b311a987d-tls-assets\") on node \"ip-10-0-128-130.ec2.internal\" DevicePath \"\"" Apr 16 16:51:59.408749 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.408752 2568 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-web-config\") on node \"ip-10-0-128-130.ec2.internal\" DevicePath \"\"" Apr 16 16:51:59.408988 ip-10-0-128-130 
kubenswrapper[2568]: I0416 16:51:59.408760 2568 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-config-volume\") on node \"ip-10-0-128-130.ec2.internal\" DevicePath \"\"" Apr 16 16:51:59.408988 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.408769 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-128-130.ec2.internal\" DevicePath \"\"" Apr 16 16:51:59.408988 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.408778 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-secret-alertmanager-main-tls\") on node \"ip-10-0-128-130.ec2.internal\" DevicePath \"\"" Apr 16 16:51:59.408988 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.408787 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-128-130.ec2.internal\" DevicePath \"\"" Apr 16 16:51:59.408988 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:51:59.408796 2568 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f8e9a400-4b51-4992-a52c-107b311a987d-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-128-130.ec2.internal\" DevicePath \"\"" Apr 16 16:52:00.121941 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.121857 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gd4q4" 
event={"ID":"545dd230-1d90-4e1a-8615-072dd9b2d2f5","Type":"ContainerStarted","Data":"6e9f51e57b8eb05be8cccab0614aff2b529da517b5d8bd39076afdfe9d40021a"} Apr 16 16:52:00.124821 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.124792 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f8e9a400-4b51-4992-a52c-107b311a987d","Type":"ContainerDied","Data":"34bf4d0b2c59de23a9075cf4748cad9bc612d97509b8920768ee9381a05aaa67"} Apr 16 16:52:00.124968 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.124839 2568 scope.go:117] "RemoveContainer" containerID="78905c47fc9becc9636d67c14b401192a61f214559351f05c3a6cffeedf08428" Apr 16 16:52:00.124968 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.124877 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.143964 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.143903 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:52:00.148260 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.148225 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:52:00.169865 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.169840 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:52:00.170174 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.170159 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8e9a400-4b51-4992-a52c-107b311a987d" containerName="config-reloader" Apr 16 16:52:00.170174 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.170175 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e9a400-4b51-4992-a52c-107b311a987d" containerName="config-reloader" Apr 16 16:52:00.170314 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.170187 2568 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8e9a400-4b51-4992-a52c-107b311a987d" containerName="alertmanager" Apr 16 16:52:00.170314 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.170192 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e9a400-4b51-4992-a52c-107b311a987d" containerName="alertmanager" Apr 16 16:52:00.170314 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.170205 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8e9a400-4b51-4992-a52c-107b311a987d" containerName="kube-rbac-proxy-web" Apr 16 16:52:00.170314 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.170211 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e9a400-4b51-4992-a52c-107b311a987d" containerName="kube-rbac-proxy-web" Apr 16 16:52:00.170314 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.170220 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8e9a400-4b51-4992-a52c-107b311a987d" containerName="kube-rbac-proxy" Apr 16 16:52:00.170314 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.170228 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e9a400-4b51-4992-a52c-107b311a987d" containerName="kube-rbac-proxy" Apr 16 16:52:00.170314 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.170238 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8e9a400-4b51-4992-a52c-107b311a987d" containerName="kube-rbac-proxy-metric" Apr 16 16:52:00.170314 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.170246 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e9a400-4b51-4992-a52c-107b311a987d" containerName="kube-rbac-proxy-metric" Apr 16 16:52:00.170314 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.170259 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8e9a400-4b51-4992-a52c-107b311a987d" containerName="init-config-reloader" Apr 16 16:52:00.170314 
ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.170268 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e9a400-4b51-4992-a52c-107b311a987d" containerName="init-config-reloader" Apr 16 16:52:00.170314 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.170277 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8e9a400-4b51-4992-a52c-107b311a987d" containerName="prom-label-proxy" Apr 16 16:52:00.170314 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.170285 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8e9a400-4b51-4992-a52c-107b311a987d" containerName="prom-label-proxy" Apr 16 16:52:00.170781 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.170345 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8e9a400-4b51-4992-a52c-107b311a987d" containerName="alertmanager" Apr 16 16:52:00.170781 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.170354 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8e9a400-4b51-4992-a52c-107b311a987d" containerName="config-reloader" Apr 16 16:52:00.170781 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.170362 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8e9a400-4b51-4992-a52c-107b311a987d" containerName="kube-rbac-proxy-web" Apr 16 16:52:00.170781 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.170368 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8e9a400-4b51-4992-a52c-107b311a987d" containerName="kube-rbac-proxy" Apr 16 16:52:00.170781 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.170375 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8e9a400-4b51-4992-a52c-107b311a987d" containerName="kube-rbac-proxy-metric" Apr 16 16:52:00.170781 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.170383 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8e9a400-4b51-4992-a52c-107b311a987d" containerName="prom-label-proxy" Apr 16 
16:52:00.174023 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.173998 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.176481 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.176350 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 16 16:52:00.176481 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.176350 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 16 16:52:00.176672 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.176515 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 16 16:52:00.176735 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.176700 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-fpfk2\"" Apr 16 16:52:00.176783 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.176759 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 16 16:52:00.176831 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.176780 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 16 16:52:00.176831 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.176790 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 16 16:52:00.176916 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.176882 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 16 16:52:00.177033 
ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.177015 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 16 16:52:00.181432 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.181412 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 16 16:52:00.187510 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.187488 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:52:00.289365 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.289344 2568 scope.go:117] "RemoveContainer" containerID="66fbd33765e554af3f02aac27dcc6635a22b75495416fb5504e69162f958caa1" Apr 16 16:52:00.295991 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.295969 2568 scope.go:117] "RemoveContainer" containerID="00d110d14e584d678bb8dbbef771a6fd89881b984838580ec56f798c11e3b737" Apr 16 16:52:00.302224 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.302208 2568 scope.go:117] "RemoveContainer" containerID="99988cc4114ab97743da23ee22c02491779a3117fd42b8d4937057b4dd0dc7e8" Apr 16 16:52:00.308102 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.308085 2568 scope.go:117] "RemoveContainer" containerID="dc5a40604fa919a7b38d6618b21afe9449d023a0b7f60d47d3d6959371f0f138" Apr 16 16:52:00.316185 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.316167 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.316262 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.316192 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.316262 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.316210 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.316262 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.316226 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-web-config\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.316262 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.316244 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-config-volume\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.316431 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.316301 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.316431 
ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.316363 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.316431 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.316420 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.316548 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.316441 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-config-out\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.316548 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.316461 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.316548 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.316477 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.316548 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.316497 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.316548 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.316545 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sznzj\" (UniqueName: \"kubernetes.io/projected/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-kube-api-access-sznzj\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.332349 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.332330 2568 scope.go:117] "RemoveContainer" containerID="ec9d225586d6ffaae72322ebbd6f16d0e36821fd6e75f3b09cfe9e72fed70e57" Apr 16 16:52:00.338492 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.338475 2568 scope.go:117] "RemoveContainer" containerID="a79649e707850b6441302bb538189eb358f0dc8dd3f7e417a8cfcd38595a1994" Apr 16 16:52:00.416973 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.416949 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sznzj\" (UniqueName: \"kubernetes.io/projected/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-kube-api-access-sznzj\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.417109 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.417005 2568 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.417109 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.417031 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.417109 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.417057 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.417109 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.417084 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-web-config\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.417307 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.417110 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-config-volume\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.417307 
ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.417133 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.417307 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.417159 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.417307 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.417199 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.417307 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.417230 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-config-out\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.417307 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.417264 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: 
\"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.417307 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.417292 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.417664 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.417320 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.418092 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.417806 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.418092 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.417843 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.418433 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.418408 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.420245 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.420187 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-tls-assets\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.420333 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.420298 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-config-volume\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.420781 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.420749 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.420952 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.420924 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-config-out\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.421035 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.421018 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" 
(UniqueName: \"kubernetes.io/secret/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.421205 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.421186 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.421278 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.421186 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.421502 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.421482 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-web-config\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.422341 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.422322 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.424483 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.424460 2568 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sznzj\" (UniqueName: \"kubernetes.io/projected/a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64-kube-api-access-sznzj\") pod \"alertmanager-main-0\" (UID: \"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64\") " pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.485834 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.485810 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 16 16:52:00.624726 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:00.624701 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 16 16:52:00.626309 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:52:00.626277 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1ab25dd_1a13_4c31_8f4f_4f9b5c5ccd64.slice/crio-1780f55a9d87ec6fc9fdc108321bd4984092ecbc285c14f491d03c6c70dd3028 WatchSource:0}: Error finding container 1780f55a9d87ec6fc9fdc108321bd4984092ecbc285c14f491d03c6c70dd3028: Status 404 returned error can't find the container with id 1780f55a9d87ec6fc9fdc108321bd4984092ecbc285c14f491d03c6c70dd3028 Apr 16 16:52:01.129513 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:01.129477 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gd4q4" event={"ID":"545dd230-1d90-4e1a-8615-072dd9b2d2f5","Type":"ContainerStarted","Data":"7b5d8befd1ee2d6ce600be215f23e786767738ede18563c7dae6e4a61e9be104"} Apr 16 16:52:01.130060 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:01.130035 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-gd4q4" event={"ID":"545dd230-1d90-4e1a-8615-072dd9b2d2f5","Type":"ContainerStarted","Data":"ff3793aa5538d822852d5a1a5a1edfdb564491a7fa4926b317d7c397ade09426"} Apr 16 16:52:01.132082 ip-10-0-128-130 
kubenswrapper[2568]: I0416 16:52:01.132058 2568 generic.go:358] "Generic (PLEG): container finished" podID="a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64" containerID="0f64ebc53133e0594962425d86dd8539a2680bbc74cf33ab56464d4eae7d64e0" exitCode=0 Apr 16 16:52:01.132211 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:01.132153 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64","Type":"ContainerDied","Data":"0f64ebc53133e0594962425d86dd8539a2680bbc74cf33ab56464d4eae7d64e0"} Apr 16 16:52:01.132211 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:01.132191 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64","Type":"ContainerStarted","Data":"1780f55a9d87ec6fc9fdc108321bd4984092ecbc285c14f491d03c6c70dd3028"} Apr 16 16:52:01.148784 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:01.148747 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-gd4q4" podStartSLOduration=253.093124316 podStartE2EDuration="4m14.14873459s" podCreationTimestamp="2026-04-16 16:47:47 +0000 UTC" firstStartedPulling="2026-04-16 16:51:59.28640202 +0000 UTC m=+252.495174320" lastFinishedPulling="2026-04-16 16:52:00.34201229 +0000 UTC m=+253.550784594" observedRunningTime="2026-04-16 16:52:01.146818033 +0000 UTC m=+254.355590347" watchObservedRunningTime="2026-04-16 16:52:01.14873459 +0000 UTC m=+254.357506902" Apr 16 16:52:01.358303 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:01.358278 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8e9a400-4b51-4992-a52c-107b311a987d" path="/var/lib/kubelet/pods/f8e9a400-4b51-4992-a52c-107b311a987d/volumes" Apr 16 16:52:01.870928 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:01.870891 2568 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95"] Apr 16 16:52:01.874699 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:01.874672 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" Apr 16 16:52:01.878748 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:01.878724 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 16 16:52:01.878748 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:01.878743 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 16 16:52:01.878931 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:01.878724 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 16 16:52:01.879355 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:01.879337 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 16 16:52:01.879408 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:01.879389 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-8gghf\"" Apr 16 16:52:01.880151 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:01.880129 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 16 16:52:01.884783 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:01.884764 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 16 16:52:01.891508 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:01.891485 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95"] Apr 16 16:52:02.031180 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.031145 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d0138511-2f89-4a9b-b233-0c90f6ac88af-federate-client-tls\") pod \"telemeter-client-5fd8c9f77b-cbb95\" (UID: \"d0138511-2f89-4a9b-b233-0c90f6ac88af\") " pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" Apr 16 16:52:02.031418 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.031398 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tjwsz\" (UniqueName: \"kubernetes.io/projected/d0138511-2f89-4a9b-b233-0c90f6ac88af-kube-api-access-tjwsz\") pod \"telemeter-client-5fd8c9f77b-cbb95\" (UID: \"d0138511-2f89-4a9b-b233-0c90f6ac88af\") " pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" Apr 16 16:52:02.031492 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.031440 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d0138511-2f89-4a9b-b233-0c90f6ac88af-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5fd8c9f77b-cbb95\" (UID: \"d0138511-2f89-4a9b-b233-0c90f6ac88af\") " pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" Apr 16 16:52:02.031492 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.031465 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d0138511-2f89-4a9b-b233-0c90f6ac88af-metrics-client-ca\") pod \"telemeter-client-5fd8c9f77b-cbb95\" (UID: \"d0138511-2f89-4a9b-b233-0c90f6ac88af\") " pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" Apr 16 16:52:02.031616 ip-10-0-128-130 
kubenswrapper[2568]: I0416 16:52:02.031495 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0138511-2f89-4a9b-b233-0c90f6ac88af-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5fd8c9f77b-cbb95\" (UID: \"d0138511-2f89-4a9b-b233-0c90f6ac88af\") " pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" Apr 16 16:52:02.031675 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.031633 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d0138511-2f89-4a9b-b233-0c90f6ac88af-secret-telemeter-client\") pod \"telemeter-client-5fd8c9f77b-cbb95\" (UID: \"d0138511-2f89-4a9b-b233-0c90f6ac88af\") " pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" Apr 16 16:52:02.031724 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.031705 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d0138511-2f89-4a9b-b233-0c90f6ac88af-telemeter-client-tls\") pod \"telemeter-client-5fd8c9f77b-cbb95\" (UID: \"d0138511-2f89-4a9b-b233-0c90f6ac88af\") " pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" Apr 16 16:52:02.031774 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.031739 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0138511-2f89-4a9b-b233-0c90f6ac88af-serving-certs-ca-bundle\") pod \"telemeter-client-5fd8c9f77b-cbb95\" (UID: \"d0138511-2f89-4a9b-b233-0c90f6ac88af\") " pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" Apr 16 16:52:02.132740 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.132666 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d0138511-2f89-4a9b-b233-0c90f6ac88af-telemeter-client-tls\") pod \"telemeter-client-5fd8c9f77b-cbb95\" (UID: \"d0138511-2f89-4a9b-b233-0c90f6ac88af\") " pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" Apr 16 16:52:02.132740 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.132703 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0138511-2f89-4a9b-b233-0c90f6ac88af-serving-certs-ca-bundle\") pod \"telemeter-client-5fd8c9f77b-cbb95\" (UID: \"d0138511-2f89-4a9b-b233-0c90f6ac88af\") " pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" Apr 16 16:52:02.132740 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.132724 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d0138511-2f89-4a9b-b233-0c90f6ac88af-federate-client-tls\") pod \"telemeter-client-5fd8c9f77b-cbb95\" (UID: \"d0138511-2f89-4a9b-b233-0c90f6ac88af\") " pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" Apr 16 16:52:02.132740 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.132739 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tjwsz\" (UniqueName: \"kubernetes.io/projected/d0138511-2f89-4a9b-b233-0c90f6ac88af-kube-api-access-tjwsz\") pod \"telemeter-client-5fd8c9f77b-cbb95\" (UID: \"d0138511-2f89-4a9b-b233-0c90f6ac88af\") " pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" Apr 16 16:52:02.133304 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.132760 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d0138511-2f89-4a9b-b233-0c90f6ac88af-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5fd8c9f77b-cbb95\" (UID: 
\"d0138511-2f89-4a9b-b233-0c90f6ac88af\") " pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" Apr 16 16:52:02.133304 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.132958 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d0138511-2f89-4a9b-b233-0c90f6ac88af-metrics-client-ca\") pod \"telemeter-client-5fd8c9f77b-cbb95\" (UID: \"d0138511-2f89-4a9b-b233-0c90f6ac88af\") " pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" Apr 16 16:52:02.133304 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.132999 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0138511-2f89-4a9b-b233-0c90f6ac88af-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5fd8c9f77b-cbb95\" (UID: \"d0138511-2f89-4a9b-b233-0c90f6ac88af\") " pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" Apr 16 16:52:02.133304 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.133095 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d0138511-2f89-4a9b-b233-0c90f6ac88af-secret-telemeter-client\") pod \"telemeter-client-5fd8c9f77b-cbb95\" (UID: \"d0138511-2f89-4a9b-b233-0c90f6ac88af\") " pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" Apr 16 16:52:02.133507 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.133480 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0138511-2f89-4a9b-b233-0c90f6ac88af-serving-certs-ca-bundle\") pod \"telemeter-client-5fd8c9f77b-cbb95\" (UID: \"d0138511-2f89-4a9b-b233-0c90f6ac88af\") " pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" Apr 16 16:52:02.133676 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.133648 2568 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d0138511-2f89-4a9b-b233-0c90f6ac88af-metrics-client-ca\") pod \"telemeter-client-5fd8c9f77b-cbb95\" (UID: \"d0138511-2f89-4a9b-b233-0c90f6ac88af\") " pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" Apr 16 16:52:02.134342 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.134307 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d0138511-2f89-4a9b-b233-0c90f6ac88af-telemeter-trusted-ca-bundle\") pod \"telemeter-client-5fd8c9f77b-cbb95\" (UID: \"d0138511-2f89-4a9b-b233-0c90f6ac88af\") " pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" Apr 16 16:52:02.135753 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.135727 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d0138511-2f89-4a9b-b233-0c90f6ac88af-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-5fd8c9f77b-cbb95\" (UID: \"d0138511-2f89-4a9b-b233-0c90f6ac88af\") " pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" Apr 16 16:52:02.136047 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.136022 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/d0138511-2f89-4a9b-b233-0c90f6ac88af-secret-telemeter-client\") pod \"telemeter-client-5fd8c9f77b-cbb95\" (UID: \"d0138511-2f89-4a9b-b233-0c90f6ac88af\") " pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" Apr 16 16:52:02.136475 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.136452 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/d0138511-2f89-4a9b-b233-0c90f6ac88af-federate-client-tls\") pod \"telemeter-client-5fd8c9f77b-cbb95\" (UID: 
\"d0138511-2f89-4a9b-b233-0c90f6ac88af\") " pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" Apr 16 16:52:02.137788 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.137768 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/d0138511-2f89-4a9b-b233-0c90f6ac88af-telemeter-client-tls\") pod \"telemeter-client-5fd8c9f77b-cbb95\" (UID: \"d0138511-2f89-4a9b-b233-0c90f6ac88af\") " pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" Apr 16 16:52:02.141102 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.141076 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64","Type":"ContainerStarted","Data":"cf83aee96c226ad186bab6251069472ab3edf262d69af7ff586c74b7fb8d788e"} Apr 16 16:52:02.141212 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.141122 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64","Type":"ContainerStarted","Data":"e47181448876c98c8519145cf192de07588e293e64737b511667894f1da02cf8"} Apr 16 16:52:02.141212 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.141135 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64","Type":"ContainerStarted","Data":"30ba582fa28995058c69bd8d7ba07e846126f9eba90bc808de3bc70188e37504"} Apr 16 16:52:02.141212 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.141149 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64","Type":"ContainerStarted","Data":"69293a7ce725e3aa5c62ef2310a96d2cb8f7ce316da69781da56123f87f3aa4a"} Apr 16 16:52:02.141212 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.141162 2568 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64","Type":"ContainerStarted","Data":"a9c380ec8d70edc20049ceafbba045a041ae6f2304841223af6a8e21f0a09d72"} Apr 16 16:52:02.141212 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.141172 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64","Type":"ContainerStarted","Data":"ad080fec6c201cee00c784813beb8efe9715933561bd56fe20beaad85e8cb656"} Apr 16 16:52:02.149192 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.149170 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tjwsz\" (UniqueName: \"kubernetes.io/projected/d0138511-2f89-4a9b-b233-0c90f6ac88af-kube-api-access-tjwsz\") pod \"telemeter-client-5fd8c9f77b-cbb95\" (UID: \"d0138511-2f89-4a9b-b233-0c90f6ac88af\") " pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" Apr 16 16:52:02.181548 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.179043 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.179018811 podStartE2EDuration="2.179018811s" podCreationTimestamp="2026-04-16 16:52:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:52:02.172542571 +0000 UTC m=+255.381314887" watchObservedRunningTime="2026-04-16 16:52:02.179018811 +0000 UTC m=+255.387791124" Apr 16 16:52:02.184833 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.184802 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" Apr 16 16:52:02.316164 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:02.316128 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95"] Apr 16 16:52:02.319263 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:52:02.319235 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0138511_2f89_4a9b_b233_0c90f6ac88af.slice/crio-4d405fb2549360eb70c2d1699490052edcb2dfb250d96ef603ad93199ec6c569 WatchSource:0}: Error finding container 4d405fb2549360eb70c2d1699490052edcb2dfb250d96ef603ad93199ec6c569: Status 404 returned error can't find the container with id 4d405fb2549360eb70c2d1699490052edcb2dfb250d96ef603ad93199ec6c569 Apr 16 16:52:03.145538 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:03.145502 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" event={"ID":"d0138511-2f89-4a9b-b233-0c90f6ac88af","Type":"ContainerStarted","Data":"4d405fb2549360eb70c2d1699490052edcb2dfb250d96ef603ad93199ec6c569"} Apr 16 16:52:04.149891 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:04.149856 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" event={"ID":"d0138511-2f89-4a9b-b233-0c90f6ac88af","Type":"ContainerStarted","Data":"d8f92c198e7b0277026dadc81cafdedc25b051e0d4a079a90fe069cda06d3743"} Apr 16 16:52:04.149891 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:04.149893 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" event={"ID":"d0138511-2f89-4a9b-b233-0c90f6ac88af","Type":"ContainerStarted","Data":"bf34c0967c63cf9622356ad2470ba50b5bc37146d8aa910d8aaf448e30f5f6e4"} Apr 16 16:52:04.150282 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:04.149902 2568 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" event={"ID":"d0138511-2f89-4a9b-b233-0c90f6ac88af","Type":"ContainerStarted","Data":"f06a327e4e2234642326198d46b1d87fc511497d3bdb868dc2f3e0c4b2517707"} Apr 16 16:52:04.171647 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:04.171560 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-5fd8c9f77b-cbb95" podStartSLOduration=1.788630912 podStartE2EDuration="3.171543678s" podCreationTimestamp="2026-04-16 16:52:01 +0000 UTC" firstStartedPulling="2026-04-16 16:52:02.321085029 +0000 UTC m=+255.529857318" lastFinishedPulling="2026-04-16 16:52:03.70399779 +0000 UTC m=+256.912770084" observedRunningTime="2026-04-16 16:52:04.170119396 +0000 UTC m=+257.378891707" watchObservedRunningTime="2026-04-16 16:52:04.171543678 +0000 UTC m=+257.380316000" Apr 16 16:52:47.251689 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:47.251662 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-d5cmb_9738e546-8b4d-4c0f-952e-9a361c4b5f7a/console-operator/2.log" Apr 16 16:52:47.252071 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:52:47.251942 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-d5cmb_9738e546-8b4d-4c0f-952e-9a361c4b5f7a/console-operator/2.log" Apr 16 16:54:02.412260 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:02.412190 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-8vlsd"] Apr 16 16:54:02.414393 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:02.414374 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-8vlsd" Apr 16 16:54:02.416897 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:02.416878 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 16:54:02.424226 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:02.424202 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8vlsd"] Apr 16 16:54:02.559784 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:02.559751 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/be58460c-a5db-43bc-bafb-93a72415c5ea-dbus\") pod \"global-pull-secret-syncer-8vlsd\" (UID: \"be58460c-a5db-43bc-bafb-93a72415c5ea\") " pod="kube-system/global-pull-secret-syncer-8vlsd" Apr 16 16:54:02.559915 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:02.559809 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/be58460c-a5db-43bc-bafb-93a72415c5ea-original-pull-secret\") pod \"global-pull-secret-syncer-8vlsd\" (UID: \"be58460c-a5db-43bc-bafb-93a72415c5ea\") " pod="kube-system/global-pull-secret-syncer-8vlsd" Apr 16 16:54:02.559915 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:02.559853 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/be58460c-a5db-43bc-bafb-93a72415c5ea-kubelet-config\") pod \"global-pull-secret-syncer-8vlsd\" (UID: \"be58460c-a5db-43bc-bafb-93a72415c5ea\") " pod="kube-system/global-pull-secret-syncer-8vlsd" Apr 16 16:54:02.661050 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:02.661010 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/be58460c-a5db-43bc-bafb-93a72415c5ea-original-pull-secret\") pod \"global-pull-secret-syncer-8vlsd\" (UID: \"be58460c-a5db-43bc-bafb-93a72415c5ea\") " pod="kube-system/global-pull-secret-syncer-8vlsd" Apr 16 16:54:02.661208 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:02.661067 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/be58460c-a5db-43bc-bafb-93a72415c5ea-kubelet-config\") pod \"global-pull-secret-syncer-8vlsd\" (UID: \"be58460c-a5db-43bc-bafb-93a72415c5ea\") " pod="kube-system/global-pull-secret-syncer-8vlsd" Apr 16 16:54:02.661208 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:02.661099 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/be58460c-a5db-43bc-bafb-93a72415c5ea-dbus\") pod \"global-pull-secret-syncer-8vlsd\" (UID: \"be58460c-a5db-43bc-bafb-93a72415c5ea\") " pod="kube-system/global-pull-secret-syncer-8vlsd" Apr 16 16:54:02.661208 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:02.661196 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/be58460c-a5db-43bc-bafb-93a72415c5ea-kubelet-config\") pod \"global-pull-secret-syncer-8vlsd\" (UID: \"be58460c-a5db-43bc-bafb-93a72415c5ea\") " pod="kube-system/global-pull-secret-syncer-8vlsd" Apr 16 16:54:02.661373 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:02.661229 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/be58460c-a5db-43bc-bafb-93a72415c5ea-dbus\") pod \"global-pull-secret-syncer-8vlsd\" (UID: \"be58460c-a5db-43bc-bafb-93a72415c5ea\") " pod="kube-system/global-pull-secret-syncer-8vlsd" Apr 16 16:54:02.663120 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:02.663068 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/be58460c-a5db-43bc-bafb-93a72415c5ea-original-pull-secret\") pod \"global-pull-secret-syncer-8vlsd\" (UID: \"be58460c-a5db-43bc-bafb-93a72415c5ea\") " pod="kube-system/global-pull-secret-syncer-8vlsd" Apr 16 16:54:02.723491 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:02.723468 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-8vlsd" Apr 16 16:54:02.838530 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:02.838485 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-8vlsd"] Apr 16 16:54:02.841738 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:54:02.841714 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbe58460c_a5db_43bc_bafb_93a72415c5ea.slice/crio-55080dc23f569366f2af184bdcb5cc15f404ce66359d9ad4836f6db404b15f2d WatchSource:0}: Error finding container 55080dc23f569366f2af184bdcb5cc15f404ce66359d9ad4836f6db404b15f2d: Status 404 returned error can't find the container with id 55080dc23f569366f2af184bdcb5cc15f404ce66359d9ad4836f6db404b15f2d Apr 16 16:54:02.843205 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:02.843182 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:54:03.470511 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:03.470475 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8vlsd" event={"ID":"be58460c-a5db-43bc-bafb-93a72415c5ea","Type":"ContainerStarted","Data":"55080dc23f569366f2af184bdcb5cc15f404ce66359d9ad4836f6db404b15f2d"} Apr 16 16:54:07.483482 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:07.483445 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-8vlsd" 
event={"ID":"be58460c-a5db-43bc-bafb-93a72415c5ea","Type":"ContainerStarted","Data":"643915b3294100cd29bd7e23f458bf531e86dba064c80960636dfed9d592b486"} Apr 16 16:54:07.504543 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:07.504498 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-8vlsd" podStartSLOduration=1.5058145710000002 podStartE2EDuration="5.50448356s" podCreationTimestamp="2026-04-16 16:54:02 +0000 UTC" firstStartedPulling="2026-04-16 16:54:02.843336257 +0000 UTC m=+376.052108546" lastFinishedPulling="2026-04-16 16:54:06.842005246 +0000 UTC m=+380.050777535" observedRunningTime="2026-04-16 16:54:07.502700099 +0000 UTC m=+380.711472431" watchObservedRunningTime="2026-04-16 16:54:07.50448356 +0000 UTC m=+380.713255872" Apr 16 16:54:14.147134 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:14.147105 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-764b58ccf-vlpb7"] Apr 16 16:54:14.149265 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:14.149243 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764b58ccf-vlpb7" Apr 16 16:54:14.151777 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:14.151754 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 16:54:14.151874 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:14.151754 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 16:54:14.152242 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:14.152228 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 16:54:14.153049 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:14.153031 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 16:54:14.161668 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:14.161649 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-764b58ccf-vlpb7"] Apr 16 16:54:14.248368 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:14.248345 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d7c9a3fc-fc97-4ee4-8e09-729045598ec7-tmp\") pod \"klusterlet-addon-workmgr-764b58ccf-vlpb7\" (UID: \"d7c9a3fc-fc97-4ee4-8e09-729045598ec7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764b58ccf-vlpb7" Apr 16 16:54:14.248459 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:14.248398 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ks65\" (UniqueName: 
\"kubernetes.io/projected/d7c9a3fc-fc97-4ee4-8e09-729045598ec7-kube-api-access-2ks65\") pod \"klusterlet-addon-workmgr-764b58ccf-vlpb7\" (UID: \"d7c9a3fc-fc97-4ee4-8e09-729045598ec7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764b58ccf-vlpb7" Apr 16 16:54:14.248459 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:14.248443 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d7c9a3fc-fc97-4ee4-8e09-729045598ec7-klusterlet-config\") pod \"klusterlet-addon-workmgr-764b58ccf-vlpb7\" (UID: \"d7c9a3fc-fc97-4ee4-8e09-729045598ec7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764b58ccf-vlpb7" Apr 16 16:54:14.349189 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:14.349165 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ks65\" (UniqueName: \"kubernetes.io/projected/d7c9a3fc-fc97-4ee4-8e09-729045598ec7-kube-api-access-2ks65\") pod \"klusterlet-addon-workmgr-764b58ccf-vlpb7\" (UID: \"d7c9a3fc-fc97-4ee4-8e09-729045598ec7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764b58ccf-vlpb7" Apr 16 16:54:14.349288 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:14.349200 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d7c9a3fc-fc97-4ee4-8e09-729045598ec7-klusterlet-config\") pod \"klusterlet-addon-workmgr-764b58ccf-vlpb7\" (UID: \"d7c9a3fc-fc97-4ee4-8e09-729045598ec7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764b58ccf-vlpb7" Apr 16 16:54:14.349288 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:14.349238 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d7c9a3fc-fc97-4ee4-8e09-729045598ec7-tmp\") pod \"klusterlet-addon-workmgr-764b58ccf-vlpb7\" (UID: 
\"d7c9a3fc-fc97-4ee4-8e09-729045598ec7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764b58ccf-vlpb7" Apr 16 16:54:14.349640 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:14.349619 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d7c9a3fc-fc97-4ee4-8e09-729045598ec7-tmp\") pod \"klusterlet-addon-workmgr-764b58ccf-vlpb7\" (UID: \"d7c9a3fc-fc97-4ee4-8e09-729045598ec7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764b58ccf-vlpb7" Apr 16 16:54:14.351550 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:14.351529 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d7c9a3fc-fc97-4ee4-8e09-729045598ec7-klusterlet-config\") pod \"klusterlet-addon-workmgr-764b58ccf-vlpb7\" (UID: \"d7c9a3fc-fc97-4ee4-8e09-729045598ec7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764b58ccf-vlpb7" Apr 16 16:54:14.356589 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:14.356572 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ks65\" (UniqueName: \"kubernetes.io/projected/d7c9a3fc-fc97-4ee4-8e09-729045598ec7-kube-api-access-2ks65\") pod \"klusterlet-addon-workmgr-764b58ccf-vlpb7\" (UID: \"d7c9a3fc-fc97-4ee4-8e09-729045598ec7\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764b58ccf-vlpb7" Apr 16 16:54:14.458283 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:14.458260 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764b58ccf-vlpb7" Apr 16 16:54:14.577050 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:14.577022 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-764b58ccf-vlpb7"] Apr 16 16:54:14.580407 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:54:14.580366 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd7c9a3fc_fc97_4ee4_8e09_729045598ec7.slice/crio-7aa5c5f65aaf44da3cb56d2d50f86ba3ac99c369ddf7c86e90d0eded4c2808ad WatchSource:0}: Error finding container 7aa5c5f65aaf44da3cb56d2d50f86ba3ac99c369ddf7c86e90d0eded4c2808ad: Status 404 returned error can't find the container with id 7aa5c5f65aaf44da3cb56d2d50f86ba3ac99c369ddf7c86e90d0eded4c2808ad Apr 16 16:54:15.506831 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:15.506789 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764b58ccf-vlpb7" event={"ID":"d7c9a3fc-fc97-4ee4-8e09-729045598ec7","Type":"ContainerStarted","Data":"7aa5c5f65aaf44da3cb56d2d50f86ba3ac99c369ddf7c86e90d0eded4c2808ad"} Apr 16 16:54:18.517697 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:18.517662 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764b58ccf-vlpb7" event={"ID":"d7c9a3fc-fc97-4ee4-8e09-729045598ec7","Type":"ContainerStarted","Data":"bd158ebcdf78f70788cec148962d87324fa7f30758956bdf313a095843df0699"} Apr 16 16:54:18.518149 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:18.517970 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764b58ccf-vlpb7" Apr 16 16:54:18.519467 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:18.519446 2568 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764b58ccf-vlpb7" Apr 16 16:54:18.532550 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:18.532512 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-764b58ccf-vlpb7" podStartSLOduration=1.045463883 podStartE2EDuration="4.532499932s" podCreationTimestamp="2026-04-16 16:54:14 +0000 UTC" firstStartedPulling="2026-04-16 16:54:14.5822368 +0000 UTC m=+387.791009089" lastFinishedPulling="2026-04-16 16:54:18.069272845 +0000 UTC m=+391.278045138" observedRunningTime="2026-04-16 16:54:18.531421877 +0000 UTC m=+391.740194188" watchObservedRunningTime="2026-04-16 16:54:18.532499932 +0000 UTC m=+391.741272247" Apr 16 16:54:22.886412 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:22.886377 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6"] Apr 16 16:54:22.888817 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:22.888800 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6" Apr 16 16:54:22.891418 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:22.891396 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 16 16:54:22.892435 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:22.892420 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 16 16:54:22.892526 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:22.892446 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-djn9b\"" Apr 16 16:54:22.896859 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:22.896839 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6"] Apr 16 16:54:22.917165 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:22.917139 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-28prr\" (UniqueName: \"kubernetes.io/projected/05e2a1af-4924-4219-9c0f-93d12da22e3f-kube-api-access-28prr\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6\" (UID: \"05e2a1af-4924-4219-9c0f-93d12da22e3f\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6" Apr 16 16:54:22.917260 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:22.917175 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05e2a1af-4924-4219-9c0f-93d12da22e3f-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6\" (UID: \"05e2a1af-4924-4219-9c0f-93d12da22e3f\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6" Apr 16 16:54:22.917260 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:22.917200 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05e2a1af-4924-4219-9c0f-93d12da22e3f-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6\" (UID: \"05e2a1af-4924-4219-9c0f-93d12da22e3f\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6" Apr 16 16:54:23.018439 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:23.018410 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-28prr\" (UniqueName: \"kubernetes.io/projected/05e2a1af-4924-4219-9c0f-93d12da22e3f-kube-api-access-28prr\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6\" (UID: \"05e2a1af-4924-4219-9c0f-93d12da22e3f\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6" Apr 16 16:54:23.018547 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:23.018451 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05e2a1af-4924-4219-9c0f-93d12da22e3f-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6\" (UID: \"05e2a1af-4924-4219-9c0f-93d12da22e3f\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6" Apr 16 16:54:23.018547 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:23.018494 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05e2a1af-4924-4219-9c0f-93d12da22e3f-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6\" (UID: \"05e2a1af-4924-4219-9c0f-93d12da22e3f\") " 
pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6" Apr 16 16:54:23.018856 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:23.018837 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05e2a1af-4924-4219-9c0f-93d12da22e3f-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6\" (UID: \"05e2a1af-4924-4219-9c0f-93d12da22e3f\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6" Apr 16 16:54:23.018921 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:23.018868 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05e2a1af-4924-4219-9c0f-93d12da22e3f-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6\" (UID: \"05e2a1af-4924-4219-9c0f-93d12da22e3f\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6" Apr 16 16:54:23.026578 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:23.026555 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-28prr\" (UniqueName: \"kubernetes.io/projected/05e2a1af-4924-4219-9c0f-93d12da22e3f-kube-api-access-28prr\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6\" (UID: \"05e2a1af-4924-4219-9c0f-93d12da22e3f\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6" Apr 16 16:54:23.198399 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:23.198377 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6" Apr 16 16:54:23.310166 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:23.310102 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6"] Apr 16 16:54:23.312314 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:54:23.312287 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05e2a1af_4924_4219_9c0f_93d12da22e3f.slice/crio-bc61d293b00aff6999898fa5c6059e10b41e39e3dd319a01cde5a8508081012d WatchSource:0}: Error finding container bc61d293b00aff6999898fa5c6059e10b41e39e3dd319a01cde5a8508081012d: Status 404 returned error can't find the container with id bc61d293b00aff6999898fa5c6059e10b41e39e3dd319a01cde5a8508081012d Apr 16 16:54:23.533781 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:23.533753 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6" event={"ID":"05e2a1af-4924-4219-9c0f-93d12da22e3f","Type":"ContainerStarted","Data":"bc61d293b00aff6999898fa5c6059e10b41e39e3dd319a01cde5a8508081012d"} Apr 16 16:54:28.549288 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:28.549260 2568 generic.go:358] "Generic (PLEG): container finished" podID="05e2a1af-4924-4219-9c0f-93d12da22e3f" containerID="6e88bff2649831c2a840883dbe8e7567f7b269ab1056b923007de5fcce0b3564" exitCode=0 Apr 16 16:54:28.549696 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:28.549305 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6" event={"ID":"05e2a1af-4924-4219-9c0f-93d12da22e3f","Type":"ContainerDied","Data":"6e88bff2649831c2a840883dbe8e7567f7b269ab1056b923007de5fcce0b3564"} Apr 16 16:54:30.556593 ip-10-0-128-130 kubenswrapper[2568]: 
I0416 16:54:30.556563 2568 generic.go:358] "Generic (PLEG): container finished" podID="05e2a1af-4924-4219-9c0f-93d12da22e3f" containerID="8011f580ace9eb86d2e6a4743f18da855e26e90d923dabfa1cb096ce5485d15d" exitCode=0 Apr 16 16:54:30.556899 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:30.556612 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6" event={"ID":"05e2a1af-4924-4219-9c0f-93d12da22e3f","Type":"ContainerDied","Data":"8011f580ace9eb86d2e6a4743f18da855e26e90d923dabfa1cb096ce5485d15d"} Apr 16 16:54:36.577121 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:36.577091 2568 generic.go:358] "Generic (PLEG): container finished" podID="05e2a1af-4924-4219-9c0f-93d12da22e3f" containerID="4e5a24a5cb66a79eb1e53c104e4cb21acc3f932fa93f44d8d2d50dfdf43b849b" exitCode=0 Apr 16 16:54:36.577449 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:36.577128 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6" event={"ID":"05e2a1af-4924-4219-9c0f-93d12da22e3f","Type":"ContainerDied","Data":"4e5a24a5cb66a79eb1e53c104e4cb21acc3f932fa93f44d8d2d50dfdf43b849b"} Apr 16 16:54:37.699353 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:37.699333 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6" Apr 16 16:54:37.737691 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:37.737668 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05e2a1af-4924-4219-9c0f-93d12da22e3f-util\") pod \"05e2a1af-4924-4219-9c0f-93d12da22e3f\" (UID: \"05e2a1af-4924-4219-9c0f-93d12da22e3f\") " Apr 16 16:54:37.737800 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:37.737715 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-28prr\" (UniqueName: \"kubernetes.io/projected/05e2a1af-4924-4219-9c0f-93d12da22e3f-kube-api-access-28prr\") pod \"05e2a1af-4924-4219-9c0f-93d12da22e3f\" (UID: \"05e2a1af-4924-4219-9c0f-93d12da22e3f\") " Apr 16 16:54:37.737800 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:37.737785 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05e2a1af-4924-4219-9c0f-93d12da22e3f-bundle\") pod \"05e2a1af-4924-4219-9c0f-93d12da22e3f\" (UID: \"05e2a1af-4924-4219-9c0f-93d12da22e3f\") " Apr 16 16:54:37.738346 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:37.738320 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05e2a1af-4924-4219-9c0f-93d12da22e3f-bundle" (OuterVolumeSpecName: "bundle") pod "05e2a1af-4924-4219-9c0f-93d12da22e3f" (UID: "05e2a1af-4924-4219-9c0f-93d12da22e3f"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:54:37.740009 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:37.739974 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05e2a1af-4924-4219-9c0f-93d12da22e3f-kube-api-access-28prr" (OuterVolumeSpecName: "kube-api-access-28prr") pod "05e2a1af-4924-4219-9c0f-93d12da22e3f" (UID: "05e2a1af-4924-4219-9c0f-93d12da22e3f"). InnerVolumeSpecName "kube-api-access-28prr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:54:37.742656 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:37.742633 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05e2a1af-4924-4219-9c0f-93d12da22e3f-util" (OuterVolumeSpecName: "util") pod "05e2a1af-4924-4219-9c0f-93d12da22e3f" (UID: "05e2a1af-4924-4219-9c0f-93d12da22e3f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:54:37.838469 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:37.838412 2568 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/05e2a1af-4924-4219-9c0f-93d12da22e3f-bundle\") on node \"ip-10-0-128-130.ec2.internal\" DevicePath \"\"" Apr 16 16:54:37.838469 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:37.838441 2568 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/05e2a1af-4924-4219-9c0f-93d12da22e3f-util\") on node \"ip-10-0-128-130.ec2.internal\" DevicePath \"\"" Apr 16 16:54:37.838469 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:37.838450 2568 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-28prr\" (UniqueName: \"kubernetes.io/projected/05e2a1af-4924-4219-9c0f-93d12da22e3f-kube-api-access-28prr\") on node \"ip-10-0-128-130.ec2.internal\" DevicePath \"\"" Apr 16 16:54:38.586416 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:38.586384 2568 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6" event={"ID":"05e2a1af-4924-4219-9c0f-93d12da22e3f","Type":"ContainerDied","Data":"bc61d293b00aff6999898fa5c6059e10b41e39e3dd319a01cde5a8508081012d"} Apr 16 16:54:38.586416 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:38.586417 2568 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc61d293b00aff6999898fa5c6059e10b41e39e3dd319a01cde5a8508081012d" Apr 16 16:54:38.586629 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:38.586440 2568 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29csvjv6" Apr 16 16:54:44.494467 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:44.494433 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9km4d"] Apr 16 16:54:44.494849 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:44.494796 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05e2a1af-4924-4219-9c0f-93d12da22e3f" containerName="util" Apr 16 16:54:44.494849 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:44.494810 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="05e2a1af-4924-4219-9c0f-93d12da22e3f" containerName="util" Apr 16 16:54:44.494849 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:44.494822 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="05e2a1af-4924-4219-9c0f-93d12da22e3f" containerName="extract" Apr 16 16:54:44.494849 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:44.494828 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="05e2a1af-4924-4219-9c0f-93d12da22e3f" containerName="extract" Apr 16 16:54:44.494849 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:44.494838 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="05e2a1af-4924-4219-9c0f-93d12da22e3f" containerName="pull" Apr 16 16:54:44.494849 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:44.494844 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="05e2a1af-4924-4219-9c0f-93d12da22e3f" containerName="pull" Apr 16 16:54:44.495029 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:44.494903 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="05e2a1af-4924-4219-9c0f-93d12da22e3f" containerName="extract" Apr 16 16:54:44.547087 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:44.547056 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9km4d"] Apr 16 16:54:44.547273 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:44.547170 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9km4d" Apr 16 16:54:44.549708 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:44.549682 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-6s7rc\"" Apr 16 16:54:44.549854 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:44.549747 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 16:54:44.549854 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:44.549767 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 16:54:44.549854 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:44.549824 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 16:54:44.584760 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:44.584736 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l67dd\" (UniqueName: 
\"kubernetes.io/projected/bc522807-a9a8-402b-b874-f8aba19483c8-kube-api-access-l67dd\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-9km4d\" (UID: \"bc522807-a9a8-402b-b874-f8aba19483c8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9km4d" Apr 16 16:54:44.584862 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:44.584770 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/bc522807-a9a8-402b-b874-f8aba19483c8-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-9km4d\" (UID: \"bc522807-a9a8-402b-b874-f8aba19483c8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9km4d" Apr 16 16:54:44.685436 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:44.685412 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l67dd\" (UniqueName: \"kubernetes.io/projected/bc522807-a9a8-402b-b874-f8aba19483c8-kube-api-access-l67dd\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-9km4d\" (UID: \"bc522807-a9a8-402b-b874-f8aba19483c8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9km4d" Apr 16 16:54:44.685563 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:44.685444 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/bc522807-a9a8-402b-b874-f8aba19483c8-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-9km4d\" (UID: \"bc522807-a9a8-402b-b874-f8aba19483c8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9km4d" Apr 16 16:54:44.687881 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:44.687851 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/bc522807-a9a8-402b-b874-f8aba19483c8-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-9km4d\" (UID: 
\"bc522807-a9a8-402b-b874-f8aba19483c8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9km4d" Apr 16 16:54:44.696510 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:44.696486 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l67dd\" (UniqueName: \"kubernetes.io/projected/bc522807-a9a8-402b-b874-f8aba19483c8-kube-api-access-l67dd\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-9km4d\" (UID: \"bc522807-a9a8-402b-b874-f8aba19483c8\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9km4d" Apr 16 16:54:44.857438 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:44.857379 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9km4d" Apr 16 16:54:44.970310 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:44.970149 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9km4d"] Apr 16 16:54:44.972813 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:54:44.972788 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbc522807_a9a8_402b_b874_f8aba19483c8.slice/crio-f64ff070cf99dea08ec5c924cedb2e8b9df1f61c935565bce8f9ef7813875871 WatchSource:0}: Error finding container f64ff070cf99dea08ec5c924cedb2e8b9df1f61c935565bce8f9ef7813875871: Status 404 returned error can't find the container with id f64ff070cf99dea08ec5c924cedb2e8b9df1f61c935565bce8f9ef7813875871 Apr 16 16:54:45.608986 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:45.608952 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9km4d" event={"ID":"bc522807-a9a8-402b-b874-f8aba19483c8","Type":"ContainerStarted","Data":"f64ff070cf99dea08ec5c924cedb2e8b9df1f61c935565bce8f9ef7813875871"} Apr 16 16:54:48.621297 ip-10-0-128-130 kubenswrapper[2568]: 
I0416 16:54:48.621222 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9km4d" event={"ID":"bc522807-a9a8-402b-b874-f8aba19483c8","Type":"ContainerStarted","Data":"d26a5a5a41116d93cdbecf01b925505c54ddaf84e50af9533ede2fe74d922cc4"}
Apr 16 16:54:48.621297 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:48.621278 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9km4d"
Apr 16 16:54:48.640538 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:48.640491 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9km4d" podStartSLOduration=1.288832459 podStartE2EDuration="4.640477699s" podCreationTimestamp="2026-04-16 16:54:44 +0000 UTC" firstStartedPulling="2026-04-16 16:54:44.974424084 +0000 UTC m=+418.183196373" lastFinishedPulling="2026-04-16 16:54:48.32606931 +0000 UTC m=+421.534841613" observedRunningTime="2026-04-16 16:54:48.638526306 +0000 UTC m=+421.847298617" watchObservedRunningTime="2026-04-16 16:54:48.640477699 +0000 UTC m=+421.849250009"
Apr 16 16:54:48.889630 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:48.889548 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-8dzkh"]
Apr 16 16:54:48.892812 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:48.892796 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-8dzkh"
Apr 16 16:54:48.895159 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:48.895129 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 16 16:54:48.895290 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:48.895167 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 16 16:54:48.895290 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:48.895188 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-fxzm6\""
Apr 16 16:54:48.901524 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:48.901499 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-8dzkh"]
Apr 16 16:54:49.022699 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.022674 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/192654b0-79b3-49f7-8228-72cbae39eaf7-certificates\") pod \"keda-operator-ffbb595cb-8dzkh\" (UID: \"192654b0-79b3-49f7-8228-72cbae39eaf7\") " pod="openshift-keda/keda-operator-ffbb595cb-8dzkh"
Apr 16 16:54:49.022835 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.022722 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xcbw\" (UniqueName: \"kubernetes.io/projected/192654b0-79b3-49f7-8228-72cbae39eaf7-kube-api-access-2xcbw\") pod \"keda-operator-ffbb595cb-8dzkh\" (UID: \"192654b0-79b3-49f7-8228-72cbae39eaf7\") " pod="openshift-keda/keda-operator-ffbb595cb-8dzkh"
Apr 16 16:54:49.022835 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.022810 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/192654b0-79b3-49f7-8228-72cbae39eaf7-cabundle0\") pod \"keda-operator-ffbb595cb-8dzkh\" (UID: \"192654b0-79b3-49f7-8228-72cbae39eaf7\") " pod="openshift-keda/keda-operator-ffbb595cb-8dzkh"
Apr 16 16:54:49.123954 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.123921 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/192654b0-79b3-49f7-8228-72cbae39eaf7-certificates\") pod \"keda-operator-ffbb595cb-8dzkh\" (UID: \"192654b0-79b3-49f7-8228-72cbae39eaf7\") " pod="openshift-keda/keda-operator-ffbb595cb-8dzkh"
Apr 16 16:54:49.124081 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.123959 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2xcbw\" (UniqueName: \"kubernetes.io/projected/192654b0-79b3-49f7-8228-72cbae39eaf7-kube-api-access-2xcbw\") pod \"keda-operator-ffbb595cb-8dzkh\" (UID: \"192654b0-79b3-49f7-8228-72cbae39eaf7\") " pod="openshift-keda/keda-operator-ffbb595cb-8dzkh"
Apr 16 16:54:49.124081 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.123981 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/192654b0-79b3-49f7-8228-72cbae39eaf7-cabundle0\") pod \"keda-operator-ffbb595cb-8dzkh\" (UID: \"192654b0-79b3-49f7-8228-72cbae39eaf7\") " pod="openshift-keda/keda-operator-ffbb595cb-8dzkh"
Apr 16 16:54:49.124163 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:49.124078 2568 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found
Apr 16 16:54:49.124163 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:49.124100 2568 secret.go:281] references non-existent secret key: ca.crt
Apr 16 16:54:49.124163 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:49.124109 2568 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 16:54:49.124163 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:49.124126 2568 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-8dzkh: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 16 16:54:49.124299 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:49.124187 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/192654b0-79b3-49f7-8228-72cbae39eaf7-certificates podName:192654b0-79b3-49f7-8228-72cbae39eaf7 nodeName:}" failed. No retries permitted until 2026-04-16 16:54:49.624169244 +0000 UTC m=+422.832941535 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/192654b0-79b3-49f7-8228-72cbae39eaf7-certificates") pod "keda-operator-ffbb595cb-8dzkh" (UID: "192654b0-79b3-49f7-8228-72cbae39eaf7") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 16 16:54:49.124518 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.124502 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/192654b0-79b3-49f7-8228-72cbae39eaf7-cabundle0\") pod \"keda-operator-ffbb595cb-8dzkh\" (UID: \"192654b0-79b3-49f7-8228-72cbae39eaf7\") " pod="openshift-keda/keda-operator-ffbb595cb-8dzkh"
Apr 16 16:54:49.139340 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.139318 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xcbw\" (UniqueName: \"kubernetes.io/projected/192654b0-79b3-49f7-8228-72cbae39eaf7-kube-api-access-2xcbw\") pod \"keda-operator-ffbb595cb-8dzkh\" (UID: \"192654b0-79b3-49f7-8228-72cbae39eaf7\") " pod="openshift-keda/keda-operator-ffbb595cb-8dzkh"
Apr 16 16:54:49.157471 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.157420 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-hzsm9"]
Apr 16 16:54:49.160734 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.160720 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hzsm9"
Apr 16 16:54:49.165157 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.165139 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\""
Apr 16 16:54:49.175872 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.175851 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-hzsm9"]
Apr 16 16:54:49.224398 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.224374 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lp26c\" (UniqueName: \"kubernetes.io/projected/a8b62ee4-931f-413a-9a95-c98f84c49c37-kube-api-access-lp26c\") pod \"keda-metrics-apiserver-7c9f485588-hzsm9\" (UID: \"a8b62ee4-931f-413a-9a95-c98f84c49c37\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hzsm9"
Apr 16 16:54:49.224495 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.224428 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a8b62ee4-931f-413a-9a95-c98f84c49c37-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hzsm9\" (UID: \"a8b62ee4-931f-413a-9a95-c98f84c49c37\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hzsm9"
Apr 16 16:54:49.224538 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.224496 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/a8b62ee4-931f-413a-9a95-c98f84c49c37-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-hzsm9\" (UID: \"a8b62ee4-931f-413a-9a95-c98f84c49c37\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hzsm9"
Apr 16 16:54:49.324968 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.324922 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/a8b62ee4-931f-413a-9a95-c98f84c49c37-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-hzsm9\" (UID: \"a8b62ee4-931f-413a-9a95-c98f84c49c37\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hzsm9"
Apr 16 16:54:49.324968 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.324970 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lp26c\" (UniqueName: \"kubernetes.io/projected/a8b62ee4-931f-413a-9a95-c98f84c49c37-kube-api-access-lp26c\") pod \"keda-metrics-apiserver-7c9f485588-hzsm9\" (UID: \"a8b62ee4-931f-413a-9a95-c98f84c49c37\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hzsm9"
Apr 16 16:54:49.325157 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.325015 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a8b62ee4-931f-413a-9a95-c98f84c49c37-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hzsm9\" (UID: \"a8b62ee4-931f-413a-9a95-c98f84c49c37\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hzsm9"
Apr 16 16:54:49.325157 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:49.325127 2568 secret.go:281] references non-existent secret key: tls.crt
Apr 16 16:54:49.325157 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:49.325141 2568 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 16:54:49.325157 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:49.325159 2568 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found
Apr 16 16:54:49.325292 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:49.325176 2568 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-hzsm9: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 16 16:54:49.325292 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:49.325233 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8b62ee4-931f-413a-9a95-c98f84c49c37-certificates podName:a8b62ee4-931f-413a-9a95-c98f84c49c37 nodeName:}" failed. No retries permitted until 2026-04-16 16:54:49.825216336 +0000 UTC m=+423.033988626 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a8b62ee4-931f-413a-9a95-c98f84c49c37-certificates") pod "keda-metrics-apiserver-7c9f485588-hzsm9" (UID: "a8b62ee4-931f-413a-9a95-c98f84c49c37") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found]
Apr 16 16:54:49.325292 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.325259 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/a8b62ee4-931f-413a-9a95-c98f84c49c37-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-hzsm9\" (UID: \"a8b62ee4-931f-413a-9a95-c98f84c49c37\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hzsm9"
Apr 16 16:54:49.345710 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.345684 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lp26c\" (UniqueName: \"kubernetes.io/projected/a8b62ee4-931f-413a-9a95-c98f84c49c37-kube-api-access-lp26c\") pod \"keda-metrics-apiserver-7c9f485588-hzsm9\" (UID: \"a8b62ee4-931f-413a-9a95-c98f84c49c37\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hzsm9"
Apr 16 16:54:49.397341 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.397311 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-lsdxw"]
Apr 16 16:54:49.400533 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.400519 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-lsdxw"
Apr 16 16:54:49.404465 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.404448 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\""
Apr 16 16:54:49.424907 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.424857 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-lsdxw"]
Apr 16 16:54:49.526943 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.526912 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x86tg\" (UniqueName: \"kubernetes.io/projected/e87f3210-3a46-4996-982c-5af9273339da-kube-api-access-x86tg\") pod \"keda-admission-cf49989db-lsdxw\" (UID: \"e87f3210-3a46-4996-982c-5af9273339da\") " pod="openshift-keda/keda-admission-cf49989db-lsdxw"
Apr 16 16:54:49.526943 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.526947 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e87f3210-3a46-4996-982c-5af9273339da-certificates\") pod \"keda-admission-cf49989db-lsdxw\" (UID: \"e87f3210-3a46-4996-982c-5af9273339da\") " pod="openshift-keda/keda-admission-cf49989db-lsdxw"
Apr 16 16:54:49.628310 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.628277 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/192654b0-79b3-49f7-8228-72cbae39eaf7-certificates\") pod \"keda-operator-ffbb595cb-8dzkh\" (UID: \"192654b0-79b3-49f7-8228-72cbae39eaf7\") " pod="openshift-keda/keda-operator-ffbb595cb-8dzkh"
Apr 16 16:54:49.628310 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.628311 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x86tg\" (UniqueName: \"kubernetes.io/projected/e87f3210-3a46-4996-982c-5af9273339da-kube-api-access-x86tg\") pod \"keda-admission-cf49989db-lsdxw\" (UID: \"e87f3210-3a46-4996-982c-5af9273339da\") " pod="openshift-keda/keda-admission-cf49989db-lsdxw"
Apr 16 16:54:49.628750 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.628336 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e87f3210-3a46-4996-982c-5af9273339da-certificates\") pod \"keda-admission-cf49989db-lsdxw\" (UID: \"e87f3210-3a46-4996-982c-5af9273339da\") " pod="openshift-keda/keda-admission-cf49989db-lsdxw"
Apr 16 16:54:49.628750 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:49.628425 2568 secret.go:281] references non-existent secret key: ca.crt
Apr 16 16:54:49.628750 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:49.628444 2568 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 16:54:49.628750 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:49.628450 2568 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found
Apr 16 16:54:49.628750 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:49.628455 2568 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-8dzkh: references non-existent secret key: ca.crt
Apr 16 16:54:49.628750 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:49.628467 2568 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-lsdxw: secret "keda-admission-webhooks-certs" not found
Apr 16 16:54:49.628750 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:49.628511 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/192654b0-79b3-49f7-8228-72cbae39eaf7-certificates podName:192654b0-79b3-49f7-8228-72cbae39eaf7 nodeName:}" failed. No retries permitted until 2026-04-16 16:54:50.628497238 +0000 UTC m=+423.837269539 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/192654b0-79b3-49f7-8228-72cbae39eaf7-certificates") pod "keda-operator-ffbb595cb-8dzkh" (UID: "192654b0-79b3-49f7-8228-72cbae39eaf7") : references non-existent secret key: ca.crt
Apr 16 16:54:49.628750 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:49.628527 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/e87f3210-3a46-4996-982c-5af9273339da-certificates podName:e87f3210-3a46-4996-982c-5af9273339da nodeName:}" failed. No retries permitted until 2026-04-16 16:54:50.128521675 +0000 UTC m=+423.337293970 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/e87f3210-3a46-4996-982c-5af9273339da-certificates") pod "keda-admission-cf49989db-lsdxw" (UID: "e87f3210-3a46-4996-982c-5af9273339da") : secret "keda-admission-webhooks-certs" not found
Apr 16 16:54:49.641917 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.641892 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x86tg\" (UniqueName: \"kubernetes.io/projected/e87f3210-3a46-4996-982c-5af9273339da-kube-api-access-x86tg\") pod \"keda-admission-cf49989db-lsdxw\" (UID: \"e87f3210-3a46-4996-982c-5af9273339da\") " pod="openshift-keda/keda-admission-cf49989db-lsdxw"
Apr 16 16:54:49.830614 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:49.830560 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a8b62ee4-931f-413a-9a95-c98f84c49c37-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hzsm9\" (UID: \"a8b62ee4-931f-413a-9a95-c98f84c49c37\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hzsm9"
Apr 16 16:54:49.830774 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:49.830697 2568 secret.go:281] references non-existent secret key: tls.crt
Apr 16 16:54:49.830774 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:49.830713 2568 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 16:54:49.830774 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:49.830731 2568 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-hzsm9: references non-existent secret key: tls.crt
Apr 16 16:54:49.830882 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:49.830781 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8b62ee4-931f-413a-9a95-c98f84c49c37-certificates podName:a8b62ee4-931f-413a-9a95-c98f84c49c37 nodeName:}" failed. No retries permitted until 2026-04-16 16:54:50.830768015 +0000 UTC m=+424.039540308 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a8b62ee4-931f-413a-9a95-c98f84c49c37-certificates") pod "keda-metrics-apiserver-7c9f485588-hzsm9" (UID: "a8b62ee4-931f-413a-9a95-c98f84c49c37") : references non-existent secret key: tls.crt
Apr 16 16:54:50.133101 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:50.133020 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e87f3210-3a46-4996-982c-5af9273339da-certificates\") pod \"keda-admission-cf49989db-lsdxw\" (UID: \"e87f3210-3a46-4996-982c-5af9273339da\") " pod="openshift-keda/keda-admission-cf49989db-lsdxw"
Apr 16 16:54:50.135589 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:50.135565 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/e87f3210-3a46-4996-982c-5af9273339da-certificates\") pod \"keda-admission-cf49989db-lsdxw\" (UID: \"e87f3210-3a46-4996-982c-5af9273339da\") " pod="openshift-keda/keda-admission-cf49989db-lsdxw"
Apr 16 16:54:50.310276 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:50.310247 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-lsdxw"
Apr 16 16:54:50.449322 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:50.449292 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-lsdxw"]
Apr 16 16:54:50.452378 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:54:50.452347 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode87f3210_3a46_4996_982c_5af9273339da.slice/crio-2f6fe264f49dfacc40116562ea6d97cf58f48474a3fe43bef2dbd977116e6d6b WatchSource:0}: Error finding container 2f6fe264f49dfacc40116562ea6d97cf58f48474a3fe43bef2dbd977116e6d6b: Status 404 returned error can't find the container with id 2f6fe264f49dfacc40116562ea6d97cf58f48474a3fe43bef2dbd977116e6d6b
Apr 16 16:54:50.634582 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:50.634508 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-lsdxw" event={"ID":"e87f3210-3a46-4996-982c-5af9273339da","Type":"ContainerStarted","Data":"2f6fe264f49dfacc40116562ea6d97cf58f48474a3fe43bef2dbd977116e6d6b"}
Apr 16 16:54:50.637889 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:50.637871 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/192654b0-79b3-49f7-8228-72cbae39eaf7-certificates\") pod \"keda-operator-ffbb595cb-8dzkh\" (UID: \"192654b0-79b3-49f7-8228-72cbae39eaf7\") " pod="openshift-keda/keda-operator-ffbb595cb-8dzkh"
Apr 16 16:54:50.637992 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:50.637981 2568 secret.go:281] references non-existent secret key: ca.crt
Apr 16 16:54:50.638039 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:50.637995 2568 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 16:54:50.638039 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:50.638005 2568 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-8dzkh: references non-existent secret key: ca.crt
Apr 16 16:54:50.638101 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:50.638054 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/192654b0-79b3-49f7-8228-72cbae39eaf7-certificates podName:192654b0-79b3-49f7-8228-72cbae39eaf7 nodeName:}" failed. No retries permitted until 2026-04-16 16:54:52.638041419 +0000 UTC m=+425.846813708 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/192654b0-79b3-49f7-8228-72cbae39eaf7-certificates") pod "keda-operator-ffbb595cb-8dzkh" (UID: "192654b0-79b3-49f7-8228-72cbae39eaf7") : references non-existent secret key: ca.crt
Apr 16 16:54:50.840531 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:50.840479 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a8b62ee4-931f-413a-9a95-c98f84c49c37-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hzsm9\" (UID: \"a8b62ee4-931f-413a-9a95-c98f84c49c37\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hzsm9"
Apr 16 16:54:50.840705 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:50.840631 2568 secret.go:281] references non-existent secret key: tls.crt
Apr 16 16:54:50.840705 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:50.840650 2568 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 16:54:50.840705 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:50.840671 2568 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-hzsm9: references non-existent secret key: tls.crt
Apr 16 16:54:50.840809 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:50.840732 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8b62ee4-931f-413a-9a95-c98f84c49c37-certificates podName:a8b62ee4-931f-413a-9a95-c98f84c49c37 nodeName:}" failed. No retries permitted until 2026-04-16 16:54:52.840718733 +0000 UTC m=+426.049491022 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a8b62ee4-931f-413a-9a95-c98f84c49c37-certificates") pod "keda-metrics-apiserver-7c9f485588-hzsm9" (UID: "a8b62ee4-931f-413a-9a95-c98f84c49c37") : references non-existent secret key: tls.crt
Apr 16 16:54:52.655655 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:52.655622 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/192654b0-79b3-49f7-8228-72cbae39eaf7-certificates\") pod \"keda-operator-ffbb595cb-8dzkh\" (UID: \"192654b0-79b3-49f7-8228-72cbae39eaf7\") " pod="openshift-keda/keda-operator-ffbb595cb-8dzkh"
Apr 16 16:54:52.655996 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:52.655692 2568 secret.go:281] references non-existent secret key: ca.crt
Apr 16 16:54:52.655996 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:52.655709 2568 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 16 16:54:52.655996 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:52.655719 2568 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-8dzkh: references non-existent secret key: ca.crt
Apr 16 16:54:52.655996 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:52.655765 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/192654b0-79b3-49f7-8228-72cbae39eaf7-certificates podName:192654b0-79b3-49f7-8228-72cbae39eaf7 nodeName:}" failed. No retries permitted until 2026-04-16 16:54:56.655752078 +0000 UTC m=+429.864524371 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/192654b0-79b3-49f7-8228-72cbae39eaf7-certificates") pod "keda-operator-ffbb595cb-8dzkh" (UID: "192654b0-79b3-49f7-8228-72cbae39eaf7") : references non-existent secret key: ca.crt
Apr 16 16:54:52.857275 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:52.857237 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a8b62ee4-931f-413a-9a95-c98f84c49c37-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hzsm9\" (UID: \"a8b62ee4-931f-413a-9a95-c98f84c49c37\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hzsm9"
Apr 16 16:54:52.857439 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:52.857352 2568 secret.go:281] references non-existent secret key: tls.crt
Apr 16 16:54:52.857439 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:52.857363 2568 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt
Apr 16 16:54:52.857439 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:52.857381 2568 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-hzsm9: references non-existent secret key: tls.crt
Apr 16 16:54:52.857439 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:54:52.857434 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a8b62ee4-931f-413a-9a95-c98f84c49c37-certificates podName:a8b62ee4-931f-413a-9a95-c98f84c49c37 nodeName:}" failed. No retries permitted until 2026-04-16 16:54:56.857422421 +0000 UTC m=+430.066194710 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/a8b62ee4-931f-413a-9a95-c98f84c49c37-certificates") pod "keda-metrics-apiserver-7c9f485588-hzsm9" (UID: "a8b62ee4-931f-413a-9a95-c98f84c49c37") : references non-existent secret key: tls.crt
Apr 16 16:54:56.684862 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:56.684828 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/192654b0-79b3-49f7-8228-72cbae39eaf7-certificates\") pod \"keda-operator-ffbb595cb-8dzkh\" (UID: \"192654b0-79b3-49f7-8228-72cbae39eaf7\") " pod="openshift-keda/keda-operator-ffbb595cb-8dzkh"
Apr 16 16:54:56.687146 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:56.687128 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/192654b0-79b3-49f7-8228-72cbae39eaf7-certificates\") pod \"keda-operator-ffbb595cb-8dzkh\" (UID: \"192654b0-79b3-49f7-8228-72cbae39eaf7\") " pod="openshift-keda/keda-operator-ffbb595cb-8dzkh"
Apr 16 16:54:56.703003 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:56.702976 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-8dzkh"
Apr 16 16:54:56.815124 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:56.815098 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-8dzkh"]
Apr 16 16:54:56.817680 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:54:56.817654 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod192654b0_79b3_49f7_8228_72cbae39eaf7.slice/crio-cb07a7df3cf0a812bf567b4e1c6f72ed68d07b768b34ab9e87da17f4f718890a WatchSource:0}: Error finding container cb07a7df3cf0a812bf567b4e1c6f72ed68d07b768b34ab9e87da17f4f718890a: Status 404 returned error can't find the container with id cb07a7df3cf0a812bf567b4e1c6f72ed68d07b768b34ab9e87da17f4f718890a
Apr 16 16:54:56.887124 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:56.887093 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a8b62ee4-931f-413a-9a95-c98f84c49c37-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hzsm9\" (UID: \"a8b62ee4-931f-413a-9a95-c98f84c49c37\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hzsm9"
Apr 16 16:54:56.889374 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:56.889351 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/a8b62ee4-931f-413a-9a95-c98f84c49c37-certificates\") pod \"keda-metrics-apiserver-7c9f485588-hzsm9\" (UID: \"a8b62ee4-931f-413a-9a95-c98f84c49c37\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hzsm9"
Apr 16 16:54:56.971946 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:56.971917 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hzsm9"
Apr 16 16:54:57.106160 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:57.106114 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-hzsm9"]
Apr 16 16:54:57.109238 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:54:57.109213 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8b62ee4_931f_413a_9a95_c98f84c49c37.slice/crio-c8c7824e7bdcb24682dc895da4abe220a9ff03feebcf7c3784f7a89be7ce8f2f WatchSource:0}: Error finding container c8c7824e7bdcb24682dc895da4abe220a9ff03feebcf7c3784f7a89be7ce8f2f: Status 404 returned error can't find the container with id c8c7824e7bdcb24682dc895da4abe220a9ff03feebcf7c3784f7a89be7ce8f2f
Apr 16 16:54:57.654983 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:57.654951 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-8dzkh" event={"ID":"192654b0-79b3-49f7-8228-72cbae39eaf7","Type":"ContainerStarted","Data":"cb07a7df3cf0a812bf567b4e1c6f72ed68d07b768b34ab9e87da17f4f718890a"}
Apr 16 16:54:57.655911 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:54:57.655891 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hzsm9" event={"ID":"a8b62ee4-931f-413a-9a95-c98f84c49c37","Type":"ContainerStarted","Data":"c8c7824e7bdcb24682dc895da4abe220a9ff03feebcf7c3784f7a89be7ce8f2f"}
Apr 16 16:55:00.671278 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:00.671247 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-8dzkh" event={"ID":"192654b0-79b3-49f7-8228-72cbae39eaf7","Type":"ContainerStarted","Data":"0ead9799dd477b68905c441ae90e8e95f0b2f537d417cd12fc91d338236b7a6a"}
Apr 16 16:55:00.671664 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:00.671341 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-8dzkh"
Apr 16 16:55:00.673283 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:00.673244 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hzsm9" event={"ID":"a8b62ee4-931f-413a-9a95-c98f84c49c37","Type":"ContainerStarted","Data":"6bf92fe2c009813bf2c8816dd61c864c0ea0e0e3e0f201852b1906072806da4e"}
Apr 16 16:55:00.673438 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:00.673415 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hzsm9"
Apr 16 16:55:00.689464 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:00.689394 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-8dzkh" podStartSLOduration=8.922807157 podStartE2EDuration="12.689378761s" podCreationTimestamp="2026-04-16 16:54:48 +0000 UTC" firstStartedPulling="2026-04-16 16:54:56.819312519 +0000 UTC m=+430.028084808" lastFinishedPulling="2026-04-16 16:55:00.585884123 +0000 UTC m=+433.794656412" observedRunningTime="2026-04-16 16:55:00.686291626 +0000 UTC m=+433.895063963" watchObservedRunningTime="2026-04-16 16:55:00.689378761 +0000 UTC m=+433.898151071"
Apr 16 16:55:00.703779 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:00.703333 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hzsm9" podStartSLOduration=8.23303637 podStartE2EDuration="11.703317218s" podCreationTimestamp="2026-04-16 16:54:49 +0000 UTC" firstStartedPulling="2026-04-16 16:54:57.11049943 +0000 UTC m=+430.319271722" lastFinishedPulling="2026-04-16 16:55:00.58078027 +0000 UTC m=+433.789552570" observedRunningTime="2026-04-16 16:55:00.702932196 +0000 UTC m=+433.911704520" watchObservedRunningTime="2026-04-16 16:55:00.703317218 +0000 UTC m=+433.912089532"
Apr 16 16:55:01.678361 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:01.678275 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-lsdxw" event={"ID":"e87f3210-3a46-4996-982c-5af9273339da","Type":"ContainerStarted","Data":"23790cc36364bed5e87cb9e45a8016d6cba6aa54e26c214717498010e299362a"}
Apr 16 16:55:01.696275 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:01.696230 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-lsdxw" podStartSLOduration=1.8175939429999999 podStartE2EDuration="12.696217391s" podCreationTimestamp="2026-04-16 16:54:49 +0000 UTC" firstStartedPulling="2026-04-16 16:54:50.453532599 +0000 UTC m=+423.662304889" lastFinishedPulling="2026-04-16 16:55:01.332156049 +0000 UTC m=+434.540928337" observedRunningTime="2026-04-16 16:55:01.695552505 +0000 UTC m=+434.904324828" watchObservedRunningTime="2026-04-16 16:55:01.696217391 +0000 UTC m=+434.904989703"
Apr 16 16:55:02.680999 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:02.680968 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-lsdxw"
Apr 16 16:55:09.627047 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:09.626971 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-9km4d"
Apr 16 16:55:11.683107 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:11.683079 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-hzsm9"
Apr 16 16:55:21.680830 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:21.680796 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-8dzkh"
Apr 16 16:55:23.687474 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:23.687437 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-lsdxw"
Apr 16 16:55:55.502150 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:55.502116 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-vcvdm"]
Apr 16 16:55:55.505484 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:55.505461 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vcvdm"
Apr 16 16:55:55.507907 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:55.507880 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\""
Apr 16 16:55:55.507907 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:55.507895 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\""
Apr 16 16:55:55.508051 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:55.507929 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\""
Apr 16 16:55:55.508629 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:55.508592 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-bqkn2\""
Apr 16 16:55:55.515780 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:55.515760 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-vcvdm"]
Apr 16 16:55:55.551845 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:55.551817 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0eb1c77a-a5aa-4112-bc53-ecb62776005c-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-vcvdm\" (UID: \"0eb1c77a-a5aa-4112-bc53-ecb62776005c\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vcvdm"
Apr 16 16:55:55.551974 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:55.551848 2568
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9pzk\" (UniqueName: \"kubernetes.io/projected/0eb1c77a-a5aa-4112-bc53-ecb62776005c-kube-api-access-d9pzk\") pod \"llmisvc-controller-manager-68cc5db7c4-vcvdm\" (UID: \"0eb1c77a-a5aa-4112-bc53-ecb62776005c\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vcvdm" Apr 16 16:55:55.652932 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:55.652899 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0eb1c77a-a5aa-4112-bc53-ecb62776005c-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-vcvdm\" (UID: \"0eb1c77a-a5aa-4112-bc53-ecb62776005c\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vcvdm" Apr 16 16:55:55.653089 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:55.652936 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d9pzk\" (UniqueName: \"kubernetes.io/projected/0eb1c77a-a5aa-4112-bc53-ecb62776005c-kube-api-access-d9pzk\") pod \"llmisvc-controller-manager-68cc5db7c4-vcvdm\" (UID: \"0eb1c77a-a5aa-4112-bc53-ecb62776005c\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vcvdm" Apr 16 16:55:55.655204 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:55.655185 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0eb1c77a-a5aa-4112-bc53-ecb62776005c-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-vcvdm\" (UID: \"0eb1c77a-a5aa-4112-bc53-ecb62776005c\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vcvdm" Apr 16 16:55:55.661181 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:55.661153 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9pzk\" (UniqueName: \"kubernetes.io/projected/0eb1c77a-a5aa-4112-bc53-ecb62776005c-kube-api-access-d9pzk\") pod \"llmisvc-controller-manager-68cc5db7c4-vcvdm\" (UID: 
\"0eb1c77a-a5aa-4112-bc53-ecb62776005c\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-vcvdm" Apr 16 16:55:55.816292 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:55.816212 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vcvdm" Apr 16 16:55:55.932494 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:55.932424 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-vcvdm"] Apr 16 16:55:55.934930 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:55:55.934903 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0eb1c77a_a5aa_4112_bc53_ecb62776005c.slice/crio-7acb0ea54e700c16c7fcf09f527ce06d8963060824a32f6231c1f1959997945b WatchSource:0}: Error finding container 7acb0ea54e700c16c7fcf09f527ce06d8963060824a32f6231c1f1959997945b: Status 404 returned error can't find the container with id 7acb0ea54e700c16c7fcf09f527ce06d8963060824a32f6231c1f1959997945b Apr 16 16:55:56.850348 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:56.850307 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vcvdm" event={"ID":"0eb1c77a-a5aa-4112-bc53-ecb62776005c","Type":"ContainerStarted","Data":"7acb0ea54e700c16c7fcf09f527ce06d8963060824a32f6231c1f1959997945b"} Apr 16 16:55:58.858079 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:58.858039 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vcvdm" event={"ID":"0eb1c77a-a5aa-4112-bc53-ecb62776005c","Type":"ContainerStarted","Data":"a6934f9dc94767e5b54ebd16269d6a1483fc9c737cbc6104c5229391b2777220"} Apr 16 16:55:58.858454 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:55:58.858275 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vcvdm" Apr 16 16:55:58.874231 ip-10-0-128-130 
kubenswrapper[2568]: I0416 16:55:58.874186 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vcvdm" podStartSLOduration=2.017712662 podStartE2EDuration="3.874171726s" podCreationTimestamp="2026-04-16 16:55:55 +0000 UTC" firstStartedPulling="2026-04-16 16:55:55.936182895 +0000 UTC m=+489.144955185" lastFinishedPulling="2026-04-16 16:55:57.792641947 +0000 UTC m=+491.001414249" observedRunningTime="2026-04-16 16:55:58.872879514 +0000 UTC m=+492.081651819" watchObservedRunningTime="2026-04-16 16:55:58.874171726 +0000 UTC m=+492.082944036" Apr 16 16:56:29.863306 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:56:29.863279 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-vcvdm" Apr 16 16:57:04.207825 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:57:04.207786 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-lfqdm"] Apr 16 16:57:04.211321 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:57:04.211303 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-lfqdm" Apr 16 16:57:04.223094 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:57:04.223070 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 16:57:04.223868 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:57:04.223436 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-6ct8d\"" Apr 16 16:57:04.226207 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:57:04.226186 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-lfqdm"] Apr 16 16:57:04.255116 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:57:04.255086 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ed462eb3-83fa-424e-bdb9-39a75cc1c267-tls-certs\") pod \"model-serving-api-86f7b4b499-lfqdm\" (UID: \"ed462eb3-83fa-424e-bdb9-39a75cc1c267\") " pod="kserve/model-serving-api-86f7b4b499-lfqdm" Apr 16 16:57:04.255252 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:57:04.255197 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74kxx\" (UniqueName: \"kubernetes.io/projected/ed462eb3-83fa-424e-bdb9-39a75cc1c267-kube-api-access-74kxx\") pod \"model-serving-api-86f7b4b499-lfqdm\" (UID: \"ed462eb3-83fa-424e-bdb9-39a75cc1c267\") " pod="kserve/model-serving-api-86f7b4b499-lfqdm" Apr 16 16:57:04.356097 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:57:04.356070 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ed462eb3-83fa-424e-bdb9-39a75cc1c267-tls-certs\") pod \"model-serving-api-86f7b4b499-lfqdm\" (UID: \"ed462eb3-83fa-424e-bdb9-39a75cc1c267\") " pod="kserve/model-serving-api-86f7b4b499-lfqdm" Apr 16 16:57:04.356224 ip-10-0-128-130 
kubenswrapper[2568]: E0416 16:57:04.356114 2568 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 16 16:57:04.356224 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:57:04.356161 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-74kxx\" (UniqueName: \"kubernetes.io/projected/ed462eb3-83fa-424e-bdb9-39a75cc1c267-kube-api-access-74kxx\") pod \"model-serving-api-86f7b4b499-lfqdm\" (UID: \"ed462eb3-83fa-424e-bdb9-39a75cc1c267\") " pod="kserve/model-serving-api-86f7b4b499-lfqdm" Apr 16 16:57:04.356224 ip-10-0-128-130 kubenswrapper[2568]: E0416 16:57:04.356182 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed462eb3-83fa-424e-bdb9-39a75cc1c267-tls-certs podName:ed462eb3-83fa-424e-bdb9-39a75cc1c267 nodeName:}" failed. No retries permitted until 2026-04-16 16:57:04.856163723 +0000 UTC m=+558.064936015 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/ed462eb3-83fa-424e-bdb9-39a75cc1c267-tls-certs") pod "model-serving-api-86f7b4b499-lfqdm" (UID: "ed462eb3-83fa-424e-bdb9-39a75cc1c267") : secret "model-serving-api-tls" not found Apr 16 16:57:04.366401 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:57:04.366378 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-74kxx\" (UniqueName: \"kubernetes.io/projected/ed462eb3-83fa-424e-bdb9-39a75cc1c267-kube-api-access-74kxx\") pod \"model-serving-api-86f7b4b499-lfqdm\" (UID: \"ed462eb3-83fa-424e-bdb9-39a75cc1c267\") " pod="kserve/model-serving-api-86f7b4b499-lfqdm" Apr 16 16:57:04.861045 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:57:04.861014 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ed462eb3-83fa-424e-bdb9-39a75cc1c267-tls-certs\") pod \"model-serving-api-86f7b4b499-lfqdm\" (UID: 
\"ed462eb3-83fa-424e-bdb9-39a75cc1c267\") " pod="kserve/model-serving-api-86f7b4b499-lfqdm" Apr 16 16:57:04.863390 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:57:04.863359 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ed462eb3-83fa-424e-bdb9-39a75cc1c267-tls-certs\") pod \"model-serving-api-86f7b4b499-lfqdm\" (UID: \"ed462eb3-83fa-424e-bdb9-39a75cc1c267\") " pod="kserve/model-serving-api-86f7b4b499-lfqdm" Apr 16 16:57:05.135631 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:57:05.135544 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-lfqdm" Apr 16 16:57:05.251105 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:57:05.251075 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-lfqdm"] Apr 16 16:57:05.254037 ip-10-0-128-130 kubenswrapper[2568]: W0416 16:57:05.254005 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded462eb3_83fa_424e_bdb9_39a75cc1c267.slice/crio-30fcaf2804fc6b9b5bc4b725ab4053901c1ba8e176ca2319e4a865f2fbf8425c WatchSource:0}: Error finding container 30fcaf2804fc6b9b5bc4b725ab4053901c1ba8e176ca2319e4a865f2fbf8425c: Status 404 returned error can't find the container with id 30fcaf2804fc6b9b5bc4b725ab4053901c1ba8e176ca2319e4a865f2fbf8425c Apr 16 16:57:06.071430 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:57:06.071385 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-lfqdm" event={"ID":"ed462eb3-83fa-424e-bdb9-39a75cc1c267","Type":"ContainerStarted","Data":"30fcaf2804fc6b9b5bc4b725ab4053901c1ba8e176ca2319e4a865f2fbf8425c"} Apr 16 16:57:08.084611 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:57:08.084574 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-lfqdm" 
event={"ID":"ed462eb3-83fa-424e-bdb9-39a75cc1c267","Type":"ContainerStarted","Data":"96ded2c16be4898b0b9d24a7df18ceb7d4a0cef1b151b095ee3e50a71f92a0f1"} Apr 16 16:57:08.100350 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:57:08.100303 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-lfqdm" podStartSLOduration=1.864006367 podStartE2EDuration="4.100292005s" podCreationTimestamp="2026-04-16 16:57:04 +0000 UTC" firstStartedPulling="2026-04-16 16:57:05.255766541 +0000 UTC m=+558.464538834" lastFinishedPulling="2026-04-16 16:57:07.492052179 +0000 UTC m=+560.700824472" observedRunningTime="2026-04-16 16:57:08.099028853 +0000 UTC m=+561.307801165" watchObservedRunningTime="2026-04-16 16:57:08.100292005 +0000 UTC m=+561.309064315" Apr 16 16:57:09.087923 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:57:09.087890 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-lfqdm" Apr 16 16:57:20.094154 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:57:20.094125 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-lfqdm" Apr 16 16:57:47.274021 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:57:47.273984 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-d5cmb_9738e546-8b4d-4c0f-952e-9a361c4b5f7a/console-operator/2.log" Apr 16 16:57:47.275613 ip-10-0-128-130 kubenswrapper[2568]: I0416 16:57:47.275573 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-d5cmb_9738e546-8b4d-4c0f-952e-9a361c4b5f7a/console-operator/2.log" Apr 16 17:00:39.789114 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:00:39.789080 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn"] Apr 16 17:00:39.792028 ip-10-0-128-130 
kubenswrapper[2568]: I0416 17:00:39.792009 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn" Apr 16 17:00:39.794312 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:00:39.794290 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-231d1-serving-cert\"" Apr 16 17:00:39.794423 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:00:39.794337 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-231d1-kube-rbac-proxy-sar-config\"" Apr 16 17:00:39.794423 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:00:39.794350 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 17:00:39.794423 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:00:39.794375 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-bgqcs\"" Apr 16 17:00:39.799686 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:00:39.799667 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn"] Apr 16 17:00:39.893685 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:00:39.893654 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53564292-574b-468b-9b6d-e4b43ee10ca7-proxy-tls\") pod \"model-chainer-raw-231d1-5878d8457d-nxjqn\" (UID: \"53564292-574b-468b-9b6d-e4b43ee10ca7\") " pod="kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn" Apr 16 17:00:39.893832 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:00:39.893729 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/53564292-574b-468b-9b6d-e4b43ee10ca7-openshift-service-ca-bundle\") pod \"model-chainer-raw-231d1-5878d8457d-nxjqn\" (UID: \"53564292-574b-468b-9b6d-e4b43ee10ca7\") " pod="kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn" Apr 16 17:00:39.994668 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:00:39.994629 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53564292-574b-468b-9b6d-e4b43ee10ca7-openshift-service-ca-bundle\") pod \"model-chainer-raw-231d1-5878d8457d-nxjqn\" (UID: \"53564292-574b-468b-9b6d-e4b43ee10ca7\") " pod="kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn" Apr 16 17:00:39.994828 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:00:39.994792 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53564292-574b-468b-9b6d-e4b43ee10ca7-proxy-tls\") pod \"model-chainer-raw-231d1-5878d8457d-nxjqn\" (UID: \"53564292-574b-468b-9b6d-e4b43ee10ca7\") " pod="kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn" Apr 16 17:00:39.994946 ip-10-0-128-130 kubenswrapper[2568]: E0416 17:00:39.994930 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-231d1-serving-cert: secret "model-chainer-raw-231d1-serving-cert" not found Apr 16 17:00:39.995022 ip-10-0-128-130 kubenswrapper[2568]: E0416 17:00:39.995011 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/53564292-574b-468b-9b6d-e4b43ee10ca7-proxy-tls podName:53564292-574b-468b-9b6d-e4b43ee10ca7 nodeName:}" failed. No retries permitted until 2026-04-16 17:00:40.494989599 +0000 UTC m=+773.703761892 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/53564292-574b-468b-9b6d-e4b43ee10ca7-proxy-tls") pod "model-chainer-raw-231d1-5878d8457d-nxjqn" (UID: "53564292-574b-468b-9b6d-e4b43ee10ca7") : secret "model-chainer-raw-231d1-serving-cert" not found Apr 16 17:00:39.995281 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:00:39.995262 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53564292-574b-468b-9b6d-e4b43ee10ca7-openshift-service-ca-bundle\") pod \"model-chainer-raw-231d1-5878d8457d-nxjqn\" (UID: \"53564292-574b-468b-9b6d-e4b43ee10ca7\") " pod="kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn" Apr 16 17:00:40.500521 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:00:40.500491 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53564292-574b-468b-9b6d-e4b43ee10ca7-proxy-tls\") pod \"model-chainer-raw-231d1-5878d8457d-nxjqn\" (UID: \"53564292-574b-468b-9b6d-e4b43ee10ca7\") " pod="kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn" Apr 16 17:00:40.502898 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:00:40.502878 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53564292-574b-468b-9b6d-e4b43ee10ca7-proxy-tls\") pod \"model-chainer-raw-231d1-5878d8457d-nxjqn\" (UID: \"53564292-574b-468b-9b6d-e4b43ee10ca7\") " pod="kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn" Apr 16 17:00:40.702943 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:00:40.702905 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn" Apr 16 17:00:40.824532 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:00:40.821276 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn"] Apr 16 17:00:40.826136 ip-10-0-128-130 kubenswrapper[2568]: W0416 17:00:40.826098 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53564292_574b_468b_9b6d_e4b43ee10ca7.slice/crio-e89f015e2ac143e6addbbdd087fb03b3d22da89ece91624c624c870152e0143b WatchSource:0}: Error finding container e89f015e2ac143e6addbbdd087fb03b3d22da89ece91624c624c870152e0143b: Status 404 returned error can't find the container with id e89f015e2ac143e6addbbdd087fb03b3d22da89ece91624c624c870152e0143b Apr 16 17:00:40.828024 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:00:40.827995 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 17:00:41.743473 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:00:41.743434 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn" event={"ID":"53564292-574b-468b-9b6d-e4b43ee10ca7","Type":"ContainerStarted","Data":"e89f015e2ac143e6addbbdd087fb03b3d22da89ece91624c624c870152e0143b"} Apr 16 17:00:43.752025 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:00:43.751988 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn" event={"ID":"53564292-574b-468b-9b6d-e4b43ee10ca7","Type":"ContainerStarted","Data":"cca0ec9013d2adba51177e6f7a9d86ca6a8c8fef23e517a14220904614efbf2c"} Apr 16 17:00:43.752492 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:00:43.752170 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn" Apr 16 17:00:43.766652 
ip-10-0-128-130 kubenswrapper[2568]: I0416 17:00:43.766582 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn" podStartSLOduration=2.504531562 podStartE2EDuration="4.766567606s" podCreationTimestamp="2026-04-16 17:00:39 +0000 UTC" firstStartedPulling="2026-04-16 17:00:40.828152109 +0000 UTC m=+774.036924397" lastFinishedPulling="2026-04-16 17:00:43.090188151 +0000 UTC m=+776.298960441" observedRunningTime="2026-04-16 17:00:43.764867492 +0000 UTC m=+776.973639804" watchObservedRunningTime="2026-04-16 17:00:43.766567606 +0000 UTC m=+776.975339917" Apr 16 17:00:49.761508 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:00:49.761481 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn" Apr 16 17:00:49.882342 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:00:49.882308 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn"] Apr 16 17:00:49.882546 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:00:49.882508 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn" podUID="53564292-574b-468b-9b6d-e4b43ee10ca7" containerName="model-chainer-raw-231d1" containerID="cri-o://cca0ec9013d2adba51177e6f7a9d86ca6a8c8fef23e517a14220904614efbf2c" gracePeriod=30 Apr 16 17:00:54.759760 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:00:54.759698 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn" podUID="53564292-574b-468b-9b6d-e4b43ee10ca7" containerName="model-chainer-raw-231d1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 17:00:59.759303 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:00:59.759268 2568 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn" podUID="53564292-574b-468b-9b6d-e4b43ee10ca7" containerName="model-chainer-raw-231d1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 17:01:04.759838 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:01:04.759801 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn" podUID="53564292-574b-468b-9b6d-e4b43ee10ca7" containerName="model-chainer-raw-231d1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 17:01:04.760215 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:01:04.759913 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn" Apr 16 17:01:09.760491 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:01:09.760401 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn" podUID="53564292-574b-468b-9b6d-e4b43ee10ca7" containerName="model-chainer-raw-231d1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 17:01:14.759978 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:01:14.759942 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn" podUID="53564292-574b-468b-9b6d-e4b43ee10ca7" containerName="model-chainer-raw-231d1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 17:01:19.759693 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:01:19.759656 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn" podUID="53564292-574b-468b-9b6d-e4b43ee10ca7" containerName="model-chainer-raw-231d1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 17:01:20.020855 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:01:20.020799 2568 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn" Apr 16 17:01:20.097484 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:01:20.097456 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53564292-574b-468b-9b6d-e4b43ee10ca7-proxy-tls\") pod \"53564292-574b-468b-9b6d-e4b43ee10ca7\" (UID: \"53564292-574b-468b-9b6d-e4b43ee10ca7\") " Apr 16 17:01:20.097484 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:01:20.097485 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53564292-574b-468b-9b6d-e4b43ee10ca7-openshift-service-ca-bundle\") pod \"53564292-574b-468b-9b6d-e4b43ee10ca7\" (UID: \"53564292-574b-468b-9b6d-e4b43ee10ca7\") " Apr 16 17:01:20.097892 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:01:20.097872 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53564292-574b-468b-9b6d-e4b43ee10ca7-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "53564292-574b-468b-9b6d-e4b43ee10ca7" (UID: "53564292-574b-468b-9b6d-e4b43ee10ca7"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:01:20.099535 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:01:20.099511 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53564292-574b-468b-9b6d-e4b43ee10ca7-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "53564292-574b-468b-9b6d-e4b43ee10ca7" (UID: "53564292-574b-468b-9b6d-e4b43ee10ca7"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:01:20.198085 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:01:20.198048 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/53564292-574b-468b-9b6d-e4b43ee10ca7-proxy-tls\") on node \"ip-10-0-128-130.ec2.internal\" DevicePath \"\"" Apr 16 17:01:20.198085 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:01:20.198082 2568 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/53564292-574b-468b-9b6d-e4b43ee10ca7-openshift-service-ca-bundle\") on node \"ip-10-0-128-130.ec2.internal\" DevicePath \"\"" Apr 16 17:01:20.870439 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:01:20.870403 2568 generic.go:358] "Generic (PLEG): container finished" podID="53564292-574b-468b-9b6d-e4b43ee10ca7" containerID="cca0ec9013d2adba51177e6f7a9d86ca6a8c8fef23e517a14220904614efbf2c" exitCode=0 Apr 16 17:01:20.870898 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:01:20.870476 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn" Apr 16 17:01:20.870898 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:01:20.870484 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn" event={"ID":"53564292-574b-468b-9b6d-e4b43ee10ca7","Type":"ContainerDied","Data":"cca0ec9013d2adba51177e6f7a9d86ca6a8c8fef23e517a14220904614efbf2c"} Apr 16 17:01:20.870898 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:01:20.870520 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn" event={"ID":"53564292-574b-468b-9b6d-e4b43ee10ca7","Type":"ContainerDied","Data":"e89f015e2ac143e6addbbdd087fb03b3d22da89ece91624c624c870152e0143b"} Apr 16 17:01:20.870898 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:01:20.870534 2568 scope.go:117] "RemoveContainer" containerID="cca0ec9013d2adba51177e6f7a9d86ca6a8c8fef23e517a14220904614efbf2c" Apr 16 17:01:20.878953 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:01:20.878925 2568 scope.go:117] "RemoveContainer" containerID="cca0ec9013d2adba51177e6f7a9d86ca6a8c8fef23e517a14220904614efbf2c" Apr 16 17:01:20.879228 ip-10-0-128-130 kubenswrapper[2568]: E0416 17:01:20.879192 2568 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cca0ec9013d2adba51177e6f7a9d86ca6a8c8fef23e517a14220904614efbf2c\": container with ID starting with cca0ec9013d2adba51177e6f7a9d86ca6a8c8fef23e517a14220904614efbf2c not found: ID does not exist" containerID="cca0ec9013d2adba51177e6f7a9d86ca6a8c8fef23e517a14220904614efbf2c" Apr 16 17:01:20.879280 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:01:20.879248 2568 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cca0ec9013d2adba51177e6f7a9d86ca6a8c8fef23e517a14220904614efbf2c"} err="failed to get container status 
\"cca0ec9013d2adba51177e6f7a9d86ca6a8c8fef23e517a14220904614efbf2c\": rpc error: code = NotFound desc = could not find container \"cca0ec9013d2adba51177e6f7a9d86ca6a8c8fef23e517a14220904614efbf2c\": container with ID starting with cca0ec9013d2adba51177e6f7a9d86ca6a8c8fef23e517a14220904614efbf2c not found: ID does not exist" Apr 16 17:01:20.889980 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:01:20.889935 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn"] Apr 16 17:01:20.892337 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:01:20.892317 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-231d1-5878d8457d-nxjqn"] Apr 16 17:01:21.358087 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:01:21.358061 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53564292-574b-468b-9b6d-e4b43ee10ca7" path="/var/lib/kubelet/pods/53564292-574b-468b-9b6d-e4b43ee10ca7/volumes" Apr 16 17:02:20.093083 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:20.093051 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22"] Apr 16 17:02:20.093552 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:20.093383 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="53564292-574b-468b-9b6d-e4b43ee10ca7" containerName="model-chainer-raw-231d1" Apr 16 17:02:20.093552 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:20.093394 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="53564292-574b-468b-9b6d-e4b43ee10ca7" containerName="model-chainer-raw-231d1" Apr 16 17:02:20.093552 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:20.093448 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="53564292-574b-468b-9b6d-e4b43ee10ca7" containerName="model-chainer-raw-231d1" Apr 16 17:02:20.095310 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:20.095290 2568 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22" Apr 16 17:02:20.097915 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:20.097885 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-4ab2c-kube-rbac-proxy-sar-config\"" Apr 16 17:02:20.097915 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:20.097905 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 17:02:20.098117 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:20.097923 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-bgqcs\"" Apr 16 17:02:20.098164 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:20.098139 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-4ab2c-serving-cert\"" Apr 16 17:02:20.105566 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:20.105546 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22"] Apr 16 17:02:20.117513 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:20.117493 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/573278f1-6e5f-4c4c-b915-080b1194d410-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22\" (UID: \"573278f1-6e5f-4c4c-b915-080b1194d410\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22" Apr 16 17:02:20.117626 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:20.117535 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/573278f1-6e5f-4c4c-b915-080b1194d410-proxy-tls\") pod \"model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22\" (UID: \"573278f1-6e5f-4c4c-b915-080b1194d410\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22" Apr 16 17:02:20.218103 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:20.218069 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/573278f1-6e5f-4c4c-b915-080b1194d410-proxy-tls\") pod \"model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22\" (UID: \"573278f1-6e5f-4c4c-b915-080b1194d410\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22" Apr 16 17:02:20.218258 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:20.218141 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/573278f1-6e5f-4c4c-b915-080b1194d410-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22\" (UID: \"573278f1-6e5f-4c4c-b915-080b1194d410\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22" Apr 16 17:02:20.218258 ip-10-0-128-130 kubenswrapper[2568]: E0416 17:02:20.218203 2568 secret.go:189] Couldn't get secret kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-serving-cert: secret "model-chainer-raw-hpa-4ab2c-serving-cert" not found Apr 16 17:02:20.218343 ip-10-0-128-130 kubenswrapper[2568]: E0416 17:02:20.218278 2568 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/573278f1-6e5f-4c4c-b915-080b1194d410-proxy-tls podName:573278f1-6e5f-4c4c-b915-080b1194d410 nodeName:}" failed. No retries permitted until 2026-04-16 17:02:20.718261946 +0000 UTC m=+873.927034234 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/573278f1-6e5f-4c4c-b915-080b1194d410-proxy-tls") pod "model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22" (UID: "573278f1-6e5f-4c4c-b915-080b1194d410") : secret "model-chainer-raw-hpa-4ab2c-serving-cert" not found Apr 16 17:02:20.218730 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:20.218713 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/573278f1-6e5f-4c4c-b915-080b1194d410-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22\" (UID: \"573278f1-6e5f-4c4c-b915-080b1194d410\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22" Apr 16 17:02:20.721925 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:20.721891 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/573278f1-6e5f-4c4c-b915-080b1194d410-proxy-tls\") pod \"model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22\" (UID: \"573278f1-6e5f-4c4c-b915-080b1194d410\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22" Apr 16 17:02:20.724113 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:20.724093 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/573278f1-6e5f-4c4c-b915-080b1194d410-proxy-tls\") pod \"model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22\" (UID: \"573278f1-6e5f-4c4c-b915-080b1194d410\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22" Apr 16 17:02:21.005502 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:21.005418 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22" Apr 16 17:02:21.324889 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:21.324864 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22"] Apr 16 17:02:21.326783 ip-10-0-128-130 kubenswrapper[2568]: W0416 17:02:21.326758 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod573278f1_6e5f_4c4c_b915_080b1194d410.slice/crio-1d2cceecad7735a6c838e61b1350bcd48017b690722a1bb8fcfe3de3ecdf0577 WatchSource:0}: Error finding container 1d2cceecad7735a6c838e61b1350bcd48017b690722a1bb8fcfe3de3ecdf0577: Status 404 returned error can't find the container with id 1d2cceecad7735a6c838e61b1350bcd48017b690722a1bb8fcfe3de3ecdf0577 Apr 16 17:02:22.057298 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:22.057267 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22" event={"ID":"573278f1-6e5f-4c4c-b915-080b1194d410","Type":"ContainerStarted","Data":"23bc3304f50a5c36e33031593ee37e36213e9a7fe02f5f16475bc049d0f1f9b1"} Apr 16 17:02:22.057298 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:22.057301 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22" event={"ID":"573278f1-6e5f-4c4c-b915-080b1194d410","Type":"ContainerStarted","Data":"1d2cceecad7735a6c838e61b1350bcd48017b690722a1bb8fcfe3de3ecdf0577"} Apr 16 17:02:22.057510 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:22.057424 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22" Apr 16 17:02:22.076050 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:22.076010 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22" podStartSLOduration=2.075998908 podStartE2EDuration="2.075998908s" podCreationTimestamp="2026-04-16 17:02:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:02:22.073365503 +0000 UTC m=+875.282137839" watchObservedRunningTime="2026-04-16 17:02:22.075998908 +0000 UTC m=+875.284771218" Apr 16 17:02:28.065762 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:28.065733 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22" Apr 16 17:02:30.143879 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:30.143842 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22"] Apr 16 17:02:30.144256 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:30.144121 2568 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22" podUID="573278f1-6e5f-4c4c-b915-080b1194d410" containerName="model-chainer-raw-hpa-4ab2c" containerID="cri-o://23bc3304f50a5c36e33031593ee37e36213e9a7fe02f5f16475bc049d0f1f9b1" gracePeriod=30 Apr 16 17:02:33.063948 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:33.063915 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22" podUID="573278f1-6e5f-4c4c-b915-080b1194d410" containerName="model-chainer-raw-hpa-4ab2c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 17:02:38.063981 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:38.063943 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22" podUID="573278f1-6e5f-4c4c-b915-080b1194d410" containerName="model-chainer-raw-hpa-4ab2c" probeResult="failure" 
output="HTTP probe failed with statuscode: 503" Apr 16 17:02:43.063805 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:43.063769 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22" podUID="573278f1-6e5f-4c4c-b915-080b1194d410" containerName="model-chainer-raw-hpa-4ab2c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 17:02:43.064164 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:43.063873 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22" Apr 16 17:02:47.296225 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:47.296196 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-d5cmb_9738e546-8b4d-4c0f-952e-9a361c4b5f7a/console-operator/2.log" Apr 16 17:02:47.298970 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:47.298946 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-d5cmb_9738e546-8b4d-4c0f-952e-9a361c4b5f7a/console-operator/2.log" Apr 16 17:02:48.063805 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:48.063772 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22" podUID="573278f1-6e5f-4c4c-b915-080b1194d410" containerName="model-chainer-raw-hpa-4ab2c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 17:02:53.064181 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:02:53.064145 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22" podUID="573278f1-6e5f-4c4c-b915-080b1194d410" containerName="model-chainer-raw-hpa-4ab2c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 17:02:58.064513 ip-10-0-128-130 kubenswrapper[2568]: I0416 
17:02:58.064475 2568 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22" podUID="573278f1-6e5f-4c4c-b915-080b1194d410" containerName="model-chainer-raw-hpa-4ab2c" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 17:03:00.166544 ip-10-0-128-130 kubenswrapper[2568]: E0416 17:03:00.166423 2568 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod573278f1_6e5f_4c4c_b915_080b1194d410.slice/crio-23bc3304f50a5c36e33031593ee37e36213e9a7fe02f5f16475bc049d0f1f9b1.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod573278f1_6e5f_4c4c_b915_080b1194d410.slice/crio-conmon-23bc3304f50a5c36e33031593ee37e36213e9a7fe02f5f16475bc049d0f1f9b1.scope\": RecentStats: unable to find data in memory cache]" Apr 16 17:03:00.166907 ip-10-0-128-130 kubenswrapper[2568]: E0416 17:03:00.166707 2568 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod573278f1_6e5f_4c4c_b915_080b1194d410.slice/crio-23bc3304f50a5c36e33031593ee37e36213e9a7fe02f5f16475bc049d0f1f9b1.scope\": RecentStats: unable to find data in memory cache]" Apr 16 17:03:00.166964 ip-10-0-128-130 kubenswrapper[2568]: E0416 17:03:00.166910 2568 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod573278f1_6e5f_4c4c_b915_080b1194d410.slice/crio-conmon-23bc3304f50a5c36e33031593ee37e36213e9a7fe02f5f16475bc049d0f1f9b1.scope\": RecentStats: unable to find data in memory cache]" Apr 16 17:03:00.173250 ip-10-0-128-130 kubenswrapper[2568]: E0416 17:03:00.167110 2568 cadvisor_stats_provider.go:525] "Partial failure issuing 
cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod573278f1_6e5f_4c4c_b915_080b1194d410.slice/crio-1d2cceecad7735a6c838e61b1350bcd48017b690722a1bb8fcfe3de3ecdf0577\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod573278f1_6e5f_4c4c_b915_080b1194d410.slice/crio-conmon-23bc3304f50a5c36e33031593ee37e36213e9a7fe02f5f16475bc049d0f1f9b1.scope\": RecentStats: unable to find data in memory cache]" Apr 16 17:03:00.180626 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:03:00.180583 2568 generic.go:358] "Generic (PLEG): container finished" podID="573278f1-6e5f-4c4c-b915-080b1194d410" containerID="23bc3304f50a5c36e33031593ee37e36213e9a7fe02f5f16475bc049d0f1f9b1" exitCode=0 Apr 16 17:03:00.180739 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:03:00.180623 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22" event={"ID":"573278f1-6e5f-4c4c-b915-080b1194d410","Type":"ContainerDied","Data":"23bc3304f50a5c36e33031593ee37e36213e9a7fe02f5f16475bc049d0f1f9b1"} Apr 16 17:03:00.285407 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:03:00.285378 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22" Apr 16 17:03:00.424459 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:03:00.424382 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/573278f1-6e5f-4c4c-b915-080b1194d410-proxy-tls\") pod \"573278f1-6e5f-4c4c-b915-080b1194d410\" (UID: \"573278f1-6e5f-4c4c-b915-080b1194d410\") " Apr 16 17:03:00.424618 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:03:00.424480 2568 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/573278f1-6e5f-4c4c-b915-080b1194d410-openshift-service-ca-bundle\") pod \"573278f1-6e5f-4c4c-b915-080b1194d410\" (UID: \"573278f1-6e5f-4c4c-b915-080b1194d410\") " Apr 16 17:03:00.424905 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:03:00.424874 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/573278f1-6e5f-4c4c-b915-080b1194d410-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "573278f1-6e5f-4c4c-b915-080b1194d410" (UID: "573278f1-6e5f-4c4c-b915-080b1194d410"). InnerVolumeSpecName "openshift-service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 17:03:00.425039 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:03:00.424958 2568 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/573278f1-6e5f-4c4c-b915-080b1194d410-openshift-service-ca-bundle\") on node \"ip-10-0-128-130.ec2.internal\" DevicePath \"\"" Apr 16 17:03:00.426517 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:03:00.426492 2568 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/573278f1-6e5f-4c4c-b915-080b1194d410-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "573278f1-6e5f-4c4c-b915-080b1194d410" (UID: "573278f1-6e5f-4c4c-b915-080b1194d410"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 17:03:00.525437 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:03:00.525411 2568 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/573278f1-6e5f-4c4c-b915-080b1194d410-proxy-tls\") on node \"ip-10-0-128-130.ec2.internal\" DevicePath \"\"" Apr 16 17:03:01.185725 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:03:01.185689 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22" event={"ID":"573278f1-6e5f-4c4c-b915-080b1194d410","Type":"ContainerDied","Data":"1d2cceecad7735a6c838e61b1350bcd48017b690722a1bb8fcfe3de3ecdf0577"} Apr 16 17:03:01.185725 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:03:01.185722 2568 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22" Apr 16 17:03:01.186185 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:03:01.185738 2568 scope.go:117] "RemoveContainer" containerID="23bc3304f50a5c36e33031593ee37e36213e9a7fe02f5f16475bc049d0f1f9b1" Apr 16 17:03:01.206443 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:03:01.206418 2568 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22"] Apr 16 17:03:01.210114 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:03:01.210093 2568 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-4ab2c-7f5758d7db-jtm22"] Apr 16 17:03:01.358354 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:03:01.358321 2568 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="573278f1-6e5f-4c4c-b915-080b1194d410" path="/var/lib/kubelet/pods/573278f1-6e5f-4c4c-b915-080b1194d410/volumes" Apr 16 17:07:47.320968 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:07:47.320929 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-d5cmb_9738e546-8b4d-4c0f-952e-9a361c4b5f7a/console-operator/2.log" Apr 16 17:07:47.326061 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:07:47.326039 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-d5cmb_9738e546-8b4d-4c0f-952e-9a361c4b5f7a/console-operator/2.log" Apr 16 17:10:58.597561 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:10:58.597530 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-8vlsd_be58460c-a5db-43bc-bafb-93a72415c5ea/global-pull-secret-syncer/0.log" Apr 16 17:10:58.697294 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:10:58.697265 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_konnectivity-agent-srdnj_42ba454f-7aed-4795-a3c0-cbc87b83def8/konnectivity-agent/0.log" Apr 16 17:10:58.741885 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:10:58.741856 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-128-130.ec2.internal_e3a1a534b3b465bad97fd00e274467c0/haproxy/0.log" Apr 16 17:11:02.094276 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:02.094248 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64/alertmanager/0.log" Apr 16 17:11:02.125172 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:02.125145 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64/config-reloader/0.log" Apr 16 17:11:02.154015 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:02.153943 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64/kube-rbac-proxy-web/0.log" Apr 16 17:11:02.180562 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:02.180543 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64/kube-rbac-proxy/0.log" Apr 16 17:11:02.207341 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:02.207324 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64/kube-rbac-proxy-metric/0.log" Apr 16 17:11:02.237114 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:02.237098 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64/prom-label-proxy/0.log" Apr 16 17:11:02.267781 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:02.267762 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_alertmanager-main-0_a1ab25dd-1a13-4c31-8f4f-4f9b5c5ccd64/init-config-reloader/0.log" Apr 16 17:11:02.340709 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:02.340691 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-2gdfq_7e647148-eee0-4e1c-a69d-27b49175a4fb/kube-state-metrics/0.log" Apr 16 17:11:02.364831 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:02.364802 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-2gdfq_7e647148-eee0-4e1c-a69d-27b49175a4fb/kube-rbac-proxy-main/0.log" Apr 16 17:11:02.387069 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:02.387051 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-2gdfq_7e647148-eee0-4e1c-a69d-27b49175a4fb/kube-rbac-proxy-self/0.log" Apr 16 17:11:02.664513 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:02.664489 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xmbwz_f8fc4c1b-8a9b-474a-ab6d-391214553bf2/node-exporter/0.log" Apr 16 17:11:02.687779 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:02.687750 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xmbwz_f8fc4c1b-8a9b-474a-ab6d-391214553bf2/kube-rbac-proxy/0.log" Apr 16 17:11:02.711986 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:02.711969 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-xmbwz_f8fc4c1b-8a9b-474a-ab6d-391214553bf2/init-textfile/0.log" Apr 16 17:11:03.012761 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:03.012732 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-dd9tc_7c06b334-8334-4e0d-bece-cf0e4c09fd87/prometheus-operator/0.log" Apr 16 17:11:03.034323 ip-10-0-128-130 kubenswrapper[2568]: I0416 
17:11:03.034296 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-dd9tc_7c06b334-8334-4e0d-bece-cf0e4c09fd87/kube-rbac-proxy/0.log" Apr 16 17:11:03.098687 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:03.098651 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5fd8c9f77b-cbb95_d0138511-2f89-4a9b-b233-0c90f6ac88af/telemeter-client/0.log" Apr 16 17:11:03.121828 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:03.121805 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5fd8c9f77b-cbb95_d0138511-2f89-4a9b-b233-0c90f6ac88af/reload/0.log" Apr 16 17:11:03.143978 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:03.143946 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-5fd8c9f77b-cbb95_d0138511-2f89-4a9b-b233-0c90f6ac88af/kube-rbac-proxy/0.log" Apr 16 17:11:04.876769 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:04.876741 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-d5cmb_9738e546-8b4d-4c0f-952e-9a361c4b5f7a/console-operator/2.log" Apr 16 17:11:04.881306 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:04.881278 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-d87b8d5fc-d5cmb_9738e546-8b4d-4c0f-952e-9a361c4b5f7a/console-operator/3.log" Apr 16 17:11:05.652683 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:05.652656 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-bj9hp_2d432fd0-ec22-4d57-ac82-a36eb0170cb7/volume-data-source-validator/0.log" Apr 16 17:11:06.359688 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.359659 2568 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_dns-default-p4bb2_3e2cc599-5a01-4b24-a80c-87b34418e1b6/dns/0.log" Apr 16 17:11:06.378828 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.378809 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-p4bb2_3e2cc599-5a01-4b24-a80c-87b34418e1b6/kube-rbac-proxy/0.log" Apr 16 17:11:06.445474 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.445456 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-v87xm_b60bf4ff-6c52-4d90-9a63-32a829bfc83e/dns-node-resolver/0.log" Apr 16 17:11:06.670632 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.670546 2568 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-2fspv/perf-node-gather-daemonset-w9r77"] Apr 16 17:11:06.670879 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.670866 2568 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="573278f1-6e5f-4c4c-b915-080b1194d410" containerName="model-chainer-raw-hpa-4ab2c" Apr 16 17:11:06.670937 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.670881 2568 state_mem.go:107] "Deleted CPUSet assignment" podUID="573278f1-6e5f-4c4c-b915-080b1194d410" containerName="model-chainer-raw-hpa-4ab2c" Apr 16 17:11:06.670973 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.670940 2568 memory_manager.go:356] "RemoveStaleState removing state" podUID="573278f1-6e5f-4c4c-b915-080b1194d410" containerName="model-chainer-raw-hpa-4ab2c" Apr 16 17:11:06.674337 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.674319 2568 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-w9r77"
Apr 16 17:11:06.676554 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.676531 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2fspv\"/\"openshift-service-ca.crt\""
Apr 16 17:11:06.676682 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.676556 2568 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-2fspv\"/\"default-dockercfg-f9wr2\""
Apr 16 17:11:06.677502 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.677485 2568 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-2fspv\"/\"kube-root-ca.crt\""
Apr 16 17:11:06.682665 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.682648 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2fspv/perf-node-gather-daemonset-w9r77"]
Apr 16 17:11:06.722850 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.722827 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f565eff4-7702-43e6-8129-6eb42f503461-lib-modules\") pod \"perf-node-gather-daemonset-w9r77\" (UID: \"f565eff4-7702-43e6-8129-6eb42f503461\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-w9r77"
Apr 16 17:11:06.722951 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.722856 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f565eff4-7702-43e6-8129-6eb42f503461-proc\") pod \"perf-node-gather-daemonset-w9r77\" (UID: \"f565eff4-7702-43e6-8129-6eb42f503461\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-w9r77"
Apr 16 17:11:06.722951 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.722873 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nfh2x\" (UniqueName: \"kubernetes.io/projected/f565eff4-7702-43e6-8129-6eb42f503461-kube-api-access-nfh2x\") pod \"perf-node-gather-daemonset-w9r77\" (UID: \"f565eff4-7702-43e6-8129-6eb42f503461\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-w9r77"
Apr 16 17:11:06.722951 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.722892 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f565eff4-7702-43e6-8129-6eb42f503461-sys\") pod \"perf-node-gather-daemonset-w9r77\" (UID: \"f565eff4-7702-43e6-8129-6eb42f503461\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-w9r77"
Apr 16 17:11:06.723058 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.722980 2568 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f565eff4-7702-43e6-8129-6eb42f503461-podres\") pod \"perf-node-gather-daemonset-w9r77\" (UID: \"f565eff4-7702-43e6-8129-6eb42f503461\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-w9r77"
Apr 16 17:11:06.824124 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.824100 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f565eff4-7702-43e6-8129-6eb42f503461-lib-modules\") pod \"perf-node-gather-daemonset-w9r77\" (UID: \"f565eff4-7702-43e6-8129-6eb42f503461\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-w9r77"
Apr 16 17:11:06.824220 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.824131 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f565eff4-7702-43e6-8129-6eb42f503461-proc\") pod \"perf-node-gather-daemonset-w9r77\" (UID: \"f565eff4-7702-43e6-8129-6eb42f503461\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-w9r77"
Apr 16 17:11:06.824220 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.824151 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nfh2x\" (UniqueName: \"kubernetes.io/projected/f565eff4-7702-43e6-8129-6eb42f503461-kube-api-access-nfh2x\") pod \"perf-node-gather-daemonset-w9r77\" (UID: \"f565eff4-7702-43e6-8129-6eb42f503461\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-w9r77"
Apr 16 17:11:06.824220 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.824167 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f565eff4-7702-43e6-8129-6eb42f503461-sys\") pod \"perf-node-gather-daemonset-w9r77\" (UID: \"f565eff4-7702-43e6-8129-6eb42f503461\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-w9r77"
Apr 16 17:11:06.824220 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.824196 2568 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f565eff4-7702-43e6-8129-6eb42f503461-podres\") pod \"perf-node-gather-daemonset-w9r77\" (UID: \"f565eff4-7702-43e6-8129-6eb42f503461\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-w9r77"
Apr 16 17:11:06.824476 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.824237 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f565eff4-7702-43e6-8129-6eb42f503461-proc\") pod \"perf-node-gather-daemonset-w9r77\" (UID: \"f565eff4-7702-43e6-8129-6eb42f503461\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-w9r77"
Apr 16 17:11:06.824476 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.824246 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f565eff4-7702-43e6-8129-6eb42f503461-sys\") pod \"perf-node-gather-daemonset-w9r77\" (UID: \"f565eff4-7702-43e6-8129-6eb42f503461\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-w9r77"
Apr 16 17:11:06.824476 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.824251 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f565eff4-7702-43e6-8129-6eb42f503461-lib-modules\") pod \"perf-node-gather-daemonset-w9r77\" (UID: \"f565eff4-7702-43e6-8129-6eb42f503461\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-w9r77"
Apr 16 17:11:06.824476 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.824291 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f565eff4-7702-43e6-8129-6eb42f503461-podres\") pod \"perf-node-gather-daemonset-w9r77\" (UID: \"f565eff4-7702-43e6-8129-6eb42f503461\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-w9r77"
Apr 16 17:11:06.831418 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.831395 2568 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nfh2x\" (UniqueName: \"kubernetes.io/projected/f565eff4-7702-43e6-8129-6eb42f503461-kube-api-access-nfh2x\") pod \"perf-node-gather-daemonset-w9r77\" (UID: \"f565eff4-7702-43e6-8129-6eb42f503461\") " pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-w9r77"
Apr 16 17:11:06.891200 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.891178 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-8slrg_5d59ae73-e194-4f66-9f72-a091634a4c01/node-ca/0.log"
Apr 16 17:11:06.985127 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:06.985109 2568 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-w9r77"
Apr 16 17:11:07.101217 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:07.101192 2568 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-2fspv/perf-node-gather-daemonset-w9r77"]
Apr 16 17:11:07.103692 ip-10-0-128-130 kubenswrapper[2568]: W0416 17:11:07.103667 2568 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf565eff4_7702_43e6_8129_6eb42f503461.slice/crio-e8fa1876c66494e704b0cc12a088916d25b57a50762cc3e4f2d32a95fc8f56e8 WatchSource:0}: Error finding container e8fa1876c66494e704b0cc12a088916d25b57a50762cc3e4f2d32a95fc8f56e8: Status 404 returned error can't find the container with id e8fa1876c66494e704b0cc12a088916d25b57a50762cc3e4f2d32a95fc8f56e8
Apr 16 17:11:07.105341 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:07.105326 2568 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 17:11:07.623468 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:07.623438 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-65d8b4d484-2r794_41054f43-1b5b-46d6-9aab-24d8b6ff5a23/router/0.log"
Apr 16 17:11:07.710223 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:07.710197 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-w9r77" event={"ID":"f565eff4-7702-43e6-8129-6eb42f503461","Type":"ContainerStarted","Data":"9aa9b35fef44917fd8f571aaabd2902a62d1b7d75924ad4da482924f53a7648f"}
Apr 16 17:11:07.710360 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:07.710229 2568 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-w9r77" event={"ID":"f565eff4-7702-43e6-8129-6eb42f503461","Type":"ContainerStarted","Data":"e8fa1876c66494e704b0cc12a088916d25b57a50762cc3e4f2d32a95fc8f56e8"}
Apr 16 17:11:07.710360 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:07.710259 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-w9r77"
Apr 16 17:11:07.725038 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:07.724995 2568 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-w9r77" podStartSLOduration=1.7249830990000001 podStartE2EDuration="1.724983099s" podCreationTimestamp="2026-04-16 17:11:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 17:11:07.723614433 +0000 UTC m=+1400.932386735" watchObservedRunningTime="2026-04-16 17:11:07.724983099 +0000 UTC m=+1400.933755409"
Apr 16 17:11:07.968542 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:07.968523 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-dcxhn_2a9408e4-8f5b-4f8e-b756-2d1f084e06a8/serve-healthcheck-canary/0.log"
Apr 16 17:11:08.318250 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:08.318152 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-jjnfw_25b8c537-ca17-48e3-ab8a-0d08b4ff09b7/insights-operator/0.log"
Apr 16 17:11:08.318433 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:08.318416 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-jjnfw_25b8c537-ca17-48e3-ab8a-0d08b4ff09b7/insights-operator/1.log"
Apr 16 17:11:08.463726 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:08.463704 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tjmzf_406fd5cc-e864-49b6-a250-066e0a57b355/kube-rbac-proxy/0.log"
Apr 16 17:11:08.486831 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:08.486808 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tjmzf_406fd5cc-e864-49b6-a250-066e0a57b355/exporter/0.log"
Apr 16 17:11:08.509669 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:08.509646 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tjmzf_406fd5cc-e864-49b6-a250-066e0a57b355/extractor/0.log"
Apr 16 17:11:10.401224 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:10.401190 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-vcvdm_0eb1c77a-a5aa-4112-bc53-ecb62776005c/manager/0.log"
Apr 16 17:11:10.423798 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:10.423774 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-lfqdm_ed462eb3-83fa-424e-bdb9-39a75cc1c267/server/0.log"
Apr 16 17:11:13.723034 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:13.723006 2568 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-2fspv/perf-node-gather-daemonset-w9r77"
Apr 16 17:11:15.399099 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:15.399072 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hjnws_da105279-c3bd-4e13-9bfc-0331c0b3ebd0/kube-multus-additional-cni-plugins/0.log"
Apr 16 17:11:15.421303 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:15.421280 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hjnws_da105279-c3bd-4e13-9bfc-0331c0b3ebd0/egress-router-binary-copy/0.log"
Apr 16 17:11:15.442376 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:15.442358 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hjnws_da105279-c3bd-4e13-9bfc-0331c0b3ebd0/cni-plugins/0.log"
Apr 16 17:11:15.463256 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:15.463235 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hjnws_da105279-c3bd-4e13-9bfc-0331c0b3ebd0/bond-cni-plugin/0.log"
Apr 16 17:11:15.483447 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:15.483427 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hjnws_da105279-c3bd-4e13-9bfc-0331c0b3ebd0/routeoverride-cni/0.log"
Apr 16 17:11:15.504180 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:15.504158 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hjnws_da105279-c3bd-4e13-9bfc-0331c0b3ebd0/whereabouts-cni-bincopy/0.log"
Apr 16 17:11:15.524824 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:15.524803 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-hjnws_da105279-c3bd-4e13-9bfc-0331c0b3ebd0/whereabouts-cni/0.log"
Apr 16 17:11:15.552155 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:15.552136 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-kznnf_6f46c58f-ae64-4611-b6f5-37bccf98d4af/kube-multus/0.log"
Apr 16 17:11:15.710484 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:15.710465 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gd4q4_545dd230-1d90-4e1a-8615-072dd9b2d2f5/network-metrics-daemon/0.log"
Apr 16 17:11:15.730712 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:15.730692 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-gd4q4_545dd230-1d90-4e1a-8615-072dd9b2d2f5/kube-rbac-proxy/0.log"
Apr 16 17:11:16.543058 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:16.542985 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5pfjs_2d7c70c5-3d64-495a-a048-265fbd988013/ovn-controller/0.log"
Apr 16 17:11:16.570876 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:16.570850 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5pfjs_2d7c70c5-3d64-495a-a048-265fbd988013/ovn-acl-logging/0.log"
Apr 16 17:11:16.594558 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:16.594526 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5pfjs_2d7c70c5-3d64-495a-a048-265fbd988013/kube-rbac-proxy-node/0.log"
Apr 16 17:11:16.616774 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:16.616750 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5pfjs_2d7c70c5-3d64-495a-a048-265fbd988013/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 17:11:16.636123 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:16.636085 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5pfjs_2d7c70c5-3d64-495a-a048-265fbd988013/northd/0.log"
Apr 16 17:11:16.668032 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:16.668009 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5pfjs_2d7c70c5-3d64-495a-a048-265fbd988013/nbdb/0.log"
Apr 16 17:11:16.690316 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:16.690293 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5pfjs_2d7c70c5-3d64-495a-a048-265fbd988013/sbdb/0.log"
Apr 16 17:11:16.782656 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:16.782627 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5pfjs_2d7c70c5-3d64-495a-a048-265fbd988013/ovnkube-controller/0.log"
Apr 16 17:11:18.348734 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:18.348705 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-7b678d77c7-zr45l_6ab091fc-21fb-49e0-b30a-43ff44f2e808/check-endpoints/0.log"
Apr 16 17:11:18.394679 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:18.394651 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-bwvhw_02af3612-a84f-46c2-81c2-fe094b0b75f8/network-check-target-container/0.log"
Apr 16 17:11:19.397298 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:19.397222 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-hsvxm_bd61ad6d-349b-4f2f-ba4c-bbe9aaf1fb28/iptables-alerter/0.log"
Apr 16 17:11:20.032854 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:20.032828 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-fl64p_7fcf2d50-decf-4050-b3bc-a82043f228fe/tuned/0.log"
Apr 16 17:11:23.312401 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:23.312369 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-29c96_635f4e42-2ddd-46d3-8a86-47f36f350728/csi-driver/0.log"
Apr 16 17:11:23.333953 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:23.333929 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-29c96_635f4e42-2ddd-46d3-8a86-47f36f350728/csi-node-driver-registrar/0.log"
Apr 16 17:11:23.355784 ip-10-0-128-130 kubenswrapper[2568]: I0416 17:11:23.355760 2568 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-csi-drivers_aws-ebs-csi-driver-node-29c96_635f4e42-2ddd-46d3-8a86-47f36f350728/csi-liveness-probe/0.log"