Apr 22 14:15:24.973037 ip-10-0-130-98 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 14:15:25.323445 ip-10-0-130-98 kubenswrapper[2575]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:15:25.323445 ip-10-0-130-98 kubenswrapper[2575]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 14:15:25.323445 ip-10-0-130-98 kubenswrapper[2575]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:15:25.323445 ip-10-0-130-98 kubenswrapper[2575]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 14:15:25.323445 ip-10-0-130-98 kubenswrapper[2575]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:15:25.325790 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.325698 2575 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 14:15:25.329776 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329762 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:25.329776 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329776 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:25.329844 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329780 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:25.329844 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329786 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
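[Editor's note] The deprecation notices above all point to the same remedy: carry the values in the KubeletConfiguration file named by --config (per the FLAG dump below, /etc/kubernetes/kubelet.conf on this node). A minimal sketch of that mapping, assuming the public k8s.io/kubelet/config/v1beta1 types and sigs.k8s.io/yaml; the values are copied from this log except the eviction threshold, which is a placeholder, and --pod-infra-container-image has no config-file equivalent because the sandbox image is now owned by the CRI runtime:

```go
package main

import (
	"fmt"

	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	kubeletv1beta1 "k8s.io/kubelet/config/v1beta1"
	"sigs.k8s.io/yaml"
)

func main() {
	cfg := kubeletv1beta1.KubeletConfiguration{
		TypeMeta: metav1.TypeMeta{
			APIVersion: "kubelet.config.k8s.io/v1beta1",
			Kind:       "KubeletConfiguration",
		},
		// Replaces --container-runtime-endpoint (the flag value from this
		// log, with the unix:// scheme the config-file form expects).
		ContainerRuntimeEndpoint: "unix:///var/run/crio/crio.sock",
		// Replaces --volume-plugin-dir.
		VolumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec",
		// Replaces --system-reserved=cpu=500m,ephemeral-storage=1Gi,memory=1Gi.
		SystemReserved: map[string]string{
			"cpu":               "500m",
			"ephemeral-storage": "1Gi",
			"memory":            "1Gi",
		},
		// --minimum-container-ttl-duration is superseded by eviction
		// thresholds; this particular value is illustrative, not from the log.
		EvictionHard: map[string]string{"memory.available": "100Mi"},
	}
	out, err := yaml.Marshal(&cfg)
	if err != nil {
		panic(err)
	}
	fmt.Print(string(out)) // YAML suitable for the file named by --config
}
```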
Apr 22 14:15:25.329844 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329790 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:25.329844 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329793 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:25.329844 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329796 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:25.329844 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329799 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:25.329844 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329801 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:25.329844 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329804 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:25.329844 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329807 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:25.329844 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329810 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:25.329844 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329813 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:25.329844 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329816 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:25.329844 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329818 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:25.329844 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329821 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:25.329844 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329824 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:25.329844 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329826 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:25.329844 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329829 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:25.329844 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329832 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:25.329844 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329834 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:25.330304 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329837 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:25.330304 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329839 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:25.330304 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329842 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:25.330304 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329844 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:25.330304 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329847 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:25.330304 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329850 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:25.330304 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329852 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:25.330304 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329855 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:25.330304 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329858 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:25.330304 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329861 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:25.330304 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329863 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:25.330304 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329867 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:25.330304 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329871 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:25.330304 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329874 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:25.330304 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329877 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:25.330304 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329879 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:25.330304 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329882 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:25.330304 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329884 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:25.330304 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329887 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:25.330828 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329890 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:25.330828 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329893 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:25.330828 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329895 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:25.330828 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329898 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:25.330828 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329900 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:25.330828 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329903 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:25.330828 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329905 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:25.330828 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329908 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:25.330828 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329910 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:25.330828 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329913 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:25.330828 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329915 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:25.330828 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329918 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:25.330828 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329920 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:25.330828 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329923 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:25.330828 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329926 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:25.330828 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329930 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:25.330828 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329932 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:25.330828 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329935 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:25.330828 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329938 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:25.330828 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329947 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:25.331321 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329949 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:25.331321 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329952 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:25.331321 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329955 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:25.331321 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329957 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:25.331321 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329960 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:25.331321 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329963 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:25.331321 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329965 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:25.331321 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329968 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:25.331321 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329970 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:25.331321 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329973 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:25.331321 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329975 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:25.331321 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329978 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:25.331321 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329980 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:25.331321 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329983 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:25.331321 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329985 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:25.331321 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329988 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:25.331321 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329990 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:25.331321 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329993 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:25.331321 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329995 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:25.331321 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.329998 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:25.331820 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330000 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:25.331820 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330002 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:25.331820 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330005 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:25.331820 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330007 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:25.331820 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330011 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:25.331820 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330013 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:25.331820 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330381 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:25.331820 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330386 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:25.331820 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330389 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:25.331820 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330395 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:25.331820 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330398 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:25.331820 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330401 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:25.331820 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330404 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:25.331820 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330406 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:25.331820 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330409 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:25.331820 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330411 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:25.331820 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330414 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:25.331820 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330417 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:25.331820 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330420 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:25.331820 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330424 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:25.332302 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330427 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:15:25.332302 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330430 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:15:25.332302 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330433 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:25.332302 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330436 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:25.332302 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330438 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:25.332302 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330441 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:15:25.332302 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330443 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:15:25.332302 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330446 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:25.332302 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330448 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:25.332302 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330451 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:25.332302 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330453 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:15:25.332302 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330456 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:25.332302 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330459 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:25.332302 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330461 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:25.332302 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330464 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:25.332302 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330466 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:25.332302 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330469 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:25.332302 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330473 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:25.332302 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330476 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:25.332302 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330479 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:25.332818 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330481 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:25.332818 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330484 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:25.332818 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330486 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:25.332818 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330490 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:25.332818 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330494 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:25.332818 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330496 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:15:25.332818 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330499 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:25.332818 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330501 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:25.332818 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330504 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:25.332818 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330515 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:15:25.332818 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330518 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:25.332818 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330521 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:25.332818 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330523 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:25.332818 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330525 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:25.332818 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330528 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:25.332818 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330531 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:25.332818 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330533 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:25.332818 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330536 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:25.332818 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330539 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:25.332818 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330541 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:25.333318 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330544 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:25.333318 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330546 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:15:25.333318 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330549 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:25.333318 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330551 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:25.333318 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330555 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:25.333318 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330558 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:25.333318 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330560 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:25.333318 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330562 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:25.333318 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330565 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:25.333318 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330568 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:15:25.333318 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330570 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:25.333318 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330573 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:15:25.333318 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330575 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:25.333318 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330578 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:15:25.333318 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330582 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:25.333318 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330584 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:25.333318 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330587 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:25.333318 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330590 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:25.333318 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330593 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:25.333318 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330595 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:25.333817 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330598 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:25.333817 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330600 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:25.333817 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330602 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:15:25.333817 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330605 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:25.333817 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330607 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:25.333817 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330609 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:15:25.333817 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330612 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:25.333817 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330614 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:25.333817 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330617 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:25.333817 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330619 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:25.333817 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330622 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:25.333817 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.330625 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:25.333817 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331652 2575 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 14:15:25.333817 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331661 2575 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 14:15:25.333817 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331667 2575 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 14:15:25.333817 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331672 2575 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 14:15:25.333817 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331677 2575 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 14:15:25.333817 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331680 2575 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 14:15:25.333817 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331684 2575 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 14:15:25.333817 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331688 2575 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 14:15:25.333817 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331692 2575 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 14:15:25.334333 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331695 2575 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 14:15:25.334333 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331699 2575 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 14:15:25.334333 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331702 2575 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 14:15:25.334333 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331705 2575 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 14:15:25.334333 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331708 2575 flags.go:64] FLAG: --cgroup-root=""
Apr 22 14:15:25.334333 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331711 2575 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 14:15:25.334333 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331714 2575 flags.go:64] FLAG: --client-ca-file=""
Apr 22 14:15:25.334333 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331717 2575 flags.go:64] FLAG: --cloud-config=""
Apr 22 14:15:25.334333 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331720 2575 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 14:15:25.334333 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331723 2575 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 14:15:25.334333 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331728 2575 flags.go:64] FLAG: --cluster-domain=""
Apr 22 14:15:25.334333 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331730 2575 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 14:15:25.334333 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331733 2575 flags.go:64] FLAG: --config-dir=""
Apr 22 14:15:25.334333 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331736 2575 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 14:15:25.334333 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331739 2575 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 14:15:25.334333 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331743 2575 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 14:15:25.334333 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331746 2575 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 14:15:25.334333 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331761 2575 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 14:15:25.334333 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331764 2575 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 14:15:25.334333 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331767 2575 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 14:15:25.334333 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331770 2575 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 14:15:25.334333 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331773 2575 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 14:15:25.334333 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331777 2575 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 14:15:25.334333 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331780 2575 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 14:15:25.334333 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331784 2575 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 14:15:25.334991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331787 2575 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 14:15:25.334991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331791 2575 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 14:15:25.334991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331793 2575 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 14:15:25.334991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331796 2575 flags.go:64] FLAG: --enable-server="true"
Apr 22 14:15:25.334991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331799 2575 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 14:15:25.334991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331804 2575 flags.go:64] FLAG: --event-burst="100"
Apr 22 14:15:25.334991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331808 2575 flags.go:64] FLAG: --event-qps="50"
Apr 22 14:15:25.334991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331811 2575 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 14:15:25.334991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331814 2575 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 14:15:25.334991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331817 2575 flags.go:64] FLAG: --eviction-hard=""
Apr 22 14:15:25.334991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331821 2575 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 14:15:25.334991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331824 2575 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 14:15:25.334991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331827 2575 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 14:15:25.334991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331830 2575 flags.go:64] FLAG: --eviction-soft=""
Apr 22 14:15:25.334991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331833 2575 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 14:15:25.334991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331836 2575 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 14:15:25.334991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331839 2575 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 22 14:15:25.334991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331842 2575 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 22 14:15:25.334991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331845 2575 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 22 14:15:25.334991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331848 2575 flags.go:64] FLAG: --fail-swap-on="true"
Apr 22 14:15:25.334991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331851 2575 flags.go:64] FLAG: --feature-gates=""
Apr 22 14:15:25.334991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331855 2575 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 22 14:15:25.334991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331858 2575 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 22 14:15:25.334991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331861 2575 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 22 14:15:25.334991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331864 2575 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 22 14:15:25.335590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331866 2575 flags.go:64] FLAG: --healthz-port="10248"
Apr 22 14:15:25.335590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331869 2575 flags.go:64] FLAG: --help="false"
Apr 22 14:15:25.335590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331872 2575 flags.go:64] FLAG: --hostname-override="ip-10-0-130-98.ec2.internal"
Apr 22 14:15:25.335590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331875 2575 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 22 14:15:25.335590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331878 2575 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 22 14:15:25.335590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331881 2575 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins"
Apr 22 14:15:25.335590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331884 2575 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml"
Apr 22 14:15:25.335590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331888 2575 flags.go:64] FLAG: --image-gc-high-threshold="85"
Apr 22 14:15:25.335590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331891 2575 flags.go:64] FLAG: --image-gc-low-threshold="80"
Apr 22 14:15:25.335590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331894 2575 flags.go:64] FLAG: --image-service-endpoint=""
Apr 22 14:15:25.335590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331897 2575 flags.go:64] FLAG: --kernel-memcg-notification="false"
Apr 22 14:15:25.335590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331900 2575 flags.go:64] FLAG: --kube-api-burst="100"
Apr 22 14:15:25.335590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331903 2575 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Apr 22 14:15:25.335590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331906 2575 flags.go:64] FLAG: --kube-api-qps="50"
Apr 22 14:15:25.335590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331909 2575 flags.go:64] FLAG: --kube-reserved=""
Apr 22 14:15:25.335590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331912 2575 flags.go:64] FLAG: --kube-reserved-cgroup=""
Apr 22 14:15:25.335590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331915 2575 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Apr 22 14:15:25.335590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331918 2575 flags.go:64] FLAG: --kubelet-cgroups=""
Apr 22 14:15:25.335590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331920 2575 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Apr 22 14:15:25.335590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331923 2575 flags.go:64] FLAG: --lock-file=""
Apr 22 14:15:25.335590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331926 2575 flags.go:64] FLAG: --log-cadvisor-usage="false"
Apr 22 14:15:25.335590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331928 2575 flags.go:64] FLAG: --log-flush-frequency="5s"
Apr 22 14:15:25.335590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331931 2575 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Apr 22 14:15:25.335590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331937 2575 flags.go:64] FLAG: --log-json-split-stream="false"
Apr 22 14:15:25.336287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331940 2575 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Apr 22 14:15:25.336287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331943 2575 flags.go:64] FLAG: --log-text-split-stream="false"
Apr 22 14:15:25.336287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331946 2575 flags.go:64] FLAG: --logging-format="text"
Apr 22 14:15:25.336287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331949 2575 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Apr 22 14:15:25.336287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331952 2575 flags.go:64] FLAG: --make-iptables-util-chains="true"
Apr 22 14:15:25.336287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331955 2575 flags.go:64] FLAG: --manifest-url=""
Apr 22 14:15:25.336287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331958 2575 flags.go:64] FLAG: --manifest-url-header=""
Apr 22 14:15:25.336287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331962 2575 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Apr 22 14:15:25.336287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331965 2575 flags.go:64] FLAG: --max-open-files="1000000"
Apr 22 14:15:25.336287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331969 2575 flags.go:64] FLAG: --max-pods="110"
Apr 22 14:15:25.336287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331971 2575 flags.go:64] FLAG: --maximum-dead-containers="-1"
Apr 22 14:15:25.336287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331974 2575 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Apr 22 14:15:25.336287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331977 2575 flags.go:64] FLAG: --memory-manager-policy="None"
Apr 22 14:15:25.336287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331980 2575 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Apr 22 14:15:25.336287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331983 2575 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Apr 22 14:15:25.336287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331986 2575 flags.go:64] FLAG: --node-ip="0.0.0.0"
Apr 22 14:15:25.336287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331989 2575 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel"
Apr 22 14:15:25.336287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.331997 2575 flags.go:64] FLAG: --node-status-max-images="50"
Apr 22 14:15:25.336287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332000 2575 flags.go:64] FLAG: --node-status-update-frequency="10s"
Apr 22 14:15:25.336287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332002 2575 flags.go:64] FLAG: --oom-score-adj="-999"
Apr 22 14:15:25.336287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332005 2575 flags.go:64] FLAG: --pod-cidr=""
Apr 22 14:15:25.336287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332008 2575 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715"
Apr 22 14:15:25.336287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332014 2575 flags.go:64] FLAG: --pod-manifest-path=""
Apr 22 14:15:25.336865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332017 2575 flags.go:64] FLAG: --pod-max-pids="-1"
Apr 22 14:15:25.336865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332020 2575 flags.go:64] FLAG: --pods-per-core="0"
Apr 22 14:15:25.336865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332023 2575 flags.go:64] FLAG: --port="10250"
Apr 22 14:15:25.336865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332026 2575 flags.go:64] FLAG: --protect-kernel-defaults="false"
Apr 22 14:15:25.336865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332029 2575 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-04c5936e7b370cf3d"
Apr 22 14:15:25.336865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332032 2575 flags.go:64] FLAG: --qos-reserved=""
Apr 22 14:15:25.336865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332035 2575 flags.go:64] FLAG: --read-only-port="10255"
Apr 22 14:15:25.336865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332038 2575 flags.go:64] FLAG: --register-node="true"
Apr 22 14:15:25.336865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332041 2575 flags.go:64] FLAG: --register-schedulable="true"
Apr 22 14:15:25.336865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332043 2575 flags.go:64] FLAG: --register-with-taints=""
Apr 22 14:15:25.336865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332047 2575 flags.go:64] FLAG: --registry-burst="10"
Apr 22 14:15:25.336865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332050 2575 flags.go:64] FLAG: --registry-qps="5"
Apr 22 14:15:25.336865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332053 2575 flags.go:64] FLAG: --reserved-cpus=""
Apr 22 14:15:25.336865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332056 2575 flags.go:64] FLAG: --reserved-memory=""
Apr 22 14:15:25.336865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332060 2575 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 22 14:15:25.336865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332063 2575 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 22 14:15:25.336865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332066 2575 flags.go:64] FLAG: --rotate-certificates="false"
Apr 22 14:15:25.336865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332070 2575 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 22 14:15:25.336865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332073 2575 flags.go:64] FLAG: --runonce="false"
Apr 22 14:15:25.336865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332075 2575 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 22 14:15:25.336865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332078 2575 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 22 14:15:25.336865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332082 2575 flags.go:64] FLAG: --seccomp-default="false"
Apr 22 14:15:25.336865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332084 2575 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 22 14:15:25.336865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332087 2575 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 22 14:15:25.336865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332090 2575 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 22 14:15:25.336865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332094 2575 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 22 14:15:25.337535 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332097 2575 flags.go:64] FLAG: --storage-driver-password="root"
Apr 22 14:15:25.337535 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332100 2575 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 22 14:15:25.337535 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332103 2575 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 22 14:15:25.337535 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332106 2575 flags.go:64] FLAG: --storage-driver-user="root"
Apr 22 14:15:25.337535 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332109 2575 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 22 14:15:25.337535 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332112 2575 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 22 14:15:25.337535 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332115 2575 flags.go:64] FLAG: --system-cgroups=""
Apr 22 14:15:25.337535 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332118 2575 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 22 14:15:25.337535 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332123 2575 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 22 14:15:25.337535 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332126 2575 flags.go:64] FLAG: --tls-cert-file=""
Apr 22 14:15:25.337535 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332129 2575 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 22 14:15:25.337535 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332132 2575 flags.go:64] FLAG: --tls-min-version=""
Apr 22 14:15:25.337535 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332135 2575 flags.go:64] FLAG: --tls-private-key-file=""
Apr 22 14:15:25.337535 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332138 2575 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 22 14:15:25.337535 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332141 2575 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 22 14:15:25.337535 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332144 2575 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 22 14:15:25.337535 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332147 2575 flags.go:64] FLAG: --v="2"
Apr 22 14:15:25.337535 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332151 2575 flags.go:64] FLAG: --version="false"
Apr 22 14:15:25.337535 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332155 2575 flags.go:64] FLAG: --vmodule=""
Apr 22 14:15:25.337535 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332160 2575 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 22 14:15:25.337535 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332163 2575 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 22 14:15:25.337535 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332253 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:15:25.337535 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332257 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:15:25.337535 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332260 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:15:25.338130 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332263 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:15:25.338130 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332266 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:15:25.338130 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332269 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:15:25.338130 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332272 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:15:25.338130 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332274 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:15:25.338130 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332277 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:15:25.338130 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332279 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:15:25.338130 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332286 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:15:25.338130 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332289 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:15:25.338130 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332291 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:15:25.338130 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332294 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:15:25.338130 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332297 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:15:25.338130 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332299 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:15:25.338130 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332302 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:15:25.338130 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332305 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:15:25.338130 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332308 2575 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:15:25.338130 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332311 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:15:25.338130 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332313 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:15:25.338130 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332316 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:15:25.338130 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332319 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:15:25.338641 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332321 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:15:25.338641 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332324 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:15:25.338641 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332327 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:15:25.338641 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332329 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:15:25.338641 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332332 2575 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:15:25.338641 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332334 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:15:25.338641 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332337 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:15:25.338641 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332340 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:15:25.338641 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332343 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:15:25.338641 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332345 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:15:25.338641 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332348 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:15:25.338641 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332350 2575 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:15:25.338641 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332353 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:15:25.338641 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332355 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:15:25.338641 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332358 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:15:25.338641 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332360 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:15:25.338641 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332363 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:15:25.338641 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332365 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:15:25.338641 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332368 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:15:25.338641 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332372 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:15:25.339157 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332375 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:15:25.339157 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332377 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:15:25.339157 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332379 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:15:25.339157 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332382 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:15:25.339157 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332384 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:15:25.339157 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332390 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:15:25.339157 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332392 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:15:25.339157 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332395 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:15:25.339157 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332398 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:15:25.339157 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332400 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:15:25.339157 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332403 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:15:25.339157 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332405 2575 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:15:25.339157 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332408 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:15:25.339157 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332411 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:15:25.339157 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332414 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:15:25.339157 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332416 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:15:25.339157 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332419 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:15:25.339157 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332421 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:15:25.339157 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332424 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:15:25.339157 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332427 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:15:25.339649 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332435 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:15:25.339649 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332438 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:15:25.339649 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332440 2575 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:15:25.339649 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332443 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:15:25.339649 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332447 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:15:25.339649 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332450 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:15:25.339649 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332454 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:15:25.339649 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332456 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:15:25.339649 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332459 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:15:25.339649 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332463 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:15:25.339649 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332466 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 14:15:25.339649 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332470 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 14:15:25.339649 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332472 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 14:15:25.339649 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332475 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 14:15:25.339649 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332477 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 14:15:25.339649 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332480 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 14:15:25.339649 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332483 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 14:15:25.339649 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332487 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 14:15:25.339649 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332489 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 14:15:25.340123 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332492 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 14:15:25.340123 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332494 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 14:15:25.340123 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332497 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 14:15:25.340123 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.332500 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 14:15:25.340123 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.332506 2575 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 14:15:25.340123 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.338429 2575 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 14:15:25.340123 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.338443 2575 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 14:15:25.340123 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338488 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 14:15:25.340123 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338494 2575 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 14:15:25.340123 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338497 2575 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 14:15:25.340123 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338500 2575 feature_gate.go:328] unrecognized feature gate: 
AzureClusterHostedDNSInstall Apr 22 14:15:25.340123 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338504 2575 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 14:15:25.340123 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338515 2575 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 14:15:25.340123 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338519 2575 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 14:15:25.340123 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338521 2575 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 14:15:25.340499 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338524 2575 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 14:15:25.340499 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338527 2575 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 14:15:25.340499 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338529 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 14:15:25.340499 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338532 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 14:15:25.340499 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338534 2575 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 14:15:25.340499 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338537 2575 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 14:15:25.340499 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338539 2575 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 14:15:25.340499 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338542 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 14:15:25.340499 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338544 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 14:15:25.340499 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338547 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 14:15:25.340499 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338551 2575 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 14:15:25.340499 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338556 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 14:15:25.340499 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338560 2575 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 14:15:25.340499 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338563 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 14:15:25.340499 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338566 2575 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 14:15:25.340499 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338568 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 14:15:25.340499 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338571 2575 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 14:15:25.340499 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338574 2575 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 14:15:25.340499 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338576 2575 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 14:15:25.341039 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338579 2575 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 14:15:25.341039 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338582 2575 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 14:15:25.341039 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338584 2575 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 14:15:25.341039 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338587 2575 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 14:15:25.341039 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338591 2575 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 14:15:25.341039 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338594 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 14:15:25.341039 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338597 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 14:15:25.341039 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338600 2575 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 14:15:25.341039 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338602 2575 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 14:15:25.341039 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338605 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 14:15:25.341039 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338608 2575 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 14:15:25.341039 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338611 2575 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 14:15:25.341039 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338613 2575 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 14:15:25.341039 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338616 2575 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 14:15:25.341039 ip-10-0-130-98 kubenswrapper[2575]: 
W0422 14:15:25.338619 2575 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 14:15:25.341039 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338621 2575 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 14:15:25.341039 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338624 2575 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 14:15:25.341039 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338626 2575 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 14:15:25.341039 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338629 2575 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 14:15:25.341500 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338631 2575 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 14:15:25.341500 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338634 2575 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 14:15:25.341500 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338636 2575 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 14:15:25.341500 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338639 2575 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 14:15:25.341500 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338642 2575 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 14:15:25.341500 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338645 2575 feature_gate.go:328] unrecognized feature gate: Example Apr 22 14:15:25.341500 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338648 2575 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 14:15:25.341500 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338650 2575 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 14:15:25.341500 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338653 2575 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 14:15:25.341500 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338655 2575 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 14:15:25.341500 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338658 2575 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 14:15:25.341500 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338660 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 14:15:25.341500 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338663 2575 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 14:15:25.341500 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338666 2575 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 14:15:25.341500 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338668 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 14:15:25.341500 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338671 2575 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 14:15:25.341500 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338673 2575 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 14:15:25.341500 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338676 2575 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 14:15:25.341500 ip-10-0-130-98 
kubenswrapper[2575]: W0422 14:15:25.338679 2575 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 14:15:25.341500 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338682 2575 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 14:15:25.342051 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338685 2575 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 14:15:25.342051 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338688 2575 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 14:15:25.342051 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338690 2575 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 14:15:25.342051 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338693 2575 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 14:15:25.342051 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338695 2575 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 14:15:25.342051 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338699 2575 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 14:15:25.342051 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338703 2575 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 14:15:25.342051 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338705 2575 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 14:15:25.342051 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338708 2575 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 14:15:25.342051 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338711 2575 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 14:15:25.342051 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338713 2575 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 14:15:25.342051 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338716 2575 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 14:15:25.342051 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338718 2575 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 14:15:25.342051 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338721 2575 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 14:15:25.342051 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338723 2575 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 14:15:25.342051 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338726 2575 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 14:15:25.342051 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338728 2575 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 14:15:25.342051 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338731 2575 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 14:15:25.342051 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338733 2575 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 14:15:25.342051 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:25.338736 2575 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 14:15:25.342576 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.338741 2575 
feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
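The summary map above is what actually took effect: it holds only the gates this kubelet binary registers, while every OpenShift-level gate name in the preceding warnings was rejected by the gate registry and merely logged. A minimal sketch of that mechanism against the upstream k8s.io/component-base/featuregate API, assuming its usual semantics; the gates registered here are an illustrative subset, not this build's real registry:

package main

import (
	"fmt"

	"k8s.io/component-base/featuregate"
)

func main() {
	// Register the gates this binary knows about (illustrative subset).
	gates := featuregate.NewFeatureGate()
	if err := gates.Add(map[featuregate.Feature]featuregate.FeatureSpec{
		"ImageVolume": {Default: false, PreRelease: featuregate.Beta},
		"KMSv1":       {Default: false, PreRelease: featuregate.Deprecated},
	}); err != nil {
		panic(err)
	}

	// Apply a config-file-style map that also carries names this binary
	// never registered, as the cluster-wide OpenShift gate list does here.
	err := gates.SetFromMap(map[string]bool{
		"ImageVolume": true,
		"GatewayAPI":  true, // unknown to this registry
	})
	fmt.Println(err) // e.g. "unrecognized feature gate: GatewayAPI"
	fmt.Println("ImageVolume:", gates.Enabled("ImageVolume"))
}

In this startup the unknown names are only warned about, and initialization proceeds with the map shown above.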
Apr 22 14:15:25.344767 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.339698 2575 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 14:15:25.345353 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.341592 2575 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 14:15:25.345353 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.342341 2575 server.go:1019] "Starting client certificate rotation"
Apr 22 14:15:25.345353 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.342447 2575
certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 14:15:25.345353 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.343724 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 14:15:25.366385 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.366368 2575 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 14:15:25.368828 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.368807 2575 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 14:15:25.385368 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.385349 2575 log.go:25] "Validated CRI v1 runtime API" Apr 22 14:15:25.390045 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.390031 2575 log.go:25] "Validated CRI v1 image API" Apr 22 14:15:25.391216 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.391200 2575 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 14:15:25.394459 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.394439 2575 fs.go:135] Filesystem UUIDs: map[4fa20cfb-19e2-4992-8ed5-553e14ac8667:/dev/nvme0n1p3 6d31ae49-f004-486e-ab3c-2916cf823c42:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2] Apr 22 14:15:25.394538 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.394458 2575 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 22 14:15:25.395003 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.394985 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 14:15:25.400922 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.400705 2575 manager.go:217] Machine: {Timestamp:2026-04-22 14:15:25.398644009 +0000 UTC m=+0.330022902 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3097542 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec233065816f27c0e6b50a0e329567de SystemUUID:ec233065-816f-27c0-e6b5-0a0e329567de BootID:abe46ca6-2a84-4766-9366-de86fb776645 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 
Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:9a:82:65:b4:9d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:9a:82:65:b4:9d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:02:e7:f3:12:48:3a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 22 14:15:25.400922 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.400911 2575 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Apr 22 14:15:25.401052 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.400982 2575 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 22 14:15:25.401909 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.401888 2575 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 22 14:15:25.402034 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.401912 2575 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"ip-10-0-130-98.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 14:15:25.402075 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.402043 2575 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 14:15:25.402075 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.402052 2575 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 14:15:25.402075 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.402065 2575 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 14:15:25.402697 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.402687 2575 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 14:15:25.403476 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.403465 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 22 14:15:25.403574 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.403566 2575 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 14:15:25.405501 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.405491 2575 kubelet.go:491] "Attempting to sync node with API server" Apr 22 14:15:25.405533 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.405511 2575 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 14:15:25.405533 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.405523 2575 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 14:15:25.405533 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.405531 2575 kubelet.go:397] "Adding apiserver pod source" Apr 22 14:15:25.405628 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.405540 2575 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 14:15:25.406508 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.406486 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 14:15:25.406549 
ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.406522 2575 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 14:15:25.408969 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.408950 2575 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 14:15:25.410185 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.410170 2575 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 22 14:15:25.411863 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.411851 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 22 14:15:25.411901 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.411870 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 22 14:15:25.411901 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.411876 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 22 14:15:25.411901 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.411881 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 22 14:15:25.411901 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.411887 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 22 14:15:25.411901 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.411894 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 22 14:15:25.411901 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.411901 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 22 14:15:25.412060 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.411910 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 22 14:15:25.412060 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.411919 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 22 14:15:25.412060 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.411926 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 22 14:15:25.412060 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.411935 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 22 14:15:25.412060 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.411943 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 22 14:15:25.412593 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.412581 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 22 14:15:25.412593 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.412593 2575 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 22 14:15:25.414300 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:25.414274 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-98.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
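The system:anonymous failures here are the expected bootstrap window: the kubelet is still exchanging its bootstrap credentials for a client certificate (csr-9f5gb is approved and issued a few entries below), so list/watch calls made before the certificate lands carry no usable identity. A minimal sketch of that bootstrap step with client-go, posting a CSR for the kubernetes.io/kube-apiserver-client-kubelet signer; the kubeconfig path and request bytes are placeholders, not values taken from this node:

package main

import (
	"context"

	certv1 "k8s.io/api/certificates/v1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumed bootstrap kubeconfig location; on a real node this is whatever
	// --bootstrap-kubeconfig points at.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/var/lib/kubelet/bootstrap-kubeconfig")
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	var csrPEM []byte // PEM-encoded x509 CertificateRequest for system:node:<name>, elided here

	csr := &certv1.CertificateSigningRequest{
		ObjectMeta: metav1.ObjectMeta{GenerateName: "csr-"},
		Spec: certv1.CertificateSigningRequestSpec{
			Request:    csrPEM,
			SignerName: "kubernetes.io/kube-apiserver-client-kubelet",
			Usages:     []certv1.KeyUsage{certv1.UsageDigitalSignature, certv1.UsageClientAuth},
		},
	}
	created, err := client.CertificatesV1().CertificateSigningRequests().Create(context.TODO(), csr, metav1.CreateOptions{})
	if err != nil {
		panic(err)
	}
	// A real bootstrap then watches the object until Status.Certificate is populated.
	_ = created
}

Apr 22 14:15:25.414384 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:25.414280 2575 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope"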
logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 14:15:25.414497 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.414466 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9f5gb" Apr 22 14:15:25.416014 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.415999 2575 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-98.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 14:15:25.416194 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.416184 2575 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 14:15:25.416231 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.416219 2575 server.go:1295] "Started kubelet" Apr 22 14:15:25.416331 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.416302 2575 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 14:15:25.416456 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.416411 2575 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 14:15:25.416508 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.416474 2575 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 14:15:25.417057 ip-10-0-130-98 systemd[1]: Started Kubernetes Kubelet. Apr 22 14:15:25.418587 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.418573 2575 server.go:317] "Adding debug handlers to kubelet server" Apr 22 14:15:25.420002 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.419984 2575 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 22 14:15:25.421318 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.421300 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9f5gb" Apr 22 14:15:25.425093 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:25.425074 2575 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 14:15:25.425702 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.425686 2575 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 14:15:25.426067 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.426050 2575 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 14:15:25.428677 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.428596 2575 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 14:15:25.428816 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:25.428760 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-98.ec2.internal\" not found"
Apr 22 14:15:25.428886 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:25.425904 2575 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-130-98.ec2.internal.18a8b36b106e41c6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-130-98.ec2.internal,UID:ip-10-0-130-98.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-130-98.ec2.internal,},FirstTimestamp:2026-04-22 14:15:25.416194502 +0000 UTC m=+0.347573395,LastTimestamp:2026-04-22 14:15:25.416194502 +0000 UTC m=+0.347573395,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-130-98.ec2.internal,}"
Apr 22 14:15:25.429028 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.428604 2575 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 14:15:25.429028 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.429023 2575 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 14:15:25.429150 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.429058 2575 factory.go:55] Registering systemd factory
Apr 22 14:15:25.429150 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.429081 2575 factory.go:223] Registration of the systemd container factory successfully
Apr 22 14:15:25.429150 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.429100 2575 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 14:15:25.429150 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.429110 2575 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 14:15:25.429430 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.429410 2575 factory.go:153] Registering CRI-O factory
Apr 22 14:15:25.429430 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.429430 2575 factory.go:223] Registration of the crio container factory successfully
Apr 22 14:15:25.429572 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.429502 2575 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 14:15:25.429572 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.429532 2575 factory.go:103] Registering Raw factory
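The containerd factory failure above is benign on this CRI-O node: cAdvisor probes the default containerd socket and gets ENOENT, then falls back to the other factories. A trivial stdlib sketch of that probe:

package main

import (
	"fmt"
	"net"
)

func main() {
	conn, err := net.Dial("unix", "/run/containerd/containerd.sock")
	if err != nil {
		// On this node: dial unix /run/containerd/containerd.sock: connect: no such file or directory
		fmt.Println(err)
		return
	}
	conn.Close()
}

Apr 22 14:15:25.429572 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.429548 2575 manager.go:1196] Started watching for new ooms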
in manager Apr 22 14:15:25.429940 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.429928 2575 manager.go:319] Starting recovery of all containers Apr 22 14:15:25.436841 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.436822 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:25.439190 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:25.439152 2575 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-98.ec2.internal\" not found" node="ip-10-0-130-98.ec2.internal" Apr 22 14:15:25.440248 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.440232 2575 manager.go:324] Recovery completed Apr 22 14:15:25.442063 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:25.442041 2575 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory Apr 22 14:15:25.444736 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.444725 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:25.447076 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.447061 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:25.447159 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.447090 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:25.447159 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.447105 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:25.447581 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.447568 2575 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 22 14:15:25.447616 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.447581 2575 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 22 14:15:25.447616 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.447595 2575 state_mem.go:36] "Initialized new in-memory state store" Apr 22 14:15:25.449715 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.449704 2575 policy_none.go:49] "None policy: Start" Apr 22 14:15:25.449773 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.449718 2575 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 22 14:15:25.449773 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.449728 2575 state_mem.go:35] "Initializing new in-memory state store" Apr 22 14:15:25.502676 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.494284 2575 manager.go:341] "Starting Device Plugin manager" Apr 22 14:15:25.502676 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:25.494333 2575 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 22 14:15:25.502676 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.494341 2575 server.go:85] "Starting device plugin registration server" Apr 22 14:15:25.502676 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.494553 2575 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 22 14:15:25.502676 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.494565 2575 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 22 
14:15:25.502676 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.494690 2575 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 22 14:15:25.502676 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.494785 2575 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 22 14:15:25.502676 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.494794 2575 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 22 14:15:25.502676 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:25.495150 2575 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 22 14:15:25.502676 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:25.495186 2575 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-98.ec2.internal\" not found" Apr 22 14:15:25.555271 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.555246 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 22 14:15:25.556389 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.556374 2575 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 22 14:15:25.556466 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.556400 2575 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 22 14:15:25.556466 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.556416 2575 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 22 14:15:25.556466 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.556422 2575 kubelet.go:2451] "Starting kubelet main sync loop" Apr 22 14:15:25.556466 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:25.556456 2575 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 22 14:15:25.559618 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.559600 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:25.595470 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.595416 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:25.596207 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.596193 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:25.596302 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.596223 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:25.596302 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.596237 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:25.596302 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.596263 2575 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-98.ec2.internal" Apr 22 14:15:25.604376 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.604362 2575 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-98.ec2.internal" Apr 22 14:15:25.604438 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:25.604381 2575 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node 
\"ip-10-0-130-98.ec2.internal\": node \"ip-10-0-130-98.ec2.internal\" not found" Apr 22 14:15:25.619656 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:25.619636 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-98.ec2.internal\" not found" Apr 22 14:15:25.657293 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.657262 2575 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-98.ec2.internal"] Apr 22 14:15:25.657362 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.657329 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:25.658110 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.658095 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:25.658180 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.658121 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:25.658180 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.658131 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:25.660416 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.660405 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:25.660543 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.660528 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal" Apr 22 14:15:25.660593 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.660558 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:25.661844 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.661829 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:25.661916 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.661856 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:25.661916 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.661870 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:25.661916 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.661902 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:25.662018 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.661921 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:25.662018 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.661930 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:25.664031 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.664017 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-98.ec2.internal" Apr 22 14:15:25.664103 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.664039 2575 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 22 14:15:25.665076 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.665062 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasSufficientMemory" Apr 22 14:15:25.665143 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.665090 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasNoDiskPressure" Apr 22 14:15:25.665143 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.665104 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeHasSufficientPID" Apr 22 14:15:25.691263 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:25.691242 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-98.ec2.internal\" not found" node="ip-10-0-130-98.ec2.internal" Apr 22 14:15:25.695323 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:25.695309 2575 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-98.ec2.internal\" not found" node="ip-10-0-130-98.ec2.internal" Apr 22 14:15:25.719726 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:25.719707 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-98.ec2.internal\" not found" Apr 22 14:15:25.731055 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.731035 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c6554fb3d2aa7e65e70eece3f844528b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal\" (UID: \"c6554fb3d2aa7e65e70eece3f844528b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal" Apr 22 14:15:25.731116 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.731067 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6554fb3d2aa7e65e70eece3f844528b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal\" (UID: \"c6554fb3d2aa7e65e70eece3f844528b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal" Apr 22 14:15:25.731116 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.731084 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/148a376889eed68cdbc344a2d41f35d5-config\") pod \"kube-apiserver-proxy-ip-10-0-130-98.ec2.internal\" (UID: \"148a376889eed68cdbc344a2d41f35d5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-98.ec2.internal" Apr 22 14:15:25.820300 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:25.820279 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-98.ec2.internal\" not found" Apr 22 14:15:25.831788 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.831769 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c6554fb3d2aa7e65e70eece3f844528b-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal\" (UID: \"c6554fb3d2aa7e65e70eece3f844528b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal" Apr 22 14:15:25.831845 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.831791 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6554fb3d2aa7e65e70eece3f844528b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal\" (UID: \"c6554fb3d2aa7e65e70eece3f844528b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal" Apr 22 14:15:25.831845 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.831808 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/148a376889eed68cdbc344a2d41f35d5-config\") pod \"kube-apiserver-proxy-ip-10-0-130-98.ec2.internal\" (UID: \"148a376889eed68cdbc344a2d41f35d5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-98.ec2.internal" Apr 22 14:15:25.831921 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.831850 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/148a376889eed68cdbc344a2d41f35d5-config\") pod \"kube-apiserver-proxy-ip-10-0-130-98.ec2.internal\" (UID: \"148a376889eed68cdbc344a2d41f35d5\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-98.ec2.internal" Apr 22 14:15:25.831921 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.831858 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/c6554fb3d2aa7e65e70eece3f844528b-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal\" (UID: \"c6554fb3d2aa7e65e70eece3f844528b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal" Apr 22 14:15:25.831921 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.831871 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c6554fb3d2aa7e65e70eece3f844528b-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal\" (UID: \"c6554fb3d2aa7e65e70eece3f844528b\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal" Apr 22 14:15:25.921132 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:25.921111 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-98.ec2.internal\" not found" Apr 22 14:15:25.993716 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.993689 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal" Apr 22 14:15:25.998272 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:25.998254 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-98.ec2.internal" Apr 22 14:15:26.022053 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:26.022027 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-98.ec2.internal\" not found" Apr 22 14:15:26.122661 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:26.122625 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-98.ec2.internal\" not found" Apr 22 14:15:26.223190 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:26.223138 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-98.ec2.internal\" not found" Apr 22 14:15:26.323826 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:26.323802 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-98.ec2.internal\" not found" Apr 22 14:15:26.343368 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:26.343347 2575 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 22 14:15:26.343475 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:26.343460 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 14:15:26.343536 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:26.343497 2575 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 22 14:15:26.423508 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:26.423477 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 14:10:25 +0000 UTC" deadline="2028-02-08 07:28:24.777894888 +0000 UTC" Apr 22 14:15:26.423508 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:26.423504 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15761h12m58.354392841s" Apr 22 14:15:26.424436 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:26.424423 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-98.ec2.internal\" not found" Apr 22 14:15:26.426004 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:26.425985 2575 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 22 14:15:26.436169 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:26.436152 2575 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 22 14:15:26.463650 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:26.463632 2575 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-dzx8t" Apr 22 14:15:26.471538 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:26.471516 2575 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-dzx8t" Apr 22 14:15:26.511762 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:26.511712 2575 
manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc6554fb3d2aa7e65e70eece3f844528b.slice/crio-b5ce272df988c626c1b1f41df19a8f4c0a2b553dab24be4d0cc42cc9260709a5 WatchSource:0}: Error finding container b5ce272df988c626c1b1f41df19a8f4c0a2b553dab24be4d0cc42cc9260709a5: Status 404 returned error can't find the container with id b5ce272df988c626c1b1f41df19a8f4c0a2b553dab24be4d0cc42cc9260709a5 Apr 22 14:15:26.512299 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:26.512278 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod148a376889eed68cdbc344a2d41f35d5.slice/crio-325663d71be4aa497dea8d0f1588d2e0bfc6b8d67ec835a4ab07f5e662389c33 WatchSource:0}: Error finding container 325663d71be4aa497dea8d0f1588d2e0bfc6b8d67ec835a4ab07f5e662389c33: Status 404 returned error can't find the container with id 325663d71be4aa497dea8d0f1588d2e0bfc6b8d67ec835a4ab07f5e662389c33 Apr 22 14:15:26.515700 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:26.515685 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:15:26.525374 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:26.525351 2575 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-98.ec2.internal\" not found" Apr 22 14:15:26.559475 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:26.559442 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-98.ec2.internal" event={"ID":"148a376889eed68cdbc344a2d41f35d5","Type":"ContainerStarted","Data":"325663d71be4aa497dea8d0f1588d2e0bfc6b8d67ec835a4ab07f5e662389c33"} Apr 22 14:15:26.560336 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:26.560317 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal" event={"ID":"c6554fb3d2aa7e65e70eece3f844528b","Type":"ContainerStarted","Data":"b5ce272df988c626c1b1f41df19a8f4c0a2b553dab24be4d0cc42cc9260709a5"} Apr 22 14:15:26.597094 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:26.597073 2575 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:26.626418 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:26.626398 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-98.ec2.internal" Apr 22 14:15:26.636626 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:26.636603 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 14:15:26.638044 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:26.638030 2575 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal" Apr 22 14:15:26.647250 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:26.647231 2575 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 14:15:26.661355 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:26.661335 2575 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:27.407730 ip-10-0-130-98 kubenswrapper[2575]: I0422 
14:15:27.407701 2575 apiserver.go:52] "Watching apiserver" Apr 22 14:15:27.416120 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.416100 2575 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 14:15:27.417929 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.417898 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-130-98.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal","openshift-multus/multus-additional-cni-plugins-tn9p9","openshift-multus/network-metrics-daemon-swv2n","openshift-network-diagnostics/network-check-target-zvxdk","kube-system/konnectivity-agent-vj7vs","openshift-cluster-node-tuning-operator/tuned-2b68j","openshift-dns/node-resolver-9x8pf","openshift-image-registry/node-ca-97p4n","openshift-multus/multus-24sl8","openshift-network-operator/iptables-alerter-tc7f7","openshift-ovn-kubernetes/ovnkube-node-j58wd"] Apr 22 14:15:27.420266 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.420244 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vj7vs" Apr 22 14:15:27.422418 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.422397 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk" Apr 22 14:15:27.424848 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.424827 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 14:15:27.424956 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.424902 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 14:15:27.425071 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.424993 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8rdvj\"" Apr 22 14:15:27.425542 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.425446 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tn9p9" Apr 22 14:15:27.427269 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.427244 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 14:15:27.427367 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.427331 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 14:15:27.427367 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.427339 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 14:15:27.427475 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.427365 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-74vdz\"" Apr 22 14:15:27.427762 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.427729 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-swv2n" Apr 22 14:15:27.427852 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:27.427821 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swv2n" podUID="1faf2ada-1177-442f-9ee9-4ecd9697e349" Apr 22 14:15:27.429815 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.429791 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 14:15:27.429902 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.429840 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 14:15:27.430286 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.430270 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 14:15:27.431021 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.430993 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 14:15:27.431190 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.431173 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 14:15:27.431595 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.431575 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-4mvbk\"" Apr 22 14:15:27.432114 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.432097 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zvxdk" Apr 22 14:15:27.432215 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:27.432164 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zvxdk" podUID="25952960-59a7-4c77-9fc4-71e746c78539" Apr 22 14:15:27.432215 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.432199 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.434469 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.434452 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9x8pf" Apr 22 14:15:27.436683 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.436601 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 14:15:27.436788 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.436683 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-97p4n" Apr 22 14:15:27.436788 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.436730 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 14:15:27.436952 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.436794 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-8tqhs\"" Apr 22 14:15:27.437203 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.437170 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 14:15:27.437817 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.437798 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 14:15:27.438029 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.438011 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-972hd\"" Apr 22 14:15:27.439053 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.439027 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.439688 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.439552 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 14:15:27.439817 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.439801 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 14:15:27.440193 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.440171 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 14:15:27.440424 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.440405 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-fkrmd\"" Apr 22 14:15:27.440529 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.440508 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jknsc\" (UniqueName: \"kubernetes.io/projected/1faf2ada-1177-442f-9ee9-4ecd9697e349-kube-api-access-jknsc\") pod \"network-metrics-daemon-swv2n\" (UID: \"1faf2ada-1177-442f-9ee9-4ecd9697e349\") " pod="openshift-multus/network-metrics-daemon-swv2n" Apr 22 14:15:27.440598 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.440556 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5smh\" (UniqueName: \"kubernetes.io/projected/25952960-59a7-4c77-9fc4-71e746c78539-kube-api-access-p5smh\") pod \"network-check-target-zvxdk\" (UID: \"25952960-59a7-4c77-9fc4-71e746c78539\") " pod="openshift-network-diagnostics/network-check-target-zvxdk" Apr 22 14:15:27.440598 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.440586 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/fc6918bb-f78e-49c2-a990-f709907bd409-agent-certs\") pod \"konnectivity-agent-vj7vs\" (UID: \"fc6918bb-f78e-49c2-a990-f709907bd409\") " pod="kube-system/konnectivity-agent-vj7vs" Apr 22 
14:15:27.440698 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.440609 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f8287b99-45a5-4575-bcba-3e21ee5f9ffc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nwgkk\" (UID: \"f8287b99-45a5-4575-bcba-3e21ee5f9ffc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk" Apr 22 14:15:27.440698 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.440632 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/766b2141-267e-41ed-bc88-fc000f360c08-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tn9p9\" (UID: \"766b2141-267e-41ed-bc88-fc000f360c08\") " pod="openshift-multus/multus-additional-cni-plugins-tn9p9" Apr 22 14:15:27.440698 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.440656 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/766b2141-267e-41ed-bc88-fc000f360c08-cnibin\") pod \"multus-additional-cni-plugins-tn9p9\" (UID: \"766b2141-267e-41ed-bc88-fc000f360c08\") " pod="openshift-multus/multus-additional-cni-plugins-tn9p9" Apr 22 14:15:27.440866 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.440725 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/766b2141-267e-41ed-bc88-fc000f360c08-cni-binary-copy\") pod \"multus-additional-cni-plugins-tn9p9\" (UID: \"766b2141-267e-41ed-bc88-fc000f360c08\") " pod="openshift-multus/multus-additional-cni-plugins-tn9p9" Apr 22 14:15:27.440866 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.440791 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/766b2141-267e-41ed-bc88-fc000f360c08-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tn9p9\" (UID: \"766b2141-267e-41ed-bc88-fc000f360c08\") " pod="openshift-multus/multus-additional-cni-plugins-tn9p9" Apr 22 14:15:27.440866 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.440821 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/fc6918bb-f78e-49c2-a990-f709907bd409-konnectivity-ca\") pod \"konnectivity-agent-vj7vs\" (UID: \"fc6918bb-f78e-49c2-a990-f709907bd409\") " pod="kube-system/konnectivity-agent-vj7vs" Apr 22 14:15:27.440866 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.440844 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f8287b99-45a5-4575-bcba-3e21ee5f9ffc-socket-dir\") pod \"aws-ebs-csi-driver-node-nwgkk\" (UID: \"f8287b99-45a5-4575-bcba-3e21ee5f9ffc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk" Apr 22 14:15:27.441052 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.440868 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f8287b99-45a5-4575-bcba-3e21ee5f9ffc-device-dir\") pod \"aws-ebs-csi-driver-node-nwgkk\" (UID: \"f8287b99-45a5-4575-bcba-3e21ee5f9ffc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk" Apr 22 
14:15:27.441052 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.440889 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f8287b99-45a5-4575-bcba-3e21ee5f9ffc-etc-selinux\") pod \"aws-ebs-csi-driver-node-nwgkk\" (UID: \"f8287b99-45a5-4575-bcba-3e21ee5f9ffc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk" Apr 22 14:15:27.441052 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.440912 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/766b2141-267e-41ed-bc88-fc000f360c08-system-cni-dir\") pod \"multus-additional-cni-plugins-tn9p9\" (UID: \"766b2141-267e-41ed-bc88-fc000f360c08\") " pod="openshift-multus/multus-additional-cni-plugins-tn9p9" Apr 22 14:15:27.441052 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.440933 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/766b2141-267e-41ed-bc88-fc000f360c08-os-release\") pod \"multus-additional-cni-plugins-tn9p9\" (UID: \"766b2141-267e-41ed-bc88-fc000f360c08\") " pod="openshift-multus/multus-additional-cni-plugins-tn9p9" Apr 22 14:15:27.441052 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.440981 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/766b2141-267e-41ed-bc88-fc000f360c08-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tn9p9\" (UID: \"766b2141-267e-41ed-bc88-fc000f360c08\") " pod="openshift-multus/multus-additional-cni-plugins-tn9p9" Apr 22 14:15:27.441052 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.441025 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f8287b99-45a5-4575-bcba-3e21ee5f9ffc-registration-dir\") pod \"aws-ebs-csi-driver-node-nwgkk\" (UID: \"f8287b99-45a5-4575-bcba-3e21ee5f9ffc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk" Apr 22 14:15:27.441396 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.441057 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f8287b99-45a5-4575-bcba-3e21ee5f9ffc-sys-fs\") pod \"aws-ebs-csi-driver-node-nwgkk\" (UID: \"f8287b99-45a5-4575-bcba-3e21ee5f9ffc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk" Apr 22 14:15:27.441396 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.441088 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwzc9\" (UniqueName: \"kubernetes.io/projected/f8287b99-45a5-4575-bcba-3e21ee5f9ffc-kube-api-access-kwzc9\") pod \"aws-ebs-csi-driver-node-nwgkk\" (UID: \"f8287b99-45a5-4575-bcba-3e21ee5f9ffc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk" Apr 22 14:15:27.441396 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.441117 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wc8t\" (UniqueName: \"kubernetes.io/projected/766b2141-267e-41ed-bc88-fc000f360c08-kube-api-access-4wc8t\") pod \"multus-additional-cni-plugins-tn9p9\" (UID: \"766b2141-267e-41ed-bc88-fc000f360c08\") " 
pod="openshift-multus/multus-additional-cni-plugins-tn9p9" Apr 22 14:15:27.441396 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.441139 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs\") pod \"network-metrics-daemon-swv2n\" (UID: \"1faf2ada-1177-442f-9ee9-4ecd9697e349\") " pod="openshift-multus/network-metrics-daemon-swv2n" Apr 22 14:15:27.441396 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.441243 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-tc7f7" Apr 22 14:15:27.441704 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.441687 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 14:15:27.441825 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.441808 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-qnq4b\"" Apr 22 14:15:27.443790 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.443771 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.444572 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.444554 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-gn89g\"" Apr 22 14:15:27.444811 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.444783 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 14:15:27.444811 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.444785 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 14:15:27.444951 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.444831 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 14:15:27.447251 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.447231 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 14:15:27.448342 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.448315 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 14:15:27.448342 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.448333 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-dsq29\"" Apr 22 14:15:27.448899 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.448880 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 14:15:27.449432 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.449414 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 14:15:27.449529 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.449474 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 14:15:27.449645 ip-10-0-130-98 
kubenswrapper[2575]: I0422 14:15:27.449629 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 14:15:27.472462 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.472438 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 14:10:26 +0000 UTC" deadline="2027-12-09 09:51:08.359283204 +0000 UTC" Apr 22 14:15:27.472462 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.472462 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14299h35m40.886824621s" Apr 22 14:15:27.523913 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.523894 2575 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:27.529493 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.529472 2575 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 14:15:27.541452 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.541420 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-multus-cni-dir\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.541452 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.541445 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-host-var-lib-kubelet\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.541603 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.541463 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c9b4b54a-3a19-409c-818f-f465ef373376-multus-daemon-config\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.541603 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.541479 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-node-log\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.541603 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.541530 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-log-socket\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.541603 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.541559 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-host-run-ovn-kubernetes\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.541825 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.541604 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-etc-modprobe-d\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.541825 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.541630 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-etc-sysconfig\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.541825 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.541649 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/766b2141-267e-41ed-bc88-fc000f360c08-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tn9p9\" (UID: \"766b2141-267e-41ed-bc88-fc000f360c08\") " pod="openshift-multus/multus-additional-cni-plugins-tn9p9" Apr 22 14:15:27.541825 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.541666 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-lib-modules\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.541825 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.541686 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-host-var-lib-cni-bin\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.541825 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.541700 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-multus-conf-dir\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.541825 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.541714 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8c828a22-de6c-4a15-a273-d749ea26c601-iptables-alerter-script\") pod \"iptables-alerter-tc7f7\" (UID: \"8c828a22-de6c-4a15-a273-d749ea26c601\") " pod="openshift-network-operator/iptables-alerter-tc7f7" Apr 22 14:15:27.541825 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.541762 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-host-run-netns\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.541825 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.541773 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/766b2141-267e-41ed-bc88-fc000f360c08-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tn9p9\" (UID: \"766b2141-267e-41ed-bc88-fc000f360c08\") " pod="openshift-multus/multus-additional-cni-plugins-tn9p9" Apr 22 14:15:27.541825 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.541801 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-ovnkube-config\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.541825 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.541820 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-sys\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.542324 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.541840 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f8287b99-45a5-4575-bcba-3e21ee5f9ffc-device-dir\") pod \"aws-ebs-csi-driver-node-nwgkk\" (UID: \"f8287b99-45a5-4575-bcba-3e21ee5f9ffc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk" Apr 22 14:15:27.542324 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.541869 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/766b2141-267e-41ed-bc88-fc000f360c08-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tn9p9\" (UID: \"766b2141-267e-41ed-bc88-fc000f360c08\") " pod="openshift-multus/multus-additional-cni-plugins-tn9p9" Apr 22 14:15:27.542324 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.541884 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/f8287b99-45a5-4575-bcba-3e21ee5f9ffc-device-dir\") pod \"aws-ebs-csi-driver-node-nwgkk\" (UID: \"f8287b99-45a5-4575-bcba-3e21ee5f9ffc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk" Apr 22 14:15:27.542324 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.541892 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs\") pod \"network-metrics-daemon-swv2n\" (UID: \"1faf2ada-1177-442f-9ee9-4ecd9697e349\") " pod="openshift-multus/network-metrics-daemon-swv2n" Apr 22 14:15:27.542324 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.541909 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-host-run-k8s-cni-cncf-io\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.542324 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.541924 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-etc-kubernetes\") pod \"multus-24sl8\" (UID: 
\"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.542324 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.541937 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-host-cni-netd\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.542324 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.541954 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f8287b99-45a5-4575-bcba-3e21ee5f9ffc-sys-fs\") pod \"aws-ebs-csi-driver-node-nwgkk\" (UID: \"f8287b99-45a5-4575-bcba-3e21ee5f9ffc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk" Apr 22 14:15:27.542324 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.541978 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-host-run-multus-certs\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.542324 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:27.541994 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:27.542324 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542013 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qpzh\" (UniqueName: \"kubernetes.io/projected/c9b4b54a-3a19-409c-818f-f465ef373376-kube-api-access-9qpzh\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.542324 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542048 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/f8287b99-45a5-4575-bcba-3e21ee5f9ffc-sys-fs\") pod \"aws-ebs-csi-driver-node-nwgkk\" (UID: \"f8287b99-45a5-4575-bcba-3e21ee5f9ffc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk" Apr 22 14:15:27.542324 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:27.542087 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs podName:1faf2ada-1177-442f-9ee9-4ecd9697e349 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:28.042065272 +0000 UTC m=+2.973444157 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs") pod "network-metrics-daemon-swv2n" (UID: "1faf2ada-1177-442f-9ee9-4ecd9697e349") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:27.542324 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542104 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-env-overrides\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.542324 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542124 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/fc6918bb-f78e-49c2-a990-f709907bd409-agent-certs\") pod \"konnectivity-agent-vj7vs\" (UID: \"fc6918bb-f78e-49c2-a990-f709907bd409\") " pod="kube-system/konnectivity-agent-vj7vs" Apr 22 14:15:27.542324 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542146 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-host-var-lib-cni-multus\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.543106 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542180 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-run-ovn\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.543106 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542246 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-ovn-node-metrics-cert\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.543106 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542276 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/766b2141-267e-41ed-bc88-fc000f360c08-cnibin\") pod \"multus-additional-cni-plugins-tn9p9\" (UID: \"766b2141-267e-41ed-bc88-fc000f360c08\") " pod="openshift-multus/multus-additional-cni-plugins-tn9p9" Apr 22 14:15:27.543106 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542300 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c9b4b54a-3a19-409c-818f-f465ef373376-cni-binary-copy\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.543106 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542323 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-run\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " 
pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.543106 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542326 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/766b2141-267e-41ed-bc88-fc000f360c08-cnibin\") pod \"multus-additional-cni-plugins-tn9p9\" (UID: \"766b2141-267e-41ed-bc88-fc000f360c08\") " pod="openshift-multus/multus-additional-cni-plugins-tn9p9" Apr 22 14:15:27.543106 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542365 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f8287b99-45a5-4575-bcba-3e21ee5f9ffc-etc-selinux\") pod \"aws-ebs-csi-driver-node-nwgkk\" (UID: \"f8287b99-45a5-4575-bcba-3e21ee5f9ffc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk" Apr 22 14:15:27.543106 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542373 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/766b2141-267e-41ed-bc88-fc000f360c08-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tn9p9\" (UID: \"766b2141-267e-41ed-bc88-fc000f360c08\") " pod="openshift-multus/multus-additional-cni-plugins-tn9p9" Apr 22 14:15:27.543106 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542392 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/766b2141-267e-41ed-bc88-fc000f360c08-os-release\") pod \"multus-additional-cni-plugins-tn9p9\" (UID: \"766b2141-267e-41ed-bc88-fc000f360c08\") " pod="openshift-multus/multus-additional-cni-plugins-tn9p9" Apr 22 14:15:27.543106 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542430 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-cnibin\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.543106 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542461 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/f8287b99-45a5-4575-bcba-3e21ee5f9ffc-etc-selinux\") pod \"aws-ebs-csi-driver-node-nwgkk\" (UID: \"f8287b99-45a5-4575-bcba-3e21ee5f9ffc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk" Apr 22 14:15:27.543106 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542465 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.543106 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542470 2575 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 14:15:27.543106 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542498 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/058163d3-0e8a-40f7-aaa3-382fc9d4f5d4-host\") pod \"node-ca-97p4n\" (UID: \"058163d3-0e8a-40f7-aaa3-382fc9d4f5d4\") " pod="openshift-image-registry/node-ca-97p4n" Apr 22 14:15:27.543106 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542519 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/766b2141-267e-41ed-bc88-fc000f360c08-os-release\") pod \"multus-additional-cni-plugins-tn9p9\" (UID: \"766b2141-267e-41ed-bc88-fc000f360c08\") " pod="openshift-multus/multus-additional-cni-plugins-tn9p9" Apr 22 14:15:27.543106 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542519 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pw6tp\" (UniqueName: \"kubernetes.io/projected/058163d3-0e8a-40f7-aaa3-382fc9d4f5d4-kube-api-access-pw6tp\") pod \"node-ca-97p4n\" (UID: \"058163d3-0e8a-40f7-aaa3-382fc9d4f5d4\") " pod="openshift-image-registry/node-ca-97p4n" Apr 22 14:15:27.543106 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542559 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f8287b99-45a5-4575-bcba-3e21ee5f9ffc-registration-dir\") pod \"aws-ebs-csi-driver-node-nwgkk\" (UID: \"f8287b99-45a5-4575-bcba-3e21ee5f9ffc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk" Apr 22 14:15:27.543921 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542579 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kwzc9\" (UniqueName: \"kubernetes.io/projected/f8287b99-45a5-4575-bcba-3e21ee5f9ffc-kube-api-access-kwzc9\") pod \"aws-ebs-csi-driver-node-nwgkk\" (UID: \"f8287b99-45a5-4575-bcba-3e21ee5f9ffc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk" Apr 22 14:15:27.543921 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542595 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8c828a22-de6c-4a15-a273-d749ea26c601-host-slash\") pod \"iptables-alerter-tc7f7\" (UID: \"8c828a22-de6c-4a15-a273-d749ea26c601\") " pod="openshift-network-operator/iptables-alerter-tc7f7" Apr 22 14:15:27.543921 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542640 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-host-slash\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.543921 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542645 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/f8287b99-45a5-4575-bcba-3e21ee5f9ffc-registration-dir\") pod \"aws-ebs-csi-driver-node-nwgkk\" (UID: \"f8287b99-45a5-4575-bcba-3e21ee5f9ffc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk" Apr 22 14:15:27.543921 ip-10-0-130-98 kubenswrapper[2575]: 
I0422 14:15:27.542681 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-ovnkube-script-lib\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.543921 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542714 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-system-cni-dir\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.543921 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542734 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-multus-socket-dir-parent\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.543921 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542772 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-etc-openvswitch\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.543921 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542802 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-host-cni-bin\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.543921 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542825 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-etc-sysctl-d\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.543921 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542864 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/766b2141-267e-41ed-bc88-fc000f360c08-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tn9p9\" (UID: \"766b2141-267e-41ed-bc88-fc000f360c08\") " pod="openshift-multus/multus-additional-cni-plugins-tn9p9" Apr 22 14:15:27.543921 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542892 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-etc-sysctl-conf\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.543921 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542936 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/1dfd7d57-a9b2-4910-82a6-1e9bf8576804-tmp-dir\") pod \"node-resolver-9x8pf\" (UID: \"1dfd7d57-a9b2-4910-82a6-1e9bf8576804\") " pod="openshift-dns/node-resolver-9x8pf" Apr 22 14:15:27.543921 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542965 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvgqk\" (UniqueName: \"kubernetes.io/projected/1dfd7d57-a9b2-4910-82a6-1e9bf8576804-kube-api-access-cvgqk\") pod \"node-resolver-9x8pf\" (UID: \"1dfd7d57-a9b2-4910-82a6-1e9bf8576804\") " pod="openshift-dns/node-resolver-9x8pf" Apr 22 14:15:27.543921 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.542994 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/fc6918bb-f78e-49c2-a990-f709907bd409-konnectivity-ca\") pod \"konnectivity-agent-vj7vs\" (UID: \"fc6918bb-f78e-49c2-a990-f709907bd409\") " pod="kube-system/konnectivity-agent-vj7vs" Apr 22 14:15:27.543921 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543033 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/766b2141-267e-41ed-bc88-fc000f360c08-system-cni-dir\") pod \"multus-additional-cni-plugins-tn9p9\" (UID: \"766b2141-267e-41ed-bc88-fc000f360c08\") " pod="openshift-multus/multus-additional-cni-plugins-tn9p9" Apr 22 14:15:27.544599 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543061 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jknsc\" (UniqueName: \"kubernetes.io/projected/1faf2ada-1177-442f-9ee9-4ecd9697e349-kube-api-access-jknsc\") pod \"network-metrics-daemon-swv2n\" (UID: \"1faf2ada-1177-442f-9ee9-4ecd9697e349\") " pod="openshift-multus/network-metrics-daemon-swv2n" Apr 22 14:15:27.544599 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543087 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-os-release\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.544599 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543110 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/766b2141-267e-41ed-bc88-fc000f360c08-system-cni-dir\") pod \"multus-additional-cni-plugins-tn9p9\" (UID: \"766b2141-267e-41ed-bc88-fc000f360c08\") " pod="openshift-multus/multus-additional-cni-plugins-tn9p9" Apr 22 14:15:27.544599 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543115 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z2fmt\" (UniqueName: \"kubernetes.io/projected/8c828a22-de6c-4a15-a273-d749ea26c601-kube-api-access-z2fmt\") pod \"iptables-alerter-tc7f7\" (UID: \"8c828a22-de6c-4a15-a273-d749ea26c601\") " pod="openshift-network-operator/iptables-alerter-tc7f7" Apr 22 14:15:27.544599 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543166 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-var-lib-openvswitch\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.544599 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543196 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-etc-systemd\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.544599 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543223 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4wc8t\" (UniqueName: \"kubernetes.io/projected/766b2141-267e-41ed-bc88-fc000f360c08-kube-api-access-4wc8t\") pod \"multus-additional-cni-plugins-tn9p9\" (UID: \"766b2141-267e-41ed-bc88-fc000f360c08\") " pod="openshift-multus/multus-additional-cni-plugins-tn9p9" Apr 22 14:15:27.544599 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543248 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-host-kubelet\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.544599 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543271 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-systemd-units\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.544599 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543296 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-run-openvswitch\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.544599 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543319 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-etc-kubernetes\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.544599 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543342 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-var-lib-kubelet\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.544599 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543325 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/766b2141-267e-41ed-bc88-fc000f360c08-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-tn9p9\" (UID: \"766b2141-267e-41ed-bc88-fc000f360c08\") " pod="openshift-multus/multus-additional-cni-plugins-tn9p9" Apr 22 14:15:27.544599 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543367 2575 
Apr 22 14:15:27.544599 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543406 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-tmp\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j"
Apr 22 14:15:27.544599 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543440 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f8287b99-45a5-4575-bcba-3e21ee5f9ffc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nwgkk\" (UID: \"f8287b99-45a5-4575-bcba-3e21ee5f9ffc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk"
Apr 22 14:15:27.545303 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543466 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-host-run-netns\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8"
Apr 22 14:15:27.545303 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543497 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f8287b99-45a5-4575-bcba-3e21ee5f9ffc-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nwgkk\" (UID: \"f8287b99-45a5-4575-bcba-3e21ee5f9ffc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk"
Apr 22 14:15:27.545303 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543516 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/fc6918bb-f78e-49c2-a990-f709907bd409-konnectivity-ca\") pod \"konnectivity-agent-vj7vs\" (UID: \"fc6918bb-f78e-49c2-a990-f709907bd409\") " pod="kube-system/konnectivity-agent-vj7vs"
Apr 22 14:15:27.545303 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543545 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-host\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j"
Apr 22 14:15:27.545303 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543575 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/766b2141-267e-41ed-bc88-fc000f360c08-cni-binary-copy\") pod \"multus-additional-cni-plugins-tn9p9\" (UID: \"766b2141-267e-41ed-bc88-fc000f360c08\") " pod="openshift-multus/multus-additional-cni-plugins-tn9p9"
Apr 22 14:15:27.545303 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543602 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2ql2\" (UniqueName: \"kubernetes.io/projected/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-kube-api-access-l2ql2\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd"
Apr 22 14:15:27.545303 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543626 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llz8q\" (UniqueName: \"kubernetes.io/projected/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-kube-api-access-llz8q\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j"
Apr 22 14:15:27.545303 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543707 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1dfd7d57-a9b2-4910-82a6-1e9bf8576804-hosts-file\") pod \"node-resolver-9x8pf\" (UID: \"1dfd7d57-a9b2-4910-82a6-1e9bf8576804\") " pod="openshift-dns/node-resolver-9x8pf"
Apr 22 14:15:27.545303 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543726 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/058163d3-0e8a-40f7-aaa3-382fc9d4f5d4-serviceca\") pod \"node-ca-97p4n\" (UID: \"058163d3-0e8a-40f7-aaa3-382fc9d4f5d4\") " pod="openshift-image-registry/node-ca-97p4n"
Apr 22 14:15:27.545303 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543745 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f8287b99-45a5-4575-bcba-3e21ee5f9ffc-socket-dir\") pod \"aws-ebs-csi-driver-node-nwgkk\" (UID: \"f8287b99-45a5-4575-bcba-3e21ee5f9ffc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk"
Apr 22 14:15:27.545303 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543801 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5smh\" (UniqueName: \"kubernetes.io/projected/25952960-59a7-4c77-9fc4-71e746c78539-kube-api-access-p5smh\") pod \"network-check-target-zvxdk\" (UID: \"25952960-59a7-4c77-9fc4-71e746c78539\") " pod="openshift-network-diagnostics/network-check-target-zvxdk"
Apr 22 14:15:27.545303 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543849 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-hostroot\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8"
Apr 22 14:15:27.545303 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543861 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/f8287b99-45a5-4575-bcba-3e21ee5f9ffc-socket-dir\") pod \"aws-ebs-csi-driver-node-nwgkk\" (UID: \"f8287b99-45a5-4575-bcba-3e21ee5f9ffc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk"
Apr 22 14:15:27.545303 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.543873 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-run-systemd\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd"
Apr 22 14:15:27.545303 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.544012 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/766b2141-267e-41ed-bc88-fc000f360c08-cni-binary-copy\") pod \"multus-additional-cni-plugins-tn9p9\" (UID: \"766b2141-267e-41ed-bc88-fc000f360c08\") " pod="openshift-multus/multus-additional-cni-plugins-tn9p9"
for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/766b2141-267e-41ed-bc88-fc000f360c08-cni-binary-copy\") pod \"multus-additional-cni-plugins-tn9p9\" (UID: \"766b2141-267e-41ed-bc88-fc000f360c08\") " pod="openshift-multus/multus-additional-cni-plugins-tn9p9" Apr 22 14:15:27.545933 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.545348 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/fc6918bb-f78e-49c2-a990-f709907bd409-agent-certs\") pod \"konnectivity-agent-vj7vs\" (UID: \"fc6918bb-f78e-49c2-a990-f709907bd409\") " pod="kube-system/konnectivity-agent-vj7vs" Apr 22 14:15:27.552555 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:27.552533 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:27.552637 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:27.552561 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:27.552637 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:27.552574 2575 projected.go:194] Error preparing data for projected volume kube-api-access-p5smh for pod openshift-network-diagnostics/network-check-target-zvxdk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:27.552719 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:27.552645 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/25952960-59a7-4c77-9fc4-71e746c78539-kube-api-access-p5smh podName:25952960-59a7-4c77-9fc4-71e746c78539 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:28.052626761 +0000 UTC m=+2.984005659 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-p5smh" (UniqueName: "kubernetes.io/projected/25952960-59a7-4c77-9fc4-71e746c78539-kube-api-access-p5smh") pod "network-check-target-zvxdk" (UID: "25952960-59a7-4c77-9fc4-71e746c78539") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:27.553548 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.553525 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwzc9\" (UniqueName: \"kubernetes.io/projected/f8287b99-45a5-4575-bcba-3e21ee5f9ffc-kube-api-access-kwzc9\") pod \"aws-ebs-csi-driver-node-nwgkk\" (UID: \"f8287b99-45a5-4575-bcba-3e21ee5f9ffc\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk" Apr 22 14:15:27.555050 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.555028 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jknsc\" (UniqueName: \"kubernetes.io/projected/1faf2ada-1177-442f-9ee9-4ecd9697e349-kube-api-access-jknsc\") pod \"network-metrics-daemon-swv2n\" (UID: \"1faf2ada-1177-442f-9ee9-4ecd9697e349\") " pod="openshift-multus/network-metrics-daemon-swv2n" Apr 22 14:15:27.555878 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.555864 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wc8t\" (UniqueName: \"kubernetes.io/projected/766b2141-267e-41ed-bc88-fc000f360c08-kube-api-access-4wc8t\") pod \"multus-additional-cni-plugins-tn9p9\" (UID: \"766b2141-267e-41ed-bc88-fc000f360c08\") " pod="openshift-multus/multus-additional-cni-plugins-tn9p9" Apr 22 14:15:27.644330 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644298 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-cnibin\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.644330 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644331 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.644536 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644347 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/058163d3-0e8a-40f7-aaa3-382fc9d4f5d4-host\") pod \"node-ca-97p4n\" (UID: \"058163d3-0e8a-40f7-aaa3-382fc9d4f5d4\") " pod="openshift-image-registry/node-ca-97p4n" Apr 22 14:15:27.644536 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644363 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pw6tp\" (UniqueName: \"kubernetes.io/projected/058163d3-0e8a-40f7-aaa3-382fc9d4f5d4-kube-api-access-pw6tp\") pod \"node-ca-97p4n\" (UID: \"058163d3-0e8a-40f7-aaa3-382fc9d4f5d4\") " pod="openshift-image-registry/node-ca-97p4n" Apr 22 14:15:27.644536 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644416 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-cnibin\") pod 
\"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.644536 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644425 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.644536 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644418 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/058163d3-0e8a-40f7-aaa3-382fc9d4f5d4-host\") pod \"node-ca-97p4n\" (UID: \"058163d3-0e8a-40f7-aaa3-382fc9d4f5d4\") " pod="openshift-image-registry/node-ca-97p4n" Apr 22 14:15:27.644536 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644469 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8c828a22-de6c-4a15-a273-d749ea26c601-host-slash\") pod \"iptables-alerter-tc7f7\" (UID: \"8c828a22-de6c-4a15-a273-d749ea26c601\") " pod="openshift-network-operator/iptables-alerter-tc7f7" Apr 22 14:15:27.644536 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644498 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-host-slash\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.644536 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644524 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-ovnkube-script-lib\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.644536 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644534 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8c828a22-de6c-4a15-a273-d749ea26c601-host-slash\") pod \"iptables-alerter-tc7f7\" (UID: \"8c828a22-de6c-4a15-a273-d749ea26c601\") " pod="openshift-network-operator/iptables-alerter-tc7f7" Apr 22 14:15:27.645001 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644549 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-system-cni-dir\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.645001 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644563 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-host-slash\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.645001 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644573 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-multus-socket-dir-parent\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.645001 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644620 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-system-cni-dir\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.645001 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644634 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-multus-socket-dir-parent\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.645001 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644646 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-etc-openvswitch\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.645001 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644668 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-host-cni-bin\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.645001 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644677 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-etc-openvswitch\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.645001 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644683 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-etc-sysctl-d\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.645001 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644711 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-etc-sysctl-conf\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.645001 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644735 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1dfd7d57-a9b2-4910-82a6-1e9bf8576804-tmp-dir\") pod \"node-resolver-9x8pf\" (UID: \"1dfd7d57-a9b2-4910-82a6-1e9bf8576804\") " pod="openshift-dns/node-resolver-9x8pf" Apr 22 14:15:27.645001 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644774 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-host-cni-bin\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.645001 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644777 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvgqk\" (UniqueName: \"kubernetes.io/projected/1dfd7d57-a9b2-4910-82a6-1e9bf8576804-kube-api-access-cvgqk\") pod \"node-resolver-9x8pf\" (UID: \"1dfd7d57-a9b2-4910-82a6-1e9bf8576804\") " pod="openshift-dns/node-resolver-9x8pf" Apr 22 14:15:27.645001 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644784 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-etc-sysctl-d\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.645001 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644859 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-os-release\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.645001 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644901 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z2fmt\" (UniqueName: \"kubernetes.io/projected/8c828a22-de6c-4a15-a273-d749ea26c601-kube-api-access-z2fmt\") pod \"iptables-alerter-tc7f7\" (UID: \"8c828a22-de6c-4a15-a273-d749ea26c601\") " pod="openshift-network-operator/iptables-alerter-tc7f7" Apr 22 14:15:27.645001 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644909 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-os-release\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.645001 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644867 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-etc-sysctl-conf\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.645855 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644930 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-var-lib-openvswitch\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.645855 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644960 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-etc-systemd\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.645855 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.644961 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-var-lib-openvswitch\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.645855 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645009 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-host-kubelet\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.645855 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645035 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-systemd-units\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.645855 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645037 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-host-kubelet\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.645855 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645053 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-ovnkube-script-lib\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.645855 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645060 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-run-openvswitch\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.645855 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645012 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-etc-systemd\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.645855 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645061 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-systemd-units\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.645855 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645086 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-etc-kubernetes\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.645855 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645095 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-run-openvswitch\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.645855 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645109 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-var-lib-kubelet\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.645855 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645124 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-etc-tuned\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.645855 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645131 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-etc-kubernetes\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.645855 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645144 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-tmp\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.645855 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645156 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1dfd7d57-a9b2-4910-82a6-1e9bf8576804-tmp-dir\") pod \"node-resolver-9x8pf\" (UID: \"1dfd7d57-a9b2-4910-82a6-1e9bf8576804\") " pod="openshift-dns/node-resolver-9x8pf" Apr 22 14:15:27.645855 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645165 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-host-run-netns\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.646696 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645170 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-var-lib-kubelet\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.646696 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645209 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-host-run-netns\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.646696 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645233 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-host\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.646696 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645296 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-host\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.646696 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645310 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l2ql2\" (UniqueName: \"kubernetes.io/projected/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-kube-api-access-l2ql2\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.646696 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645346 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-llz8q\" (UniqueName: \"kubernetes.io/projected/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-kube-api-access-llz8q\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.646696 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645366 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1dfd7d57-a9b2-4910-82a6-1e9bf8576804-hosts-file\") pod \"node-resolver-9x8pf\" (UID: \"1dfd7d57-a9b2-4910-82a6-1e9bf8576804\") " pod="openshift-dns/node-resolver-9x8pf" Apr 22 14:15:27.646696 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645388 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/058163d3-0e8a-40f7-aaa3-382fc9d4f5d4-serviceca\") pod \"node-ca-97p4n\" (UID: \"058163d3-0e8a-40f7-aaa3-382fc9d4f5d4\") " pod="openshift-image-registry/node-ca-97p4n" Apr 22 14:15:27.646696 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645451 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1dfd7d57-a9b2-4910-82a6-1e9bf8576804-hosts-file\") pod \"node-resolver-9x8pf\" (UID: \"1dfd7d57-a9b2-4910-82a6-1e9bf8576804\") " pod="openshift-dns/node-resolver-9x8pf" Apr 22 14:15:27.646696 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645495 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-hostroot\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.646696 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645519 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-run-systemd\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.646696 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645546 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-multus-cni-dir\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.646696 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645532 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-hostroot\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.646696 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645571 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-host-var-lib-kubelet\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.646696 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645596 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c9b4b54a-3a19-409c-818f-f465ef373376-multus-daemon-config\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.646696 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645603 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-run-systemd\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.646696 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645620 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-node-log\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.646696 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645638 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-multus-cni-dir\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.647550 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645642 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-host-var-lib-kubelet\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.647550 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645643 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-log-socket\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.647550 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645679 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-host-run-ovn-kubernetes\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.647550 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645685 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-node-log\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.647550 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645682 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-log-socket\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.647550 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645713 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-etc-modprobe-d\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.647550 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645729 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-host-run-ovn-kubernetes\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.647550 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645741 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-etc-sysconfig\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.647550 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645798 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/058163d3-0e8a-40f7-aaa3-382fc9d4f5d4-serviceca\") pod \"node-ca-97p4n\" (UID: \"058163d3-0e8a-40f7-aaa3-382fc9d4f5d4\") " pod="openshift-image-registry/node-ca-97p4n" Apr 22 14:15:27.647550 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645804 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-etc-sysconfig\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.647550 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645834 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-lib-modules\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.647550 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645856 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-host-var-lib-cni-bin\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.647550 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645881 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-multus-conf-dir\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.647550 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645890 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-etc-modprobe-d\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.647550 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645905 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8c828a22-de6c-4a15-a273-d749ea26c601-iptables-alerter-script\") pod \"iptables-alerter-tc7f7\" (UID: \"8c828a22-de6c-4a15-a273-d749ea26c601\") " pod="openshift-network-operator/iptables-alerter-tc7f7" Apr 22 14:15:27.647550 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645925 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-host-var-lib-cni-bin\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.647550 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645931 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-host-run-netns\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.647550 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645935 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-multus-conf-dir\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.648410 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645907 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-lib-modules\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.648410 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645963 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-ovnkube-config\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.648410 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645972 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-host-run-netns\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.648410 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.645991 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-sys\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.648410 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.646028 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-host-run-k8s-cni-cncf-io\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.648410 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.646055 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-etc-kubernetes\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.648410 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.646069 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-sys\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.648410 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.646079 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-host-cni-netd\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.648410 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.646105 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-host-run-multus-certs\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.648410 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.646127 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-host-run-k8s-cni-cncf-io\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.648410 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.646128 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qpzh\" (UniqueName: \"kubernetes.io/projected/c9b4b54a-3a19-409c-818f-f465ef373376-kube-api-access-9qpzh\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.648410 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.646165 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-env-overrides\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.648410 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.646195 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-host-var-lib-cni-multus\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.648410 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.646220 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-run-ovn\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.648410 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.646244 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-ovn-node-metrics-cert\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.648410 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.646270 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c9b4b54a-3a19-409c-818f-f465ef373376-cni-binary-copy\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.648410 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.646295 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-run\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.648410 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.646340 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-etc-kubernetes\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.649246 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.646385 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-run\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.649246 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.646388 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-host-cni-netd\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.649246 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.646424 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/8c828a22-de6c-4a15-a273-d749ea26c601-iptables-alerter-script\") pod \"iptables-alerter-tc7f7\" (UID: \"8c828a22-de6c-4a15-a273-d749ea26c601\") " pod="openshift-network-operator/iptables-alerter-tc7f7" Apr 22 14:15:27.649246 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.646432 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-host-run-multus-certs\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.649246 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.646464 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-ovnkube-config\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.649246 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.646481 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-run-ovn\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.649246 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.646519 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/c9b4b54a-3a19-409c-818f-f465ef373376-host-var-lib-cni-multus\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.649246 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.646794 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-env-overrides\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.649246 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.647001 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/c9b4b54a-3a19-409c-818f-f465ef373376-multus-daemon-config\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.649246 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.647339 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/c9b4b54a-3a19-409c-818f-f465ef373376-cni-binary-copy\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.649246 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.647695 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-tmp\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.649246 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.648099 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-etc-tuned\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.649246 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.648656 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-ovn-node-metrics-cert\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.653357 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.653332 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pw6tp\" (UniqueName: \"kubernetes.io/projected/058163d3-0e8a-40f7-aaa3-382fc9d4f5d4-kube-api-access-pw6tp\") pod \"node-ca-97p4n\" (UID: \"058163d3-0e8a-40f7-aaa3-382fc9d4f5d4\") " pod="openshift-image-registry/node-ca-97p4n" Apr 22 14:15:27.654022 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.653996 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvgqk\" (UniqueName: \"kubernetes.io/projected/1dfd7d57-a9b2-4910-82a6-1e9bf8576804-kube-api-access-cvgqk\") pod \"node-resolver-9x8pf\" (UID: \"1dfd7d57-a9b2-4910-82a6-1e9bf8576804\") " pod="openshift-dns/node-resolver-9x8pf" Apr 22 14:15:27.654438 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.654415 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z2fmt\" (UniqueName: \"kubernetes.io/projected/8c828a22-de6c-4a15-a273-d749ea26c601-kube-api-access-z2fmt\") pod \"iptables-alerter-tc7f7\" (UID: \"8c828a22-de6c-4a15-a273-d749ea26c601\") " pod="openshift-network-operator/iptables-alerter-tc7f7" Apr 22 14:15:27.654806 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.654782 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qpzh\" (UniqueName: \"kubernetes.io/projected/c9b4b54a-3a19-409c-818f-f465ef373376-kube-api-access-9qpzh\") pod \"multus-24sl8\" (UID: \"c9b4b54a-3a19-409c-818f-f465ef373376\") " pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.654967 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.654948 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-llz8q\" (UniqueName: \"kubernetes.io/projected/7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329-kube-api-access-llz8q\") pod \"tuned-2b68j\" (UID: \"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329\") " pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.655174 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.655155 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2ql2\" (UniqueName: \"kubernetes.io/projected/3403a015-2d45-42e8-bf6e-9a0bc6d91e99-kube-api-access-l2ql2\") pod \"ovnkube-node-j58wd\" (UID: \"3403a015-2d45-42e8-bf6e-9a0bc6d91e99\") " pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.733220 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.733166 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vj7vs" Apr 22 14:15:27.741870 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.741850 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk" Apr 22 14:15:27.750277 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.750261 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tn9p9" Apr 22 14:15:27.756849 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.756830 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-2b68j" Apr 22 14:15:27.761393 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.761375 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-9x8pf" Apr 22 14:15:27.767840 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.767821 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-97p4n" Apr 22 14:15:27.774396 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.774378 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-24sl8" Apr 22 14:15:27.780873 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.780857 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-tc7f7" Apr 22 14:15:27.785567 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.785548 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:27.901960 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:27.901941 2575 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 14:15:28.049924 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:28.049857 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs\") pod \"network-metrics-daemon-swv2n\" (UID: \"1faf2ada-1177-442f-9ee9-4ecd9697e349\") " pod="openshift-multus/network-metrics-daemon-swv2n" Apr 22 14:15:28.050046 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:28.049964 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:28.050046 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:28.050012 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs podName:1faf2ada-1177-442f-9ee9-4ecd9697e349 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:29.049999665 +0000 UTC m=+3.981378545 (durationBeforeRetry 1s). 
Apr 22 14:15:28.113423 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:28.113400 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod058163d3_0e8a_40f7_aaa3_382fc9d4f5d4.slice/crio-41623af4d509f23ea104fd2d9956030d67028b5500719065814820132a2c8adc WatchSource:0}: Error finding container 41623af4d509f23ea104fd2d9956030d67028b5500719065814820132a2c8adc: Status 404 returned error can't find the container with id 41623af4d509f23ea104fd2d9956030d67028b5500719065814820132a2c8adc
Apr 22 14:15:28.114825 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:28.114806 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dfdfa3e_08a0_4ac8_89e1_2cbf687b5329.slice/crio-5434cf9408be8d81e7d1af473f6645d9dccd6f9f232bb97c734406a7631a7363 WatchSource:0}: Error finding container 5434cf9408be8d81e7d1af473f6645d9dccd6f9f232bb97c734406a7631a7363: Status 404 returned error can't find the container with id 5434cf9408be8d81e7d1af473f6645d9dccd6f9f232bb97c734406a7631a7363
Apr 22 14:15:28.118279 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:28.118250 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod766b2141_267e_41ed_bc88_fc000f360c08.slice/crio-0d0e8334733dea72b1c92bbc40e5541293cd1c3ac042109a67b9c3a7c0b80c7a WatchSource:0}: Error finding container 0d0e8334733dea72b1c92bbc40e5541293cd1c3ac042109a67b9c3a7c0b80c7a: Status 404 returned error can't find the container with id 0d0e8334733dea72b1c92bbc40e5541293cd1c3ac042109a67b9c3a7c0b80c7a
Apr 22 14:15:28.118959 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:28.118938 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dfd7d57_a9b2_4910_82a6_1e9bf8576804.slice/crio-c6182bb9eceb07db2c31b8e5655b9d7d7012cc9b8366b292fcf80983dbb28e6b WatchSource:0}: Error finding container c6182bb9eceb07db2c31b8e5655b9d7d7012cc9b8366b292fcf80983dbb28e6b: Status 404 returned error can't find the container with id c6182bb9eceb07db2c31b8e5655b9d7d7012cc9b8366b292fcf80983dbb28e6b
Apr 22 14:15:28.119807 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:28.119740 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8c828a22_de6c_4a15_a273_d749ea26c601.slice/crio-785eb038a6527eb13c66e7526da6c30a8fd4d8dd32957e1ede9475af8dd93262 WatchSource:0}: Error finding container 785eb038a6527eb13c66e7526da6c30a8fd4d8dd32957e1ede9475af8dd93262: Status 404 returned error can't find the container with id 785eb038a6527eb13c66e7526da6c30a8fd4d8dd32957e1ede9475af8dd93262
Apr 22 14:15:28.120504 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:28.120463 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc6918bb_f78e_49c2_a990_f709907bd409.slice/crio-1e362a35c075855f0e325486791dc6e7b9ec8f03eb3c135317d2dcaac11457dc WatchSource:0}: Error finding container 1e362a35c075855f0e325486791dc6e7b9ec8f03eb3c135317d2dcaac11457dc: Status 404 returned error can't find the container with id 1e362a35c075855f0e325486791dc6e7b9ec8f03eb3c135317d2dcaac11457dc
Apr 22 14:15:28.121851 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:28.121468 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8287b99_45a5_4575_bcba_3e21ee5f9ffc.slice/crio-653e7020d7c044c5fde96c4effe69fd64dc36a5e759c1e8d9c2db3d6d2e1d76f WatchSource:0}: Error finding container 653e7020d7c044c5fde96c4effe69fd64dc36a5e759c1e8d9c2db3d6d2e1d76f: Status 404 returned error can't find the container with id 653e7020d7c044c5fde96c4effe69fd64dc36a5e759c1e8d9c2db3d6d2e1d76f
Apr 22 14:15:28.122558 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:28.122455 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc9b4b54a_3a19_409c_818f_f465ef373376.slice/crio-f3bc4aedcabf11ad0183197174c5cbb089ceaa3b8774c04b9730eb21a5bddf6b WatchSource:0}: Error finding container f3bc4aedcabf11ad0183197174c5cbb089ceaa3b8774c04b9730eb21a5bddf6b: Status 404 returned error can't find the container with id f3bc4aedcabf11ad0183197174c5cbb089ceaa3b8774c04b9730eb21a5bddf6b
Apr 22 14:15:28.124068 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:15:28.123969 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3403a015_2d45_42e8_bf6e_9a0bc6d91e99.slice/crio-76c46150eb71544e71a65f9a318918dd27a864d4cb4283fcf8d5de2d38607428 WatchSource:0}: Error finding container 76c46150eb71544e71a65f9a318918dd27a864d4cb4283fcf8d5de2d38607428: Status 404 returned error can't find the container with id 76c46150eb71544e71a65f9a318918dd27a864d4cb4283fcf8d5de2d38607428
Apr 22 14:15:28.150206 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:28.150185 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5smh\" (UniqueName: \"kubernetes.io/projected/25952960-59a7-4c77-9fc4-71e746c78539-kube-api-access-p5smh\") pod \"network-check-target-zvxdk\" (UID: \"25952960-59a7-4c77-9fc4-71e746c78539\") " pod="openshift-network-diagnostics/network-check-target-zvxdk"
Apr 22 14:15:28.150355 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:28.150313 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 14:15:28.150355 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:28.150329 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 14:15:28.150355 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:28.150340 2575 projected.go:194] Error preparing data for projected volume kube-api-access-p5smh for pod openshift-network-diagnostics/network-check-target-zvxdk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:28.150468 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:28.150386 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/25952960-59a7-4c77-9fc4-71e746c78539-kube-api-access-p5smh podName:25952960-59a7-4c77-9fc4-71e746c78539 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:29.150372317 +0000 UTC m=+4.081751198 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-p5smh" (UniqueName: "kubernetes.io/projected/25952960-59a7-4c77-9fc4-71e746c78539-kube-api-access-p5smh") pod "network-check-target-zvxdk" (UID: "25952960-59a7-4c77-9fc4-71e746c78539") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:28.473393 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:28.473333 2575 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 14:10:26 +0000 UTC" deadline="2028-02-05 17:01:53.4614428 +0000 UTC"
Apr 22 14:15:28.473393 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:28.473370 2575 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15698h46m24.988076977s"
Apr 22 14:15:28.570240 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:28.570164 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-24sl8" event={"ID":"c9b4b54a-3a19-409c-818f-f465ef373376","Type":"ContainerStarted","Data":"f3bc4aedcabf11ad0183197174c5cbb089ceaa3b8774c04b9730eb21a5bddf6b"}
Apr 22 14:15:28.573337 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:28.573250 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk" event={"ID":"f8287b99-45a5-4575-bcba-3e21ee5f9ffc","Type":"ContainerStarted","Data":"653e7020d7c044c5fde96c4effe69fd64dc36a5e759c1e8d9c2db3d6d2e1d76f"}
Apr 22 14:15:28.575192 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:28.575140 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tc7f7" event={"ID":"8c828a22-de6c-4a15-a273-d749ea26c601","Type":"ContainerStarted","Data":"785eb038a6527eb13c66e7526da6c30a8fd4d8dd32957e1ede9475af8dd93262"}
Apr 22 14:15:28.580588 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:28.580537 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9x8pf" event={"ID":"1dfd7d57-a9b2-4910-82a6-1e9bf8576804","Type":"ContainerStarted","Data":"c6182bb9eceb07db2c31b8e5655b9d7d7012cc9b8366b292fcf80983dbb28e6b"}
Apr 22 14:15:28.589431 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:28.589386 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tn9p9" event={"ID":"766b2141-267e-41ed-bc88-fc000f360c08","Type":"ContainerStarted","Data":"0d0e8334733dea72b1c92bbc40e5541293cd1c3ac042109a67b9c3a7c0b80c7a"}
Apr 22 14:15:28.593381 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:28.593319 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2b68j" event={"ID":"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329","Type":"ContainerStarted","Data":"5434cf9408be8d81e7d1af473f6645d9dccd6f9f232bb97c734406a7631a7363"}
Apr 22 14:15:28.605018 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:28.604978 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-98.ec2.internal" event={"ID":"148a376889eed68cdbc344a2d41f35d5","Type":"ContainerStarted","Data":"3241d8a2d2fb00e0df4f12928691dbcd8eaf0d629873dd5d87fe599c52452b06"}
Apr 22 14:15:28.608674 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:28.608650 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" event={"ID":"3403a015-2d45-42e8-bf6e-9a0bc6d91e99","Type":"ContainerStarted","Data":"76c46150eb71544e71a65f9a318918dd27a864d4cb4283fcf8d5de2d38607428"}
Apr 22 14:15:28.620009 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:28.619984 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vj7vs" event={"ID":"fc6918bb-f78e-49c2-a990-f709907bd409","Type":"ContainerStarted","Data":"1e362a35c075855f0e325486791dc6e7b9ec8f03eb3c135317d2dcaac11457dc"}
Apr 22 14:15:28.631556 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:28.621867 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-97p4n" event={"ID":"058163d3-0e8a-40f7-aaa3-382fc9d4f5d4","Type":"ContainerStarted","Data":"41623af4d509f23ea104fd2d9956030d67028b5500719065814820132a2c8adc"}
Apr 22 14:15:29.057602 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:29.057573 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs\") pod \"network-metrics-daemon-swv2n\" (UID: \"1faf2ada-1177-442f-9ee9-4ecd9697e349\") " pod="openshift-multus/network-metrics-daemon-swv2n"
Apr 22 14:15:29.057719 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:29.057697 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:29.057788 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:29.057772 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs podName:1faf2ada-1177-442f-9ee9-4ecd9697e349 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:31.057734987 +0000 UTC m=+5.989113873 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs") pod "network-metrics-daemon-swv2n" (UID: "1faf2ada-1177-442f-9ee9-4ecd9697e349") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:29.159386 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:29.158802 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5smh\" (UniqueName: \"kubernetes.io/projected/25952960-59a7-4c77-9fc4-71e746c78539-kube-api-access-p5smh\") pod \"network-check-target-zvxdk\" (UID: \"25952960-59a7-4c77-9fc4-71e746c78539\") " pod="openshift-network-diagnostics/network-check-target-zvxdk"
Apr 22 14:15:29.159386 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:29.158972 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 14:15:29.159386 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:29.158992 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 14:15:29.159386 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:29.159004 2575 projected.go:194] Error preparing data for projected volume kube-api-access-p5smh for pod openshift-network-diagnostics/network-check-target-zvxdk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:29.159386 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:29.159061 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/25952960-59a7-4c77-9fc4-71e746c78539-kube-api-access-p5smh podName:25952960-59a7-4c77-9fc4-71e746c78539 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:31.159041993 +0000 UTC m=+6.090420887 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-p5smh" (UniqueName: "kubernetes.io/projected/25952960-59a7-4c77-9fc4-71e746c78539-kube-api-access-p5smh") pod "network-check-target-zvxdk" (UID: "25952960-59a7-4c77-9fc4-71e746c78539") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:29.557328 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:29.557300 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swv2n"
Apr 22 14:15:29.557809 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:29.557432 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swv2n" podUID="1faf2ada-1177-442f-9ee9-4ecd9697e349"
Apr 22 14:15:29.558026 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:29.557987 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zvxdk"
Apr 22 14:15:29.558098 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:29.558081 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zvxdk" podUID="25952960-59a7-4c77-9fc4-71e746c78539"
Apr 22 14:15:29.650582 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:29.650551 2575 generic.go:358] "Generic (PLEG): container finished" podID="c6554fb3d2aa7e65e70eece3f844528b" containerID="d0492a8d0f8f0183273e4e2c50e0ccff4607b35f04daa4bc08d323ff11a6b827" exitCode=0
Apr 22 14:15:29.650726 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:29.650689 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal" event={"ID":"c6554fb3d2aa7e65e70eece3f844528b","Type":"ContainerDied","Data":"d0492a8d0f8f0183273e4e2c50e0ccff4607b35f04daa4bc08d323ff11a6b827"}
Apr 22 14:15:29.668103 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:29.668059 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-98.ec2.internal" podStartSLOduration=3.668042092 podStartE2EDuration="3.668042092s" podCreationTimestamp="2026-04-22 14:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:15:28.620130255 +0000 UTC m=+3.551509160" watchObservedRunningTime="2026-04-22 14:15:29.668042092 +0000 UTC m=+4.599420998"
Apr 22 14:15:30.657640 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:30.657606 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal" event={"ID":"c6554fb3d2aa7e65e70eece3f844528b","Type":"ContainerStarted","Data":"2adebf0275072ae3ccc7a2084853e642d24675a06d8424c285fb2bc056de10a5"}
Apr 22 14:15:31.074866 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:31.074780 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs\") pod \"network-metrics-daemon-swv2n\" (UID: \"1faf2ada-1177-442f-9ee9-4ecd9697e349\") " pod="openshift-multus/network-metrics-daemon-swv2n"
Apr 22 14:15:31.075010 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:31.074944 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:31.075010 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:31.075003 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs podName:1faf2ada-1177-442f-9ee9-4ecd9697e349 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:35.074985888 +0000 UTC m=+10.006364782 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs") pod "network-metrics-daemon-swv2n" (UID: "1faf2ada-1177-442f-9ee9-4ecd9697e349") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:31.176096 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:31.176060 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5smh\" (UniqueName: \"kubernetes.io/projected/25952960-59a7-4c77-9fc4-71e746c78539-kube-api-access-p5smh\") pod \"network-check-target-zvxdk\" (UID: \"25952960-59a7-4c77-9fc4-71e746c78539\") " pod="openshift-network-diagnostics/network-check-target-zvxdk"
Apr 22 14:15:31.176259 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:31.176204 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 14:15:31.176259 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:31.176223 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 14:15:31.176259 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:31.176236 2575 projected.go:194] Error preparing data for projected volume kube-api-access-p5smh for pod openshift-network-diagnostics/network-check-target-zvxdk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:31.176412 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:31.176304 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/25952960-59a7-4c77-9fc4-71e746c78539-kube-api-access-p5smh podName:25952960-59a7-4c77-9fc4-71e746c78539 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:35.176285448 +0000 UTC m=+10.107664333 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-p5smh" (UniqueName: "kubernetes.io/projected/25952960-59a7-4c77-9fc4-71e746c78539-kube-api-access-p5smh") pod "network-check-target-zvxdk" (UID: "25952960-59a7-4c77-9fc4-71e746c78539") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:31.557342 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:31.556823 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swv2n"
Apr 22 14:15:31.557342 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:31.556969 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swv2n" podUID="1faf2ada-1177-442f-9ee9-4ecd9697e349"
Apr 22 14:15:31.557342 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:31.557074 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zvxdk"
Apr 22 14:15:31.557342 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:31.557172 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zvxdk" podUID="25952960-59a7-4c77-9fc4-71e746c78539"
Apr 22 14:15:33.557454 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:33.557235 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swv2n"
Apr 22 14:15:33.557454 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:33.557377 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swv2n" podUID="1faf2ada-1177-442f-9ee9-4ecd9697e349"
Apr 22 14:15:33.557454 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:33.557424 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zvxdk"
Apr 22 14:15:33.558095 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:33.557489 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zvxdk" podUID="25952960-59a7-4c77-9fc4-71e746c78539"
Apr 22 14:15:35.110059 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:35.109996 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs\") pod \"network-metrics-daemon-swv2n\" (UID: \"1faf2ada-1177-442f-9ee9-4ecd9697e349\") " pod="openshift-multus/network-metrics-daemon-swv2n"
Apr 22 14:15:35.110501 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:35.110121 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:35.110501 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:35.110185 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs podName:1faf2ada-1177-442f-9ee9-4ecd9697e349 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:43.110164779 +0000 UTC m=+18.041543673 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs") pod "network-metrics-daemon-swv2n" (UID: "1faf2ada-1177-442f-9ee9-4ecd9697e349") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:35.210483 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:35.210455 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5smh\" (UniqueName: \"kubernetes.io/projected/25952960-59a7-4c77-9fc4-71e746c78539-kube-api-access-p5smh\") pod \"network-check-target-zvxdk\" (UID: \"25952960-59a7-4c77-9fc4-71e746c78539\") " pod="openshift-network-diagnostics/network-check-target-zvxdk"
Apr 22 14:15:35.210633 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:35.210580 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 14:15:35.210633 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:35.210594 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 14:15:35.210633 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:35.210603 2575 projected.go:194] Error preparing data for projected volume kube-api-access-p5smh for pod openshift-network-diagnostics/network-check-target-zvxdk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:35.210827 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:35.210643 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/25952960-59a7-4c77-9fc4-71e746c78539-kube-api-access-p5smh podName:25952960-59a7-4c77-9fc4-71e746c78539 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:43.210630191 +0000 UTC m=+18.142009072 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-p5smh" (UniqueName: "kubernetes.io/projected/25952960-59a7-4c77-9fc4-71e746c78539-kube-api-access-p5smh") pod "network-check-target-zvxdk" (UID: "25952960-59a7-4c77-9fc4-71e746c78539") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:35.559035 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:35.558497 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swv2n"
Apr 22 14:15:35.559035 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:35.558612 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swv2n" podUID="1faf2ada-1177-442f-9ee9-4ecd9697e349"
Apr 22 14:15:35.559035 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:35.558974 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zvxdk"
Apr 22 14:15:35.559438 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:35.559056 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zvxdk" podUID="25952960-59a7-4c77-9fc4-71e746c78539"
Apr 22 14:15:37.557401 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:37.557370 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zvxdk"
Apr 22 14:15:37.557850 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:37.557370 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swv2n"
Apr 22 14:15:37.557850 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:37.557490 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zvxdk" podUID="25952960-59a7-4c77-9fc4-71e746c78539"
Apr 22 14:15:37.557850 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:37.557595 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swv2n" podUID="1faf2ada-1177-442f-9ee9-4ecd9697e349"
Apr 22 14:15:39.556985 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:39.556954 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zvxdk"
Apr 22 14:15:39.557431 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:39.556986 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swv2n"
Apr 22 14:15:39.557431 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:39.557061 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zvxdk" podUID="25952960-59a7-4c77-9fc4-71e746c78539"
Apr 22 14:15:39.557431 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:39.557202 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swv2n" podUID="1faf2ada-1177-442f-9ee9-4ecd9697e349"
Apr 22 14:15:41.556916 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:41.556880 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swv2n"
Need to start a new one" pod="openshift-multus/network-metrics-daemon-swv2n" Apr 22 14:15:41.557337 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:41.556880 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zvxdk" Apr 22 14:15:41.557337 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:41.557022 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swv2n" podUID="1faf2ada-1177-442f-9ee9-4ecd9697e349" Apr 22 14:15:41.557337 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:41.557109 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zvxdk" podUID="25952960-59a7-4c77-9fc4-71e746c78539" Apr 22 14:15:43.172310 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:43.172275 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs\") pod \"network-metrics-daemon-swv2n\" (UID: \"1faf2ada-1177-442f-9ee9-4ecd9697e349\") " pod="openshift-multus/network-metrics-daemon-swv2n" Apr 22 14:15:43.172648 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:43.172418 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:43.172648 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:43.172477 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs podName:1faf2ada-1177-442f-9ee9-4ecd9697e349 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:59.172462741 +0000 UTC m=+34.103841622 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs") pod "network-metrics-daemon-swv2n" (UID: "1faf2ada-1177-442f-9ee9-4ecd9697e349") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:43.273043 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:43.273005 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5smh\" (UniqueName: \"kubernetes.io/projected/25952960-59a7-4c77-9fc4-71e746c78539-kube-api-access-p5smh\") pod \"network-check-target-zvxdk\" (UID: \"25952960-59a7-4c77-9fc4-71e746c78539\") " pod="openshift-network-diagnostics/network-check-target-zvxdk" Apr 22 14:15:43.273193 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:43.273172 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:43.273269 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:43.273197 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:43.273269 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:43.273210 2575 projected.go:194] Error preparing data for projected volume kube-api-access-p5smh for pod openshift-network-diagnostics/network-check-target-zvxdk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:43.273269 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:43.273264 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/25952960-59a7-4c77-9fc4-71e746c78539-kube-api-access-p5smh podName:25952960-59a7-4c77-9fc4-71e746c78539 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:59.273249049 +0000 UTC m=+34.204627933 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-p5smh" (UniqueName: "kubernetes.io/projected/25952960-59a7-4c77-9fc4-71e746c78539-kube-api-access-p5smh") pod "network-check-target-zvxdk" (UID: "25952960-59a7-4c77-9fc4-71e746c78539") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:43.557648 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:43.557581 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zvxdk" Apr 22 14:15:43.557837 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:43.557581 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swv2n" Apr 22 14:15:43.557934 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:43.557681 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zvxdk" podUID="25952960-59a7-4c77-9fc4-71e746c78539" Apr 22 14:15:43.557934 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:43.557848 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swv2n" podUID="1faf2ada-1177-442f-9ee9-4ecd9697e349" Apr 22 14:15:45.558416 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:45.558143 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zvxdk" Apr 22 14:15:45.559287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:45.558240 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swv2n" Apr 22 14:15:45.559287 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:45.558496 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zvxdk" podUID="25952960-59a7-4c77-9fc4-71e746c78539" Apr 22 14:15:45.559287 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:45.558626 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-swv2n" podUID="1faf2ada-1177-442f-9ee9-4ecd9697e349" Apr 22 14:15:45.682280 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:45.682232 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-2b68j" event={"ID":"7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329","Type":"ContainerStarted","Data":"2a10c827aff6154a52e4fe9e0de73db7dd9efa9a6b2c6433fe262f0a6c366d19"} Apr 22 14:15:45.685302 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:45.685284 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j58wd_3403a015-2d45-42e8-bf6e-9a0bc6d91e99/ovn-acl-logging/0.log" Apr 22 14:15:45.685636 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:45.685590 2575 generic.go:358] "Generic (PLEG): container finished" podID="3403a015-2d45-42e8-bf6e-9a0bc6d91e99" containerID="4107799d2c9cecc52634db96c864c23f642a434fa8e04aa0a8b9b3d7bb3976cf" exitCode=1 Apr 22 14:15:45.685706 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:45.685660 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" event={"ID":"3403a015-2d45-42e8-bf6e-9a0bc6d91e99","Type":"ContainerStarted","Data":"f83c0dd6b58b97810c9d344c4634bf09677cb02107df286303c394d754ed6ae6"} Apr 22 14:15:45.685706 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:45.685684 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" event={"ID":"3403a015-2d45-42e8-bf6e-9a0bc6d91e99","Type":"ContainerStarted","Data":"fb275ec78f0c3165f41e330c567d2b351c2d7f47b636cb06541f41aee5925386"} Apr 22 14:15:45.685706 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:45.685694 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" event={"ID":"3403a015-2d45-42e8-bf6e-9a0bc6d91e99","Type":"ContainerStarted","Data":"2ba44977a337cd8933b449fa5cd4943b2cc6f3657264c4dc94898d05bbc46ff6"} Apr 22 14:15:45.685706 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:45.685703 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" event={"ID":"3403a015-2d45-42e8-bf6e-9a0bc6d91e99","Type":"ContainerDied","Data":"4107799d2c9cecc52634db96c864c23f642a434fa8e04aa0a8b9b3d7bb3976cf"} Apr 22 14:15:45.685902 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:45.685714 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" event={"ID":"3403a015-2d45-42e8-bf6e-9a0bc6d91e99","Type":"ContainerStarted","Data":"03c033c5392c18b628b061193e73b4c893718a9371f6371dc19f36138db62943"} Apr 22 14:15:45.686972 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:45.686938 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vj7vs" event={"ID":"fc6918bb-f78e-49c2-a990-f709907bd409","Type":"ContainerStarted","Data":"60b17e81468b5171d7d9b62c8ba10b8d9d8fdc37dc981493c9132f6d8e00aa8d"} Apr 22 14:15:45.688251 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:45.688224 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-97p4n" event={"ID":"058163d3-0e8a-40f7-aaa3-382fc9d4f5d4","Type":"ContainerStarted","Data":"aeffccfcce41be4bf5b43d1bf1086e0f7c1a8dd78aaf17ac5e1d3b6d7fbe0d98"} Apr 22 14:15:45.689693 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:45.689669 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-24sl8" 
event={"ID":"c9b4b54a-3a19-409c-818f-f465ef373376","Type":"ContainerStarted","Data":"23ea1d0f0c7981587c2c5051adfe03c21b28a05b04842fbd779e79a8a9c023fd"} Apr 22 14:15:45.690922 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:45.690903 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk" event={"ID":"f8287b99-45a5-4575-bcba-3e21ee5f9ffc","Type":"ContainerStarted","Data":"26951bc619b0cb677d01ed2fc4fd2419f7d472d4b2690fc10fc8cfa7c37193fb"} Apr 22 14:15:45.692100 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:45.692080 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-9x8pf" event={"ID":"1dfd7d57-a9b2-4910-82a6-1e9bf8576804","Type":"ContainerStarted","Data":"b171e66dce7aa42f1ac16e833d31342e0edb0f6f414db957fb26259a8d9b7696"} Apr 22 14:15:45.693597 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:45.693543 2575 generic.go:358] "Generic (PLEG): container finished" podID="766b2141-267e-41ed-bc88-fc000f360c08" containerID="e213c9f082619df23fc153cde2d6ade4d9152e8972af453ae045a5a61def1cc4" exitCode=0 Apr 22 14:15:45.693597 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:45.693576 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tn9p9" event={"ID":"766b2141-267e-41ed-bc88-fc000f360c08","Type":"ContainerDied","Data":"e213c9f082619df23fc153cde2d6ade4d9152e8972af453ae045a5a61def1cc4"} Apr 22 14:15:45.700916 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:45.700871 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-98.ec2.internal" podStartSLOduration=19.700855785 podStartE2EDuration="19.700855785s" podCreationTimestamp="2026-04-22 14:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:15:30.674019455 +0000 UTC m=+5.605398359" watchObservedRunningTime="2026-04-22 14:15:45.700855785 +0000 UTC m=+20.632234689" Apr 22 14:15:45.701152 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:45.701129 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-2b68j" podStartSLOduration=3.86270134 podStartE2EDuration="20.701123877s" podCreationTimestamp="2026-04-22 14:15:25 +0000 UTC" firstStartedPulling="2026-04-22 14:15:28.117087557 +0000 UTC m=+3.048466441" lastFinishedPulling="2026-04-22 14:15:44.955510081 +0000 UTC m=+19.886888978" observedRunningTime="2026-04-22 14:15:45.700143076 +0000 UTC m=+20.631521980" watchObservedRunningTime="2026-04-22 14:15:45.701123877 +0000 UTC m=+20.632502780" Apr 22 14:15:45.715006 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:45.714941 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-9x8pf" podStartSLOduration=3.880476249 podStartE2EDuration="20.714931276s" podCreationTimestamp="2026-04-22 14:15:25 +0000 UTC" firstStartedPulling="2026-04-22 14:15:28.120942709 +0000 UTC m=+3.052321593" lastFinishedPulling="2026-04-22 14:15:44.955397727 +0000 UTC m=+19.886776620" observedRunningTime="2026-04-22 14:15:45.714362865 +0000 UTC m=+20.645741768" watchObservedRunningTime="2026-04-22 14:15:45.714931276 +0000 UTC m=+20.646310179" Apr 22 14:15:45.733856 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:45.733485 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-image-registry/node-ca-97p4n" podStartSLOduration=3.893201733 podStartE2EDuration="20.733472652s" podCreationTimestamp="2026-04-22 14:15:25 +0000 UTC" firstStartedPulling="2026-04-22 14:15:28.115657629 +0000 UTC m=+3.047036517" lastFinishedPulling="2026-04-22 14:15:44.955928555 +0000 UTC m=+19.887307436" observedRunningTime="2026-04-22 14:15:45.733073765 +0000 UTC m=+20.664452669" watchObservedRunningTime="2026-04-22 14:15:45.733472652 +0000 UTC m=+20.664851554" Apr 22 14:15:45.774132 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:45.774091 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-24sl8" podStartSLOduration=3.910119007 podStartE2EDuration="20.774078254s" podCreationTimestamp="2026-04-22 14:15:25 +0000 UTC" firstStartedPulling="2026-04-22 14:15:28.124820103 +0000 UTC m=+3.056198985" lastFinishedPulling="2026-04-22 14:15:44.988779341 +0000 UTC m=+19.920158232" observedRunningTime="2026-04-22 14:15:45.773966101 +0000 UTC m=+20.705345006" watchObservedRunningTime="2026-04-22 14:15:45.774078254 +0000 UTC m=+20.705457161" Apr 22 14:15:45.958928 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:45.958899 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-vj7vs" Apr 22 14:15:45.959457 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:45.959434 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-vj7vs" Apr 22 14:15:45.974576 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:45.974510 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-vj7vs" podStartSLOduration=4.141207674 podStartE2EDuration="20.974497232s" podCreationTimestamp="2026-04-22 14:15:25 +0000 UTC" firstStartedPulling="2026-04-22 14:15:28.122334526 +0000 UTC m=+3.053713413" lastFinishedPulling="2026-04-22 14:15:44.955624089 +0000 UTC m=+19.887002971" observedRunningTime="2026-04-22 14:15:45.793181359 +0000 UTC m=+20.724560265" watchObservedRunningTime="2026-04-22 14:15:45.974497232 +0000 UTC m=+20.905876116" Apr 22 14:15:46.609718 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:46.609538 2575 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 14:15:46.697433 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:46.697385 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk" event={"ID":"f8287b99-45a5-4575-bcba-3e21ee5f9ffc","Type":"ContainerStarted","Data":"4cd46fbcccf2868219a803e2c3235c77cca3d74c521caa5bb683f7ddcfce3ff6"} Apr 22 14:15:46.698800 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:46.698771 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-tc7f7" event={"ID":"8c828a22-de6c-4a15-a273-d749ea26c601","Type":"ContainerStarted","Data":"b9eadf3165de0266a14eac93c5ae5c6c4997f280f7c8a56bc3e8bd5550c3e76a"} Apr 22 14:15:46.701430 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:46.701407 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j58wd_3403a015-2d45-42e8-bf6e-9a0bc6d91e99/ovn-acl-logging/0.log" Apr 22 14:15:46.702096 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:46.701849 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" 
event={"ID":"3403a015-2d45-42e8-bf6e-9a0bc6d91e99","Type":"ContainerStarted","Data":"03da28a2f97a5311d401adff255fa163a24ff0db792073013c7130decf1deb22"} Apr 22 14:15:46.714491 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:46.714452 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-tc7f7" podStartSLOduration=4.881117055 podStartE2EDuration="21.714438482s" podCreationTimestamp="2026-04-22 14:15:25 +0000 UTC" firstStartedPulling="2026-04-22 14:15:28.121968337 +0000 UTC m=+3.053347234" lastFinishedPulling="2026-04-22 14:15:44.955289764 +0000 UTC m=+19.886668661" observedRunningTime="2026-04-22 14:15:46.714413603 +0000 UTC m=+21.645792509" watchObservedRunningTime="2026-04-22 14:15:46.714438482 +0000 UTC m=+21.645817387" Apr 22 14:15:47.007225 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:47.007194 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-vj7vs" Apr 22 14:15:47.008064 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:47.008038 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-vj7vs" Apr 22 14:15:47.505339 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:47.505241 2575 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T14:15:46.609714265Z","UUID":"15d9fa86-dc81-4e87-ab2f-3357f38b2b37","Handler":null,"Name":"","Endpoint":""} Apr 22 14:15:47.507232 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:47.507212 2575 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 14:15:47.507232 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:47.507242 2575 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 14:15:47.557208 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:47.557187 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swv2n" Apr 22 14:15:47.557208 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:47.557204 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zvxdk" Apr 22 14:15:47.557396 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:47.557376 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swv2n" podUID="1faf2ada-1177-442f-9ee9-4ecd9697e349" Apr 22 14:15:47.557510 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:47.557482 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zvxdk" podUID="25952960-59a7-4c77-9fc4-71e746c78539" Apr 22 14:15:48.707414 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:48.707378 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk" event={"ID":"f8287b99-45a5-4575-bcba-3e21ee5f9ffc","Type":"ContainerStarted","Data":"d5c179d4270b147ea865d62f3d023a4ac6a373b49abff612dd28396ac12b40e3"} Apr 22 14:15:48.710328 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:48.710308 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j58wd_3403a015-2d45-42e8-bf6e-9a0bc6d91e99/ovn-acl-logging/0.log" Apr 22 14:15:48.710710 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:48.710685 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" event={"ID":"3403a015-2d45-42e8-bf6e-9a0bc6d91e99","Type":"ContainerStarted","Data":"1ea9c1364fa0e1efcce2e9c27c7018d188b954d15cbe924c36ee322858971a45"} Apr 22 14:15:48.728564 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:48.728487 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nwgkk" podStartSLOduration=4.233280553 podStartE2EDuration="23.726077874s" podCreationTimestamp="2026-04-22 14:15:25 +0000 UTC" firstStartedPulling="2026-04-22 14:15:28.123398512 +0000 UTC m=+3.054777408" lastFinishedPulling="2026-04-22 14:15:47.616195848 +0000 UTC m=+22.547574729" observedRunningTime="2026-04-22 14:15:48.725933496 +0000 UTC m=+23.657312401" watchObservedRunningTime="2026-04-22 14:15:48.726077874 +0000 UTC m=+23.657456759" Apr 22 14:15:49.557097 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:49.557070 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zvxdk" Apr 22 14:15:49.557274 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:49.557142 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swv2n" Apr 22 14:15:49.557274 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:49.557247 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swv2n" podUID="1faf2ada-1177-442f-9ee9-4ecd9697e349" Apr 22 14:15:49.557428 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:49.557397 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zvxdk" podUID="25952960-59a7-4c77-9fc4-71e746c78539" Apr 22 14:15:50.715872 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:50.715842 2575 generic.go:358] "Generic (PLEG): container finished" podID="766b2141-267e-41ed-bc88-fc000f360c08" containerID="e0eec76c063b9ca0af7b2173ce310513cc098ff42deef2a7b9debed5c0bdd086" exitCode=0 Apr 22 14:15:50.716377 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:50.715932 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tn9p9" event={"ID":"766b2141-267e-41ed-bc88-fc000f360c08","Type":"ContainerDied","Data":"e0eec76c063b9ca0af7b2173ce310513cc098ff42deef2a7b9debed5c0bdd086"} Apr 22 14:15:50.719013 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:50.718995 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j58wd_3403a015-2d45-42e8-bf6e-9a0bc6d91e99/ovn-acl-logging/0.log" Apr 22 14:15:50.719329 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:50.719313 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" event={"ID":"3403a015-2d45-42e8-bf6e-9a0bc6d91e99","Type":"ContainerStarted","Data":"9654c855e89739f41e9844ab096b2db7300eb60559ce118f8d0f41aef532c042"} Apr 22 14:15:50.719567 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:50.719548 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:50.719694 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:50.719681 2575 scope.go:117] "RemoveContainer" containerID="4107799d2c9cecc52634db96c864c23f642a434fa8e04aa0a8b9b3d7bb3976cf" Apr 22 14:15:50.734333 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:50.734316 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:51.557404 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:51.557379 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swv2n" Apr 22 14:15:51.557501 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:51.557410 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zvxdk" Apr 22 14:15:51.560967 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:51.557808 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swv2n" podUID="1faf2ada-1177-442f-9ee9-4ecd9697e349" Apr 22 14:15:51.560967 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:51.557977 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zvxdk" podUID="25952960-59a7-4c77-9fc4-71e746c78539" Apr 22 14:15:51.723594 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:51.723505 2575 generic.go:358] "Generic (PLEG): container finished" podID="766b2141-267e-41ed-bc88-fc000f360c08" containerID="69f3ef025658e72e7853fdecf66d763eaba6e0e344b863c70cfc7304e584e743" exitCode=0 Apr 22 14:15:51.724116 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:51.723583 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tn9p9" event={"ID":"766b2141-267e-41ed-bc88-fc000f360c08","Type":"ContainerDied","Data":"69f3ef025658e72e7853fdecf66d763eaba6e0e344b863c70cfc7304e584e743"} Apr 22 14:15:51.729865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:51.729828 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j58wd_3403a015-2d45-42e8-bf6e-9a0bc6d91e99/ovn-acl-logging/0.log" Apr 22 14:15:51.730311 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:51.730269 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" event={"ID":"3403a015-2d45-42e8-bf6e-9a0bc6d91e99","Type":"ContainerStarted","Data":"158a4dad4f340de34c98f5942a846492e412302468e55cce1fcfb3dab279a4cc"} Apr 22 14:15:51.730487 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:51.730469 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 14:15:51.730782 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:51.730743 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:51.746215 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:51.746198 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:51.786746 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:51.786596 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" podStartSLOduration=9.853095548 podStartE2EDuration="26.786581738s" podCreationTimestamp="2026-04-22 14:15:25 +0000 UTC" firstStartedPulling="2026-04-22 14:15:28.125879104 +0000 UTC m=+3.057257986" lastFinishedPulling="2026-04-22 14:15:45.059365284 +0000 UTC m=+19.990744176" observedRunningTime="2026-04-22 14:15:51.785085189 +0000 UTC m=+26.716464093" watchObservedRunningTime="2026-04-22 14:15:51.786581738 +0000 UTC m=+26.717960638" Apr 22 14:15:51.979357 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:51.979281 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zvxdk"] Apr 22 14:15:51.979471 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:51.979399 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zvxdk" Apr 22 14:15:51.979536 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:51.979508 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-zvxdk" podUID="25952960-59a7-4c77-9fc4-71e746c78539" Apr 22 14:15:51.982036 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:51.982011 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-swv2n"] Apr 22 14:15:51.982139 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:51.982126 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swv2n" Apr 22 14:15:51.982256 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:51.982235 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swv2n" podUID="1faf2ada-1177-442f-9ee9-4ecd9697e349" Apr 22 14:15:52.733710 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:52.733679 2575 generic.go:358] "Generic (PLEG): container finished" podID="766b2141-267e-41ed-bc88-fc000f360c08" containerID="1cdee1872344ff893638e96250c8aa2243a15d05d0999cf3fbf7e79380dc0467" exitCode=0 Apr 22 14:15:52.734230 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:52.733714 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tn9p9" event={"ID":"766b2141-267e-41ed-bc88-fc000f360c08","Type":"ContainerDied","Data":"1cdee1872344ff893638e96250c8aa2243a15d05d0999cf3fbf7e79380dc0467"} Apr 22 14:15:52.734230 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:52.734015 2575 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 22 14:15:53.020043 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:53.019986 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:15:53.559889 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:53.559863 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zvxdk" Apr 22 14:15:53.560050 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:53.559897 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swv2n" Apr 22 14:15:53.560050 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:53.559964 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zvxdk" podUID="25952960-59a7-4c77-9fc4-71e746c78539" Apr 22 14:15:53.560158 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:53.560102 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swv2n" podUID="1faf2ada-1177-442f-9ee9-4ecd9697e349" Apr 22 14:15:55.557340 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:55.557308 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-swv2n" Apr 22 14:15:55.557896 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:55.557416 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swv2n" podUID="1faf2ada-1177-442f-9ee9-4ecd9697e349" Apr 22 14:15:55.557896 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:55.557462 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zvxdk" Apr 22 14:15:55.557896 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:55.557531 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zvxdk" podUID="25952960-59a7-4c77-9fc4-71e746c78539" Apr 22 14:15:57.557460 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:57.557423 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swv2n" Apr 22 14:15:57.557958 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:57.557424 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zvxdk" Apr 22 14:15:57.557958 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:57.557563 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-swv2n" podUID="1faf2ada-1177-442f-9ee9-4ecd9697e349" Apr 22 14:15:57.557958 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:57.557597 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-zvxdk" podUID="25952960-59a7-4c77-9fc4-71e746c78539" Apr 22 14:15:57.897587 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:57.897419 2575 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-98.ec2.internal" event="NodeReady" Apr 22 14:15:57.897775 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:57.897697 2575 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 22 14:15:57.947642 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:57.947617 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-l4vdv"] Apr 22 14:15:57.979141 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:57.979112 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-hq58r"] Apr 22 14:15:57.979292 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:57.979198 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-l4vdv" Apr 22 14:15:57.982252 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:57.982229 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 22 14:15:57.982252 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:57.982244 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fpnmw\"" Apr 22 14:15:57.982444 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:57.982229 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 22 14:15:57.999900 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:57.999881 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-l4vdv"] Apr 22 14:15:57.999992 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:57.999905 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hq58r"] Apr 22 14:15:58.000049 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:57.999994 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hq58r" Apr 22 14:15:58.003526 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:58.003489 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 22 14:15:58.003526 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:58.003489 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 22 14:15:58.003906 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:58.003886 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 22 14:15:58.004006 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:58.003950 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-slf7l\"" Apr 22 14:15:58.091088 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:58.091060 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-cert\") pod \"ingress-canary-hq58r\" (UID: \"9971a5a9-34ef-4f3c-9183-340e4c5fde1c\") " pod="openshift-ingress-canary/ingress-canary-hq58r" Apr 22 14:15:58.091241 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:58.091109 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b515945c-4a63-4512-9132-79ffc9f58ef0-tmp-dir\") pod \"dns-default-l4vdv\" (UID: \"b515945c-4a63-4512-9132-79ffc9f58ef0\") " pod="openshift-dns/dns-default-l4vdv" Apr 22 14:15:58.091241 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:58.091168 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm9dm\" (UniqueName: \"kubernetes.io/projected/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-kube-api-access-lm9dm\") pod \"ingress-canary-hq58r\" (UID: \"9971a5a9-34ef-4f3c-9183-340e4c5fde1c\") " pod="openshift-ingress-canary/ingress-canary-hq58r" Apr 22 14:15:58.091241 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:58.091193 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/b515945c-4a63-4512-9132-79ffc9f58ef0-config-volume\") pod \"dns-default-l4vdv\" (UID: \"b515945c-4a63-4512-9132-79ffc9f58ef0\") " pod="openshift-dns/dns-default-l4vdv" Apr 22 14:15:58.091241 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:58.091218 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b515945c-4a63-4512-9132-79ffc9f58ef0-metrics-tls\") pod \"dns-default-l4vdv\" (UID: \"b515945c-4a63-4512-9132-79ffc9f58ef0\") " pod="openshift-dns/dns-default-l4vdv" Apr 22 14:15:58.091433 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:58.091262 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwvzl\" (UniqueName: \"kubernetes.io/projected/b515945c-4a63-4512-9132-79ffc9f58ef0-kube-api-access-lwvzl\") pod \"dns-default-l4vdv\" (UID: \"b515945c-4a63-4512-9132-79ffc9f58ef0\") " pod="openshift-dns/dns-default-l4vdv" Apr 22 14:15:58.192401 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:58.192335 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwvzl\" (UniqueName: \"kubernetes.io/projected/b515945c-4a63-4512-9132-79ffc9f58ef0-kube-api-access-lwvzl\") pod \"dns-default-l4vdv\" (UID: \"b515945c-4a63-4512-9132-79ffc9f58ef0\") " pod="openshift-dns/dns-default-l4vdv" Apr 22 14:15:58.192401 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:58.192371 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-cert\") pod \"ingress-canary-hq58r\" (UID: \"9971a5a9-34ef-4f3c-9183-340e4c5fde1c\") " pod="openshift-ingress-canary/ingress-canary-hq58r" Apr 22 14:15:58.192605 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:58.192424 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b515945c-4a63-4512-9132-79ffc9f58ef0-tmp-dir\") pod \"dns-default-l4vdv\" (UID: \"b515945c-4a63-4512-9132-79ffc9f58ef0\") " pod="openshift-dns/dns-default-l4vdv" Apr 22 14:15:58.192605 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:58.192470 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lm9dm\" (UniqueName: \"kubernetes.io/projected/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-kube-api-access-lm9dm\") pod \"ingress-canary-hq58r\" (UID: \"9971a5a9-34ef-4f3c-9183-340e4c5fde1c\") " pod="openshift-ingress-canary/ingress-canary-hq58r" Apr 22 14:15:58.192605 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:58.192497 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b515945c-4a63-4512-9132-79ffc9f58ef0-config-volume\") pod \"dns-default-l4vdv\" (UID: \"b515945c-4a63-4512-9132-79ffc9f58ef0\") " pod="openshift-dns/dns-default-l4vdv" Apr 22 14:15:58.192605 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:58.192517 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:15:58.192605 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:58.192589 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-cert podName:9971a5a9-34ef-4f3c-9183-340e4c5fde1c nodeName:}" failed. 
No retries permitted until 2026-04-22 14:15:58.692568945 +0000 UTC m=+33.623947852 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-cert") pod "ingress-canary-hq58r" (UID: "9971a5a9-34ef-4f3c-9183-340e4c5fde1c") : secret "canary-serving-cert" not found Apr 22 14:15:58.192945 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:58.192632 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:15:58.192945 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:58.192521 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b515945c-4a63-4512-9132-79ffc9f58ef0-metrics-tls\") pod \"dns-default-l4vdv\" (UID: \"b515945c-4a63-4512-9132-79ffc9f58ef0\") " pod="openshift-dns/dns-default-l4vdv" Apr 22 14:15:58.192945 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:58.192694 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b515945c-4a63-4512-9132-79ffc9f58ef0-metrics-tls podName:b515945c-4a63-4512-9132-79ffc9f58ef0 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:58.69267678 +0000 UTC m=+33.624055677 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b515945c-4a63-4512-9132-79ffc9f58ef0-metrics-tls") pod "dns-default-l4vdv" (UID: "b515945c-4a63-4512-9132-79ffc9f58ef0") : secret "dns-default-metrics-tls" not found Apr 22 14:15:58.192945 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:58.192840 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b515945c-4a63-4512-9132-79ffc9f58ef0-tmp-dir\") pod \"dns-default-l4vdv\" (UID: \"b515945c-4a63-4512-9132-79ffc9f58ef0\") " pod="openshift-dns/dns-default-l4vdv" Apr 22 14:15:58.193242 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:58.193221 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b515945c-4a63-4512-9132-79ffc9f58ef0-config-volume\") pod \"dns-default-l4vdv\" (UID: \"b515945c-4a63-4512-9132-79ffc9f58ef0\") " pod="openshift-dns/dns-default-l4vdv" Apr 22 14:15:58.204654 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:58.204622 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwvzl\" (UniqueName: \"kubernetes.io/projected/b515945c-4a63-4512-9132-79ffc9f58ef0-kube-api-access-lwvzl\") pod \"dns-default-l4vdv\" (UID: \"b515945c-4a63-4512-9132-79ffc9f58ef0\") " pod="openshift-dns/dns-default-l4vdv" Apr 22 14:15:58.204808 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:58.204790 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm9dm\" (UniqueName: \"kubernetes.io/projected/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-kube-api-access-lm9dm\") pod \"ingress-canary-hq58r\" (UID: \"9971a5a9-34ef-4f3c-9183-340e4c5fde1c\") " pod="openshift-ingress-canary/ingress-canary-hq58r" Apr 22 14:15:58.696603 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:58.696569 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b515945c-4a63-4512-9132-79ffc9f58ef0-metrics-tls\") pod \"dns-default-l4vdv\" (UID: \"b515945c-4a63-4512-9132-79ffc9f58ef0\") " pod="openshift-dns/dns-default-l4vdv" Apr 22 14:15:58.697200 
ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:58.696700 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:15:58.697200 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:58.696719 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-cert\") pod \"ingress-canary-hq58r\" (UID: \"9971a5a9-34ef-4f3c-9183-340e4c5fde1c\") " pod="openshift-ingress-canary/ingress-canary-hq58r" Apr 22 14:15:58.697200 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:58.696783 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b515945c-4a63-4512-9132-79ffc9f58ef0-metrics-tls podName:b515945c-4a63-4512-9132-79ffc9f58ef0 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:59.696762353 +0000 UTC m=+34.628141252 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b515945c-4a63-4512-9132-79ffc9f58ef0-metrics-tls") pod "dns-default-l4vdv" (UID: "b515945c-4a63-4512-9132-79ffc9f58ef0") : secret "dns-default-metrics-tls" not found Apr 22 14:15:58.697200 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:58.696864 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:15:58.697200 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:58.696937 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-cert podName:9971a5a9-34ef-4f3c-9183-340e4c5fde1c nodeName:}" failed. No retries permitted until 2026-04-22 14:15:59.69692062 +0000 UTC m=+34.628299505 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-cert") pod "ingress-canary-hq58r" (UID: "9971a5a9-34ef-4f3c-9183-340e4c5fde1c") : secret "canary-serving-cert" not found Apr 22 14:15:59.200925 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:59.200890 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs\") pod \"network-metrics-daemon-swv2n\" (UID: \"1faf2ada-1177-442f-9ee9-4ecd9697e349\") " pod="openshift-multus/network-metrics-daemon-swv2n" Apr 22 14:15:59.201108 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:59.201054 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:59.201175 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:59.201125 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs podName:1faf2ada-1177-442f-9ee9-4ecd9697e349 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:31.201110801 +0000 UTC m=+66.132489692 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs") pod "network-metrics-daemon-swv2n" (UID: "1faf2ada-1177-442f-9ee9-4ecd9697e349") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:59.302221 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:59.302183 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5smh\" (UniqueName: \"kubernetes.io/projected/25952960-59a7-4c77-9fc4-71e746c78539-kube-api-access-p5smh\") pod \"network-check-target-zvxdk\" (UID: \"25952960-59a7-4c77-9fc4-71e746c78539\") " pod="openshift-network-diagnostics/network-check-target-zvxdk" Apr 22 14:15:59.302395 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:59.302344 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:59.302395 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:59.302361 2575 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:59.302395 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:59.302371 2575 projected.go:194] Error preparing data for projected volume kube-api-access-p5smh for pod openshift-network-diagnostics/network-check-target-zvxdk: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:59.302562 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:59.302443 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/25952960-59a7-4c77-9fc4-71e746c78539-kube-api-access-p5smh podName:25952960-59a7-4c77-9fc4-71e746c78539 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:31.302424338 +0000 UTC m=+66.233803227 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-p5smh" (UniqueName: "kubernetes.io/projected/25952960-59a7-4c77-9fc4-71e746c78539-kube-api-access-p5smh") pod "network-check-target-zvxdk" (UID: "25952960-59a7-4c77-9fc4-71e746c78539") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:59.557194 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:59.557109 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swv2n" Apr 22 14:15:59.557357 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:59.557314 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zvxdk" Apr 22 14:15:59.562383 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:59.562356 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 14:15:59.562528 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:59.562509 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 14:15:59.562656 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:59.562640 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-r76nx\"" Apr 22 14:15:59.562827 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:59.562805 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 14:15:59.562952 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:59.562938 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-77xtv\"" Apr 22 14:15:59.705651 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:59.705620 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b515945c-4a63-4512-9132-79ffc9f58ef0-metrics-tls\") pod \"dns-default-l4vdv\" (UID: \"b515945c-4a63-4512-9132-79ffc9f58ef0\") " pod="openshift-dns/dns-default-l4vdv" Apr 22 14:15:59.706103 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:15:59.705674 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-cert\") pod \"ingress-canary-hq58r\" (UID: \"9971a5a9-34ef-4f3c-9183-340e4c5fde1c\") " pod="openshift-ingress-canary/ingress-canary-hq58r" Apr 22 14:15:59.706103 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:59.705775 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:15:59.706103 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:59.705829 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-cert podName:9971a5a9-34ef-4f3c-9183-340e4c5fde1c nodeName:}" failed. No retries permitted until 2026-04-22 14:16:01.705814506 +0000 UTC m=+36.637193388 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-cert") pod "ingress-canary-hq58r" (UID: "9971a5a9-34ef-4f3c-9183-340e4c5fde1c") : secret "canary-serving-cert" not found Apr 22 14:15:59.706103 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:59.705974 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:15:59.706103 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:15:59.706052 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b515945c-4a63-4512-9132-79ffc9f58ef0-metrics-tls podName:b515945c-4a63-4512-9132-79ffc9f58ef0 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:01.706034315 +0000 UTC m=+36.637413204 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b515945c-4a63-4512-9132-79ffc9f58ef0-metrics-tls") pod "dns-default-l4vdv" (UID: "b515945c-4a63-4512-9132-79ffc9f58ef0") : secret "dns-default-metrics-tls" not found Apr 22 14:16:00.754308 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:00.754229 2575 generic.go:358] "Generic (PLEG): container finished" podID="766b2141-267e-41ed-bc88-fc000f360c08" containerID="6a6e8ce9d297a957f61f56027a2122b4873aa294e7d2bd99226947e5791c0bf1" exitCode=0 Apr 22 14:16:00.754308 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:00.754271 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tn9p9" event={"ID":"766b2141-267e-41ed-bc88-fc000f360c08","Type":"ContainerDied","Data":"6a6e8ce9d297a957f61f56027a2122b4873aa294e7d2bd99226947e5791c0bf1"} Apr 22 14:16:01.721073 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:01.721045 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b515945c-4a63-4512-9132-79ffc9f58ef0-metrics-tls\") pod \"dns-default-l4vdv\" (UID: \"b515945c-4a63-4512-9132-79ffc9f58ef0\") " pod="openshift-dns/dns-default-l4vdv" Apr 22 14:16:01.721240 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:01.721095 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-cert\") pod \"ingress-canary-hq58r\" (UID: \"9971a5a9-34ef-4f3c-9183-340e4c5fde1c\") " pod="openshift-ingress-canary/ingress-canary-hq58r" Apr 22 14:16:01.721240 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:16:01.721182 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:01.721240 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:16:01.721213 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:01.721240 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:16:01.721240 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b515945c-4a63-4512-9132-79ffc9f58ef0-metrics-tls podName:b515945c-4a63-4512-9132-79ffc9f58ef0 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:05.721224319 +0000 UTC m=+40.652603199 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b515945c-4a63-4512-9132-79ffc9f58ef0-metrics-tls") pod "dns-default-l4vdv" (UID: "b515945c-4a63-4512-9132-79ffc9f58ef0") : secret "dns-default-metrics-tls" not found Apr 22 14:16:01.721405 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:16:01.721263 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-cert podName:9971a5a9-34ef-4f3c-9183-340e4c5fde1c nodeName:}" failed. No retries permitted until 2026-04-22 14:16:05.721246351 +0000 UTC m=+40.652625235 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-cert") pod "ingress-canary-hq58r" (UID: "9971a5a9-34ef-4f3c-9183-340e4c5fde1c") : secret "canary-serving-cert" not found Apr 22 14:16:01.758625 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:01.758558 2575 generic.go:358] "Generic (PLEG): container finished" podID="766b2141-267e-41ed-bc88-fc000f360c08" containerID="5a9a58547694764d61b5fe9f48b8876f2ea90c7dc99ee7be426c60658c655d8e" exitCode=0 Apr 22 14:16:01.758625 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:01.758604 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tn9p9" event={"ID":"766b2141-267e-41ed-bc88-fc000f360c08","Type":"ContainerDied","Data":"5a9a58547694764d61b5fe9f48b8876f2ea90c7dc99ee7be426c60658c655d8e"} Apr 22 14:16:02.763044 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:02.763012 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tn9p9" event={"ID":"766b2141-267e-41ed-bc88-fc000f360c08","Type":"ContainerStarted","Data":"5cb8af75158812af38044f18965c2051d3de436e9765053cb04cfd861b76b7d9"} Apr 22 14:16:02.792549 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:02.792503 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-tn9p9" podStartSLOduration=5.733681742 podStartE2EDuration="37.792489302s" podCreationTimestamp="2026-04-22 14:15:25 +0000 UTC" firstStartedPulling="2026-04-22 14:15:28.119909776 +0000 UTC m=+3.051288656" lastFinishedPulling="2026-04-22 14:16:00.178717322 +0000 UTC m=+35.110096216" observedRunningTime="2026-04-22 14:16:02.791013456 +0000 UTC m=+37.722392355" watchObservedRunningTime="2026-04-22 14:16:02.792489302 +0000 UTC m=+37.723868197" Apr 22 14:16:05.747085 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:05.747042 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b515945c-4a63-4512-9132-79ffc9f58ef0-metrics-tls\") pod \"dns-default-l4vdv\" (UID: \"b515945c-4a63-4512-9132-79ffc9f58ef0\") " pod="openshift-dns/dns-default-l4vdv" Apr 22 14:16:05.747085 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:05.747096 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-cert\") pod \"ingress-canary-hq58r\" (UID: \"9971a5a9-34ef-4f3c-9183-340e4c5fde1c\") " pod="openshift-ingress-canary/ingress-canary-hq58r" Apr 22 14:16:05.747487 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:16:05.747187 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:05.747487 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:16:05.747251 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-cert podName:9971a5a9-34ef-4f3c-9183-340e4c5fde1c nodeName:}" failed. No retries permitted until 2026-04-22 14:16:13.747237032 +0000 UTC m=+48.678615912 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-cert") pod "ingress-canary-hq58r" (UID: "9971a5a9-34ef-4f3c-9183-340e4c5fde1c") : secret "canary-serving-cert" not found Apr 22 14:16:05.747487 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:16:05.747187 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:05.747487 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:16:05.747314 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b515945c-4a63-4512-9132-79ffc9f58ef0-metrics-tls podName:b515945c-4a63-4512-9132-79ffc9f58ef0 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:13.747302886 +0000 UTC m=+48.678681772 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b515945c-4a63-4512-9132-79ffc9f58ef0-metrics-tls") pod "dns-default-l4vdv" (UID: "b515945c-4a63-4512-9132-79ffc9f58ef0") : secret "dns-default-metrics-tls" not found Apr 22 14:16:13.797976 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:13.797941 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-cert\") pod \"ingress-canary-hq58r\" (UID: \"9971a5a9-34ef-4f3c-9183-340e4c5fde1c\") " pod="openshift-ingress-canary/ingress-canary-hq58r" Apr 22 14:16:13.798470 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:13.798000 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b515945c-4a63-4512-9132-79ffc9f58ef0-metrics-tls\") pod \"dns-default-l4vdv\" (UID: \"b515945c-4a63-4512-9132-79ffc9f58ef0\") " pod="openshift-dns/dns-default-l4vdv" Apr 22 14:16:13.798470 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:16:13.798085 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:13.798470 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:16:13.798090 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:13.798470 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:16:13.798142 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b515945c-4a63-4512-9132-79ffc9f58ef0-metrics-tls podName:b515945c-4a63-4512-9132-79ffc9f58ef0 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:29.798127274 +0000 UTC m=+64.729506154 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b515945c-4a63-4512-9132-79ffc9f58ef0-metrics-tls") pod "dns-default-l4vdv" (UID: "b515945c-4a63-4512-9132-79ffc9f58ef0") : secret "dns-default-metrics-tls" not found Apr 22 14:16:13.798470 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:16:13.798167 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-cert podName:9971a5a9-34ef-4f3c-9183-340e4c5fde1c nodeName:}" failed. No retries permitted until 2026-04-22 14:16:29.798147637 +0000 UTC m=+64.729526519 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-cert") pod "ingress-canary-hq58r" (UID: "9971a5a9-34ef-4f3c-9183-340e4c5fde1c") : secret "canary-serving-cert" not found Apr 22 14:16:24.747357 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:24.747330 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-j58wd" Apr 22 14:16:29.896191 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:29.896155 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b515945c-4a63-4512-9132-79ffc9f58ef0-metrics-tls\") pod \"dns-default-l4vdv\" (UID: \"b515945c-4a63-4512-9132-79ffc9f58ef0\") " pod="openshift-dns/dns-default-l4vdv" Apr 22 14:16:29.896191 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:29.896199 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-cert\") pod \"ingress-canary-hq58r\" (UID: \"9971a5a9-34ef-4f3c-9183-340e4c5fde1c\") " pod="openshift-ingress-canary/ingress-canary-hq58r" Apr 22 14:16:29.896626 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:16:29.896298 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:29.896626 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:16:29.896304 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:29.896626 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:16:29.896349 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-cert podName:9971a5a9-34ef-4f3c-9183-340e4c5fde1c nodeName:}" failed. No retries permitted until 2026-04-22 14:17:01.896335096 +0000 UTC m=+96.827713976 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-cert") pod "ingress-canary-hq58r" (UID: "9971a5a9-34ef-4f3c-9183-340e4c5fde1c") : secret "canary-serving-cert" not found Apr 22 14:16:29.896626 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:16:29.896369 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b515945c-4a63-4512-9132-79ffc9f58ef0-metrics-tls podName:b515945c-4a63-4512-9132-79ffc9f58ef0 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:01.896353909 +0000 UTC m=+96.827732789 (durationBeforeRetry 32s). 
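The cert and metrics-tls mounts keep failing simply because the canary-serving-cert and dns-default-metrics-tls secrets have not been created yet, and each failure pushes durationBeforeRetry out further: 500ms, 1s, 2s, 4s, 8s, 16s, and now 32s in the lines above (the metrics-certs volume for network-metrics-daemon-swv2n is already at 1m4s). The sketch below reproduces that doubling cadence; the starting value is taken from the log, while the cap is an assumption for illustration rather than the kubelet's actual constant.

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Reproduces the durationBeforeRetry progression visible above.
	wait := 500 * time.Millisecond // first retry delay, as seen in the log
	maxWait := 2 * time.Minute     // assumed cap, for illustration only
	for attempt := 1; attempt <= 9; attempt++ {
		fmt.Printf("failed attempt %d -> next retry in %v\n", attempt, wait)
		wait *= 2
		if wait > maxWait {
			wait = maxWait
		}
	}
}
```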
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b515945c-4a63-4512-9132-79ffc9f58ef0-metrics-tls") pod "dns-default-l4vdv" (UID: "b515945c-4a63-4512-9132-79ffc9f58ef0") : secret "dns-default-metrics-tls" not found Apr 22 14:16:31.204892 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:31.204856 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs\") pod \"network-metrics-daemon-swv2n\" (UID: \"1faf2ada-1177-442f-9ee9-4ecd9697e349\") " pod="openshift-multus/network-metrics-daemon-swv2n" Apr 22 14:16:31.208070 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:31.208048 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 14:16:31.215629 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:16:31.215602 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 14:16:31.215696 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:16:31.215683 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs podName:1faf2ada-1177-442f-9ee9-4ecd9697e349 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:35.215662435 +0000 UTC m=+130.147041316 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs") pod "network-metrics-daemon-swv2n" (UID: "1faf2ada-1177-442f-9ee9-4ecd9697e349") : secret "metrics-daemon-secret" not found Apr 22 14:16:31.305831 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:31.305805 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p5smh\" (UniqueName: \"kubernetes.io/projected/25952960-59a7-4c77-9fc4-71e746c78539-kube-api-access-p5smh\") pod \"network-check-target-zvxdk\" (UID: \"25952960-59a7-4c77-9fc4-71e746c78539\") " pod="openshift-network-diagnostics/network-check-target-zvxdk" Apr 22 14:16:31.308936 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:31.308918 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 14:16:31.319297 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:31.319276 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 14:16:31.329684 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:31.329656 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5smh\" (UniqueName: \"kubernetes.io/projected/25952960-59a7-4c77-9fc4-71e746c78539-kube-api-access-p5smh\") pod \"network-check-target-zvxdk\" (UID: \"25952960-59a7-4c77-9fc4-71e746c78539\") " pod="openshift-network-diagnostics/network-check-target-zvxdk" Apr 22 14:16:31.379709 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:31.379681 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-77xtv\"" Apr 22 14:16:31.387242 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:31.387226 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-zvxdk" Apr 22 14:16:31.514881 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:31.514854 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-zvxdk"] Apr 22 14:16:31.517740 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:16:31.517711 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25952960_59a7_4c77_9fc4_71e746c78539.slice/crio-8fb905c348c5f2388284f01ac0009ef53066041ce6f847b16aebf101b3c9a723 WatchSource:0}: Error finding container 8fb905c348c5f2388284f01ac0009ef53066041ce6f847b16aebf101b3c9a723: Status 404 returned error can't find the container with id 8fb905c348c5f2388284f01ac0009ef53066041ce6f847b16aebf101b3c9a723 Apr 22 14:16:31.816851 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:31.816782 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zvxdk" event={"ID":"25952960-59a7-4c77-9fc4-71e746c78539","Type":"ContainerStarted","Data":"8fb905c348c5f2388284f01ac0009ef53066041ce6f847b16aebf101b3c9a723"} Apr 22 14:16:34.822534 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:34.822500 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-zvxdk" event={"ID":"25952960-59a7-4c77-9fc4-71e746c78539","Type":"ContainerStarted","Data":"e6f1239d15c120d7195d66326c84ef0916e340e8e3d29b707ea035c90134dd15"} Apr 22 14:16:34.822974 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:34.822602 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-zvxdk" Apr 22 14:16:34.840668 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:16:34.840626 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-zvxdk" podStartSLOduration=66.749894104 podStartE2EDuration="1m9.84061185s" podCreationTimestamp="2026-04-22 14:15:25 +0000 UTC" firstStartedPulling="2026-04-22 14:16:31.519598321 +0000 UTC m=+66.450977203" lastFinishedPulling="2026-04-22 14:16:34.610316064 +0000 UTC m=+69.541694949" observedRunningTime="2026-04-22 14:16:34.840160733 +0000 UTC m=+69.771539637" watchObservedRunningTime="2026-04-22 14:16:34.84061185 +0000 UTC m=+69.771990753" Apr 22 14:17:01.900547 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:01.900524 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b515945c-4a63-4512-9132-79ffc9f58ef0-metrics-tls\") pod \"dns-default-l4vdv\" (UID: \"b515945c-4a63-4512-9132-79ffc9f58ef0\") " pod="openshift-dns/dns-default-l4vdv" Apr 22 14:17:01.900960 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:01.900565 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-cert\") pod \"ingress-canary-hq58r\" (UID: \"9971a5a9-34ef-4f3c-9183-340e4c5fde1c\") " pod="openshift-ingress-canary/ingress-canary-hq58r" Apr 22 14:17:01.900960 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:01.900651 2575 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:17:01.900960 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:01.900655 2575 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: 
secret "dns-default-metrics-tls" not found Apr 22 14:17:01.900960 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:01.900710 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-cert podName:9971a5a9-34ef-4f3c-9183-340e4c5fde1c nodeName:}" failed. No retries permitted until 2026-04-22 14:18:05.900693763 +0000 UTC m=+160.832072661 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-cert") pod "ingress-canary-hq58r" (UID: "9971a5a9-34ef-4f3c-9183-340e4c5fde1c") : secret "canary-serving-cert" not found Apr 22 14:17:01.900960 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:01.900722 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b515945c-4a63-4512-9132-79ffc9f58ef0-metrics-tls podName:b515945c-4a63-4512-9132-79ffc9f58ef0 nodeName:}" failed. No retries permitted until 2026-04-22 14:18:05.900717011 +0000 UTC m=+160.832095891 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b515945c-4a63-4512-9132-79ffc9f58ef0-metrics-tls") pod "dns-default-l4vdv" (UID: "b515945c-4a63-4512-9132-79ffc9f58ef0") : secret "dns-default-metrics-tls" not found Apr 22 14:17:05.827573 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:05.827545 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-zvxdk" Apr 22 14:17:35.309764 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:35.309703 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs\") pod \"network-metrics-daemon-swv2n\" (UID: \"1faf2ada-1177-442f-9ee9-4ecd9697e349\") " pod="openshift-multus/network-metrics-daemon-swv2n" Apr 22 14:17:35.310360 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:35.309875 2575 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 14:17:35.310360 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:35.309971 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs podName:1faf2ada-1177-442f-9ee9-4ecd9697e349 nodeName:}" failed. No retries permitted until 2026-04-22 14:19:37.3099501 +0000 UTC m=+252.241328985 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs") pod "network-metrics-daemon-swv2n" (UID: "1faf2ada-1177-442f-9ee9-4ecd9697e349") : secret "metrics-daemon-secret" not found Apr 22 14:17:41.117416 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.117361 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9zbk9"] Apr 22 14:17:41.120174 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.120151 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9zbk9" Apr 22 14:17:41.120560 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.120528 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gcj4r"] Apr 22 14:17:41.123019 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.123002 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 22 14:17:41.123131 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.123116 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gcj4r" Apr 22 14:17:41.123883 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.123864 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 22 14:17:41.123985 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.123867 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 22 14:17:41.124341 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.124326 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-tfbkl\"" Apr 22 14:17:41.124419 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.124344 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 22 14:17:41.128347 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.128329 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\"" Apr 22 14:17:41.128347 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.128341 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\"" Apr 22 14:17:41.128530 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.128399 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\"" Apr 22 14:17:41.129048 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.129031 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-zcnkf\"" Apr 22 14:17:41.129556 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.129535 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9zbk9"] Apr 22 14:17:41.137785 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.137765 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gcj4r"] Apr 22 14:17:41.224478 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.224459 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lsmjj"] Apr 22 14:17:41.227189 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.227176 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lsmjj" Apr 22 14:17:41.229929 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.229902 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 22 14:17:41.230032 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.229993 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 22 14:17:41.230215 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.230193 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-69kbb\"" Apr 22 14:17:41.240505 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.239654 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lsmjj"] Apr 22 14:17:41.245398 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.245379 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4c9f80e-9e6b-4087-b460-c87423e02659-serving-cert\") pod \"service-ca-operator-d6fc45fc5-9zbk9\" (UID: \"f4c9f80e-9e6b-4087-b460-c87423e02659\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9zbk9" Apr 22 14:17:41.245480 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.245405 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5t22t\" (UniqueName: \"kubernetes.io/projected/f4c9f80e-9e6b-4087-b460-c87423e02659-kube-api-access-5t22t\") pod \"service-ca-operator-d6fc45fc5-9zbk9\" (UID: \"f4c9f80e-9e6b-4087-b460-c87423e02659\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9zbk9" Apr 22 14:17:41.245480 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.245438 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd3338c3-d102-49e4-905b-c457dea46629-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gcj4r\" (UID: \"bd3338c3-d102-49e4-905b-c457dea46629\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gcj4r" Apr 22 14:17:41.245480 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.245453 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp9nz\" (UniqueName: \"kubernetes.io/projected/bd3338c3-d102-49e4-905b-c457dea46629-kube-api-access-kp9nz\") pod \"cluster-samples-operator-6dc5bdb6b4-gcj4r\" (UID: \"bd3338c3-d102-49e4-905b-c457dea46629\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gcj4r" Apr 22 14:17:41.245578 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.245491 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4c9f80e-9e6b-4087-b460-c87423e02659-config\") pod \"service-ca-operator-d6fc45fc5-9zbk9\" (UID: \"f4c9f80e-9e6b-4087-b460-c87423e02659\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9zbk9" Apr 22 14:17:41.324926 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.324904 2575 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4vkql"] Apr 22 14:17:41.327548 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.327535 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4vkql" Apr 22 14:17:41.330659 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.330641 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-h2dms\"" Apr 22 14:17:41.336849 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.336829 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4vkql"] Apr 22 14:17:41.346588 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.346570 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qjgw6\" (UniqueName: \"kubernetes.io/projected/fa3ac17e-df93-4459-ba78-9173887dc2e3-kube-api-access-qjgw6\") pod \"volume-data-source-validator-7c6cbb6c87-lsmjj\" (UID: \"fa3ac17e-df93-4459-ba78-9173887dc2e3\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lsmjj" Apr 22 14:17:41.346693 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.346612 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4c9f80e-9e6b-4087-b460-c87423e02659-serving-cert\") pod \"service-ca-operator-d6fc45fc5-9zbk9\" (UID: \"f4c9f80e-9e6b-4087-b460-c87423e02659\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9zbk9" Apr 22 14:17:41.346693 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.346644 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5t22t\" (UniqueName: \"kubernetes.io/projected/f4c9f80e-9e6b-4087-b460-c87423e02659-kube-api-access-5t22t\") pod \"service-ca-operator-d6fc45fc5-9zbk9\" (UID: \"f4c9f80e-9e6b-4087-b460-c87423e02659\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9zbk9" Apr 22 14:17:41.346827 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.346698 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd3338c3-d102-49e4-905b-c457dea46629-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gcj4r\" (UID: \"bd3338c3-d102-49e4-905b-c457dea46629\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gcj4r" Apr 22 14:17:41.346827 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.346722 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kp9nz\" (UniqueName: \"kubernetes.io/projected/bd3338c3-d102-49e4-905b-c457dea46629-kube-api-access-kp9nz\") pod \"cluster-samples-operator-6dc5bdb6b4-gcj4r\" (UID: \"bd3338c3-d102-49e4-905b-c457dea46629\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gcj4r" Apr 22 14:17:41.346827 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.346761 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4c9f80e-9e6b-4087-b460-c87423e02659-config\") pod \"service-ca-operator-d6fc45fc5-9zbk9\" (UID: \"f4c9f80e-9e6b-4087-b460-c87423e02659\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9zbk9" Apr 22 14:17:41.346827 ip-10-0-130-98 
kubenswrapper[2575]: E0422 14:17:41.346809 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 14:17:41.347022 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:41.346873 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd3338c3-d102-49e4-905b-c457dea46629-samples-operator-tls podName:bd3338c3-d102-49e4-905b-c457dea46629 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:41.846854287 +0000 UTC m=+136.778233187 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/bd3338c3-d102-49e4-905b-c457dea46629-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gcj4r" (UID: "bd3338c3-d102-49e4-905b-c457dea46629") : secret "samples-operator-tls" not found Apr 22 14:17:41.347242 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.347222 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4c9f80e-9e6b-4087-b460-c87423e02659-config\") pod \"service-ca-operator-d6fc45fc5-9zbk9\" (UID: \"f4c9f80e-9e6b-4087-b460-c87423e02659\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9zbk9" Apr 22 14:17:41.348779 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.348761 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f4c9f80e-9e6b-4087-b460-c87423e02659-serving-cert\") pod \"service-ca-operator-d6fc45fc5-9zbk9\" (UID: \"f4c9f80e-9e6b-4087-b460-c87423e02659\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9zbk9" Apr 22 14:17:41.357073 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.357043 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp9nz\" (UniqueName: \"kubernetes.io/projected/bd3338c3-d102-49e4-905b-c457dea46629-kube-api-access-kp9nz\") pod \"cluster-samples-operator-6dc5bdb6b4-gcj4r\" (UID: \"bd3338c3-d102-49e4-905b-c457dea46629\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gcj4r" Apr 22 14:17:41.357444 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.357423 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5t22t\" (UniqueName: \"kubernetes.io/projected/f4c9f80e-9e6b-4087-b460-c87423e02659-kube-api-access-5t22t\") pod \"service-ca-operator-d6fc45fc5-9zbk9\" (UID: \"f4c9f80e-9e6b-4087-b460-c87423e02659\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9zbk9" Apr 22 14:17:41.423906 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.423884 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2lphb"] Apr 22 14:17:41.426727 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.426713 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-2lphb" Apr 22 14:17:41.429266 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.429252 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 22 14:17:41.429343 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.429279 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 22 14:17:41.429502 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.429484 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" Apr 22 14:17:41.429556 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.429535 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-kkh8q\"" Apr 22 14:17:41.429762 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.429735 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 22 14:17:41.430041 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.430025 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9zbk9" Apr 22 14:17:41.431136 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.431115 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6b48cf5d46-hdtb2"] Apr 22 14:17:41.434006 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.433976 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:41.436471 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.436451 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 14:17:41.436781 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.436760 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 14:17:41.436781 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.436770 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 14:17:41.436898 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.436807 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 22 14:17:41.436898 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.436810 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-xbchn\"" Apr 22 14:17:41.440338 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.440268 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2lphb"] Apr 22 14:17:41.441984 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.441962 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 14:17:41.447560 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.447540 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qjgw6\" (UniqueName: 
\"kubernetes.io/projected/fa3ac17e-df93-4459-ba78-9173887dc2e3-kube-api-access-qjgw6\") pod \"volume-data-source-validator-7c6cbb6c87-lsmjj\" (UID: \"fa3ac17e-df93-4459-ba78-9173887dc2e3\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lsmjj" Apr 22 14:17:41.447653 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.447639 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8c9k\" (UniqueName: \"kubernetes.io/projected/fc27862a-fe8f-4d00-8591-85e1878bef5a-kube-api-access-w8c9k\") pod \"network-check-source-8894fc9bd-4vkql\" (UID: \"fc27862a-fe8f-4d00-8591-85e1878bef5a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4vkql" Apr 22 14:17:41.450076 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.450057 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6b48cf5d46-hdtb2"] Apr 22 14:17:41.458888 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.458871 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qjgw6\" (UniqueName: \"kubernetes.io/projected/fa3ac17e-df93-4459-ba78-9173887dc2e3-kube-api-access-qjgw6\") pod \"volume-data-source-validator-7c6cbb6c87-lsmjj\" (UID: \"fa3ac17e-df93-4459-ba78-9173887dc2e3\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lsmjj" Apr 22 14:17:41.535217 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.535190 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lsmjj" Apr 22 14:17:41.540501 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.540473 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9zbk9"] Apr 22 14:17:41.544620 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:17:41.544559 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4c9f80e_9e6b_4087_b460_c87423e02659.slice/crio-aaec0635f56c1fb98f0b74d5cdc2568d92ed3b2eb9ac239d0043c790a07dbbbd WatchSource:0}: Error finding container aaec0635f56c1fb98f0b74d5cdc2568d92ed3b2eb9ac239d0043c790a07dbbbd: Status 404 returned error can't find the container with id aaec0635f56c1fb98f0b74d5cdc2568d92ed3b2eb9ac239d0043c790a07dbbbd Apr 22 14:17:41.547965 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.547944 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c1963949-20fe-4269-990d-7aa35e49f968-ca-trust-extracted\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:41.548050 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.547989 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c1963949-20fe-4269-990d-7aa35e49f968-image-registry-private-configuration\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:41.548050 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.548018 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c1963949-20fe-4269-990d-7aa35e49f968-installation-pull-secrets\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:41.548050 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.548041 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6dzh4\" (UniqueName: \"kubernetes.io/projected/ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa-kube-api-access-6dzh4\") pod \"console-operator-9d4b6777b-2lphb\" (UID: \"ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa\") " pod="openshift-console-operator/console-operator-9d4b6777b-2lphb" Apr 22 14:17:41.548221 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.548083 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-registry-tls\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:41.548221 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.548104 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c1963949-20fe-4269-990d-7aa35e49f968-registry-certificates\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:41.548221 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.548128 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-bound-sa-token\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:41.548221 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.548148 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2kkj\" (UniqueName: \"kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-kube-api-access-n2kkj\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:41.548221 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.548163 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa-serving-cert\") pod \"console-operator-9d4b6777b-2lphb\" (UID: \"ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa\") " pod="openshift-console-operator/console-operator-9d4b6777b-2lphb" Apr 22 14:17:41.548221 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.548179 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa-trusted-ca\") pod \"console-operator-9d4b6777b-2lphb\" (UID: \"ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa\") " pod="openshift-console-operator/console-operator-9d4b6777b-2lphb" Apr 22 14:17:41.548221 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.548210 2575 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1963949-20fe-4269-990d-7aa35e49f968-trusted-ca\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:41.548565 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.548345 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w8c9k\" (UniqueName: \"kubernetes.io/projected/fc27862a-fe8f-4d00-8591-85e1878bef5a-kube-api-access-w8c9k\") pod \"network-check-source-8894fc9bd-4vkql\" (UID: \"fc27862a-fe8f-4d00-8591-85e1878bef5a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4vkql" Apr 22 14:17:41.548565 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.548373 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa-config\") pod \"console-operator-9d4b6777b-2lphb\" (UID: \"ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa\") " pod="openshift-console-operator/console-operator-9d4b6777b-2lphb" Apr 22 14:17:41.558715 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.558681 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8c9k\" (UniqueName: \"kubernetes.io/projected/fc27862a-fe8f-4d00-8591-85e1878bef5a-kube-api-access-w8c9k\") pod \"network-check-source-8894fc9bd-4vkql\" (UID: \"fc27862a-fe8f-4d00-8591-85e1878bef5a\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4vkql" Apr 22 14:17:41.635159 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.635134 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4vkql" Apr 22 14:17:41.645211 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.645127 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lsmjj"] Apr 22 14:17:41.649255 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.649233 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c1963949-20fe-4269-990d-7aa35e49f968-ca-trust-extracted\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:41.649321 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.649299 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c1963949-20fe-4269-990d-7aa35e49f968-image-registry-private-configuration\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:41.649359 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.649330 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c1963949-20fe-4269-990d-7aa35e49f968-installation-pull-secrets\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:41.649415 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.649358 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6dzh4\" (UniqueName: \"kubernetes.io/projected/ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa-kube-api-access-6dzh4\") pod \"console-operator-9d4b6777b-2lphb\" (UID: \"ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa\") " pod="openshift-console-operator/console-operator-9d4b6777b-2lphb" Apr 22 14:17:41.649415 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.649391 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-registry-tls\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:41.649507 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.649415 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c1963949-20fe-4269-990d-7aa35e49f968-registry-certificates\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:41.649507 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.649440 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-bound-sa-token\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:41.649507 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.649465 2575 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-n2kkj\" (UniqueName: \"kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-kube-api-access-n2kkj\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:41.649507 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.649489 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa-serving-cert\") pod \"console-operator-9d4b6777b-2lphb\" (UID: \"ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa\") " pod="openshift-console-operator/console-operator-9d4b6777b-2lphb" Apr 22 14:17:41.649695 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.649511 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa-trusted-ca\") pod \"console-operator-9d4b6777b-2lphb\" (UID: \"ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa\") " pod="openshift-console-operator/console-operator-9d4b6777b-2lphb" Apr 22 14:17:41.649695 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.649559 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1963949-20fe-4269-990d-7aa35e49f968-trusted-ca\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:41.649695 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.649637 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa-config\") pod \"console-operator-9d4b6777b-2lphb\" (UID: \"ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa\") " pod="openshift-console-operator/console-operator-9d4b6777b-2lphb" Apr 22 14:17:41.649875 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.649770 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c1963949-20fe-4269-990d-7aa35e49f968-ca-trust-extracted\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:41.649928 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:41.649874 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:17:41.649928 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:41.649889 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b48cf5d46-hdtb2: secret "image-registry-tls" not found Apr 22 14:17:41.650026 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:41.649971 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-registry-tls podName:c1963949-20fe-4269-990d-7aa35e49f968 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:42.149935934 +0000 UTC m=+137.081314829 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-registry-tls") pod "image-registry-6b48cf5d46-hdtb2" (UID: "c1963949-20fe-4269-990d-7aa35e49f968") : secret "image-registry-tls" not found Apr 22 14:17:41.650361 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.650339 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c1963949-20fe-4269-990d-7aa35e49f968-registry-certificates\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:41.650460 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.650385 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa-config\") pod \"console-operator-9d4b6777b-2lphb\" (UID: \"ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa\") " pod="openshift-console-operator/console-operator-9d4b6777b-2lphb" Apr 22 14:17:41.651162 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.651131 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa-trusted-ca\") pod \"console-operator-9d4b6777b-2lphb\" (UID: \"ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa\") " pod="openshift-console-operator/console-operator-9d4b6777b-2lphb" Apr 22 14:17:41.651958 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.651938 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1963949-20fe-4269-990d-7aa35e49f968-trusted-ca\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:41.652416 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.652398 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c1963949-20fe-4269-990d-7aa35e49f968-image-registry-private-configuration\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:41.652719 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.652656 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c1963949-20fe-4269-990d-7aa35e49f968-installation-pull-secrets\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:41.653957 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.653860 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa-serving-cert\") pod \"console-operator-9d4b6777b-2lphb\" (UID: \"ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa\") " pod="openshift-console-operator/console-operator-9d4b6777b-2lphb" Apr 22 14:17:41.658359 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.658338 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-bound-sa-token\") pod 
\"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:41.658630 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.658607 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2kkj\" (UniqueName: \"kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-kube-api-access-n2kkj\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:41.658714 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.658635 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6dzh4\" (UniqueName: \"kubernetes.io/projected/ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa-kube-api-access-6dzh4\") pod \"console-operator-9d4b6777b-2lphb\" (UID: \"ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa\") " pod="openshift-console-operator/console-operator-9d4b6777b-2lphb" Apr 22 14:17:41.735642 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.735612 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-2lphb" Apr 22 14:17:41.744590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.744562 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-4vkql"] Apr 22 14:17:41.747113 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:17:41.747082 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc27862a_fe8f_4d00_8591_85e1878bef5a.slice/crio-09c1ca6c766ff7e2ac1003c7f325bb9c824853f895cae1ce8b04ab73fd06da63 WatchSource:0}: Error finding container 09c1ca6c766ff7e2ac1003c7f325bb9c824853f895cae1ce8b04ab73fd06da63: Status 404 returned error can't find the container with id 09c1ca6c766ff7e2ac1003c7f325bb9c824853f895cae1ce8b04ab73fd06da63 Apr 22 14:17:41.852365 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.851827 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd3338c3-d102-49e4-905b-c457dea46629-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gcj4r\" (UID: \"bd3338c3-d102-49e4-905b-c457dea46629\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gcj4r" Apr 22 14:17:41.852365 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:41.852005 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 14:17:41.852365 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:41.852068 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd3338c3-d102-49e4-905b-c457dea46629-samples-operator-tls podName:bd3338c3-d102-49e4-905b-c457dea46629 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:42.852049551 +0000 UTC m=+137.783428433 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/bd3338c3-d102-49e4-905b-c457dea46629-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gcj4r" (UID: "bd3338c3-d102-49e4-905b-c457dea46629") : secret "samples-operator-tls" not found Apr 22 14:17:41.853220 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.853195 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-2lphb"] Apr 22 14:17:41.856418 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:17:41.856393 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podffe149ac_1ad0_48e9_9e0c_461c55ebc4fa.slice/crio-70958e859c47965ec9710d7242f844014c8f7c3d0ebecbeb89f6b25e7d1dd64c WatchSource:0}: Error finding container 70958e859c47965ec9710d7242f844014c8f7c3d0ebecbeb89f6b25e7d1dd64c: Status 404 returned error can't find the container with id 70958e859c47965ec9710d7242f844014c8f7c3d0ebecbeb89f6b25e7d1dd64c Apr 22 14:17:41.952142 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.952075 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9zbk9" event={"ID":"f4c9f80e-9e6b-4087-b460-c87423e02659","Type":"ContainerStarted","Data":"aaec0635f56c1fb98f0b74d5cdc2568d92ed3b2eb9ac239d0043c790a07dbbbd"} Apr 22 14:17:41.953681 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.953657 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4vkql" event={"ID":"fc27862a-fe8f-4d00-8591-85e1878bef5a","Type":"ContainerStarted","Data":"1e0bcc6d0e8a8bf7733df8a23f38d5878db403955ecef27a8995cb8db080cfb6"} Apr 22 14:17:41.953801 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.953690 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4vkql" event={"ID":"fc27862a-fe8f-4d00-8591-85e1878bef5a","Type":"ContainerStarted","Data":"09c1ca6c766ff7e2ac1003c7f325bb9c824853f895cae1ce8b04ab73fd06da63"} Apr 22 14:17:41.955156 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.955103 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2lphb" event={"ID":"ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa","Type":"ContainerStarted","Data":"70958e859c47965ec9710d7242f844014c8f7c3d0ebecbeb89f6b25e7d1dd64c"} Apr 22 14:17:41.956249 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.956224 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lsmjj" event={"ID":"fa3ac17e-df93-4459-ba78-9173887dc2e3","Type":"ContainerStarted","Data":"66a6394f37a6c4a14c6d3fb3eaf987c62fbbb25d71ee6e2ca9353ba4518616dd"} Apr 22 14:17:41.972511 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:41.972474 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-4vkql" podStartSLOduration=0.972464159 podStartE2EDuration="972.464159ms" podCreationTimestamp="2026-04-22 14:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:17:41.97212402 +0000 UTC m=+136.903502925" watchObservedRunningTime="2026-04-22 14:17:41.972464159 +0000 UTC m=+136.903843059" Apr 22 14:17:42.153508 ip-10-0-130-98 kubenswrapper[2575]: I0422 
14:17:42.153447 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-registry-tls\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:42.153945 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:42.153592 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:17:42.153945 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:42.153611 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b48cf5d46-hdtb2: secret "image-registry-tls" not found Apr 22 14:17:42.153945 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:42.153675 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-registry-tls podName:c1963949-20fe-4269-990d-7aa35e49f968 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:43.153657335 +0000 UTC m=+138.085036234 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-registry-tls") pod "image-registry-6b48cf5d46-hdtb2" (UID: "c1963949-20fe-4269-990d-7aa35e49f968") : secret "image-registry-tls" not found Apr 22 14:17:42.860000 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:42.859963 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd3338c3-d102-49e4-905b-c457dea46629-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gcj4r\" (UID: \"bd3338c3-d102-49e4-905b-c457dea46629\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gcj4r" Apr 22 14:17:42.860229 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:42.860134 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 14:17:42.860229 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:42.860217 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd3338c3-d102-49e4-905b-c457dea46629-samples-operator-tls podName:bd3338c3-d102-49e4-905b-c457dea46629 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:44.860195943 +0000 UTC m=+139.791574833 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/bd3338c3-d102-49e4-905b-c457dea46629-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gcj4r" (UID: "bd3338c3-d102-49e4-905b-c457dea46629") : secret "samples-operator-tls" not found Apr 22 14:17:43.162379 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:43.162343 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-registry-tls\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:43.162830 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:43.162510 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:17:43.162830 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:43.162530 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b48cf5d46-hdtb2: secret "image-registry-tls" not found Apr 22 14:17:43.162830 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:43.162594 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-registry-tls podName:c1963949-20fe-4269-990d-7aa35e49f968 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:45.162575961 +0000 UTC m=+140.093954851 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-registry-tls") pod "image-registry-6b48cf5d46-hdtb2" (UID: "c1963949-20fe-4269-990d-7aa35e49f968") : secret "image-registry-tls" not found Apr 22 14:17:44.874728 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:44.874692 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd3338c3-d102-49e4-905b-c457dea46629-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gcj4r\" (UID: \"bd3338c3-d102-49e4-905b-c457dea46629\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gcj4r" Apr 22 14:17:44.875104 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:44.874844 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 14:17:44.875104 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:44.874907 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd3338c3-d102-49e4-905b-c457dea46629-samples-operator-tls podName:bd3338c3-d102-49e4-905b-c457dea46629 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:48.874890969 +0000 UTC m=+143.806269855 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/bd3338c3-d102-49e4-905b-c457dea46629-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gcj4r" (UID: "bd3338c3-d102-49e4-905b-c457dea46629") : secret "samples-operator-tls" not found Apr 22 14:17:44.967346 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:44.967322 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2lphb_ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa/console-operator/0.log" Apr 22 14:17:44.967478 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:44.967357 2575 generic.go:358] "Generic (PLEG): container finished" podID="ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa" containerID="c1e0c86ba09499c771c824925c4b28c26cede028df049915f533f3cbf9cd3207" exitCode=255 Apr 22 14:17:44.967478 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:44.967415 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2lphb" event={"ID":"ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa","Type":"ContainerDied","Data":"c1e0c86ba09499c771c824925c4b28c26cede028df049915f533f3cbf9cd3207"} Apr 22 14:17:44.967622 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:44.967605 2575 scope.go:117] "RemoveContainer" containerID="c1e0c86ba09499c771c824925c4b28c26cede028df049915f533f3cbf9cd3207" Apr 22 14:17:44.968814 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:44.968795 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lsmjj" event={"ID":"fa3ac17e-df93-4459-ba78-9173887dc2e3","Type":"ContainerStarted","Data":"f17568db5bd2e02a7abd1dba2e25c9ce7eae44d506b81d15311919b091bfd51d"} Apr 22 14:17:44.969992 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:44.969969 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9zbk9" event={"ID":"f4c9f80e-9e6b-4087-b460-c87423e02659","Type":"ContainerStarted","Data":"844bf22e6f996cc0b45d27b384fdd248ce977901a30f8131adf925fdba09f609"} Apr 22 14:17:45.091670 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:45.091631 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9zbk9" podStartSLOduration=1.617862429 podStartE2EDuration="4.091618358s" podCreationTimestamp="2026-04-22 14:17:41 +0000 UTC" firstStartedPulling="2026-04-22 14:17:41.546485352 +0000 UTC m=+136.477864234" lastFinishedPulling="2026-04-22 14:17:44.020241274 +0000 UTC m=+138.951620163" observedRunningTime="2026-04-22 14:17:45.091092925 +0000 UTC m=+140.022471827" watchObservedRunningTime="2026-04-22 14:17:45.091618358 +0000 UTC m=+140.022997260" Apr 22 14:17:45.177021 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:45.176998 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-registry-tls\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:45.177123 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:45.177107 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:17:45.177229 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:45.177122 2575 projected.go:194] Error preparing data for 
projected volume registry-tls for pod openshift-image-registry/image-registry-6b48cf5d46-hdtb2: secret "image-registry-tls" not found Apr 22 14:17:45.177229 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:45.177171 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-registry-tls podName:c1963949-20fe-4269-990d-7aa35e49f968 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:49.177156188 +0000 UTC m=+144.108535070 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-registry-tls") pod "image-registry-6b48cf5d46-hdtb2" (UID: "c1963949-20fe-4269-990d-7aa35e49f968") : secret "image-registry-tls" not found Apr 22 14:17:45.973460 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:45.973432 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2lphb_ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa/console-operator/1.log" Apr 22 14:17:45.973853 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:45.973835 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2lphb_ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa/console-operator/0.log" Apr 22 14:17:45.973897 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:45.973866 2575 generic.go:358] "Generic (PLEG): container finished" podID="ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa" containerID="5261dd744ffd5e22c84a30180220af5e5421720565ca598661c0d3527ef08fe7" exitCode=255 Apr 22 14:17:45.973989 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:45.973965 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2lphb" event={"ID":"ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa","Type":"ContainerDied","Data":"5261dd744ffd5e22c84a30180220af5e5421720565ca598661c0d3527ef08fe7"} Apr 22 14:17:45.974026 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:45.974010 2575 scope.go:117] "RemoveContainer" containerID="c1e0c86ba09499c771c824925c4b28c26cede028df049915f533f3cbf9cd3207" Apr 22 14:17:45.974316 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:45.974282 2575 scope.go:117] "RemoveContainer" containerID="5261dd744ffd5e22c84a30180220af5e5421720565ca598661c0d3527ef08fe7" Apr 22 14:17:45.974481 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:45.974446 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2lphb_openshift-console-operator(ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2lphb" podUID="ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa" Apr 22 14:17:45.992721 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:45.992686 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-lsmjj" podStartSLOduration=2.62750197 podStartE2EDuration="4.99267586s" podCreationTimestamp="2026-04-22 14:17:41 +0000 UTC" firstStartedPulling="2026-04-22 14:17:41.652321625 +0000 UTC m=+136.583700507" lastFinishedPulling="2026-04-22 14:17:44.017495513 +0000 UTC m=+138.948874397" observedRunningTime="2026-04-22 14:17:45.171779051 +0000 UTC m=+140.103157953" watchObservedRunningTime="2026-04-22 14:17:45.99267586 +0000 UTC m=+140.924054763" Apr 22 14:17:45.992875 
ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:45.992862 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-sfp7r"] Apr 22 14:17:45.995819 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:45.995806 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-sfp7r" Apr 22 14:17:45.999151 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:45.999129 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 22 14:17:45.999254 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:45.999150 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-rlm69\"" Apr 22 14:17:45.999408 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:45.999391 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 22 14:17:46.018272 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:46.018254 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-sfp7r"] Apr 22 14:17:46.084067 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:46.084045 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d8e714b0-eddd-43c6-9e24-c61be40fa7f2-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-sfp7r\" (UID: \"d8e714b0-eddd-43c6-9e24-c61be40fa7f2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sfp7r" Apr 22 14:17:46.084172 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:46.084082 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d8e714b0-eddd-43c6-9e24-c61be40fa7f2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sfp7r\" (UID: \"d8e714b0-eddd-43c6-9e24-c61be40fa7f2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sfp7r" Apr 22 14:17:46.185269 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:46.185249 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d8e714b0-eddd-43c6-9e24-c61be40fa7f2-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-sfp7r\" (UID: \"d8e714b0-eddd-43c6-9e24-c61be40fa7f2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sfp7r" Apr 22 14:17:46.185339 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:46.185277 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d8e714b0-eddd-43c6-9e24-c61be40fa7f2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sfp7r\" (UID: \"d8e714b0-eddd-43c6-9e24-c61be40fa7f2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sfp7r" Apr 22 14:17:46.185414 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:46.185402 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 14:17:46.185465 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:46.185456 2575 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/d8e714b0-eddd-43c6-9e24-c61be40fa7f2-networking-console-plugin-cert podName:d8e714b0-eddd-43c6-9e24-c61be40fa7f2 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:46.685439783 +0000 UTC m=+141.616818667 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d8e714b0-eddd-43c6-9e24-c61be40fa7f2-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-sfp7r" (UID: "d8e714b0-eddd-43c6-9e24-c61be40fa7f2") : secret "networking-console-plugin-cert" not found Apr 22 14:17:46.185844 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:46.185826 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/d8e714b0-eddd-43c6-9e24-c61be40fa7f2-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-sfp7r\" (UID: \"d8e714b0-eddd-43c6-9e24-c61be40fa7f2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sfp7r" Apr 22 14:17:46.578161 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:46.578137 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9x8pf_1dfd7d57-a9b2-4910-82a6-1e9bf8576804/dns-node-resolver/0.log" Apr 22 14:17:46.689298 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:46.689270 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d8e714b0-eddd-43c6-9e24-c61be40fa7f2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sfp7r\" (UID: \"d8e714b0-eddd-43c6-9e24-c61be40fa7f2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sfp7r" Apr 22 14:17:46.689402 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:46.689381 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 14:17:46.689436 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:46.689431 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8e714b0-eddd-43c6-9e24-c61be40fa7f2-networking-console-plugin-cert podName:d8e714b0-eddd-43c6-9e24-c61be40fa7f2 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:47.689418093 +0000 UTC m=+142.620796973 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d8e714b0-eddd-43c6-9e24-c61be40fa7f2-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-sfp7r" (UID: "d8e714b0-eddd-43c6-9e24-c61be40fa7f2") : secret "networking-console-plugin-cert" not found Apr 22 14:17:46.977942 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:46.977917 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2lphb_ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa/console-operator/1.log" Apr 22 14:17:46.978303 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:46.978235 2575 scope.go:117] "RemoveContainer" containerID="5261dd744ffd5e22c84a30180220af5e5421720565ca598661c0d3527ef08fe7" Apr 22 14:17:46.978416 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:46.978396 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2lphb_openshift-console-operator(ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2lphb" podUID="ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa" Apr 22 14:17:47.570555 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:47.570529 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-97p4n_058163d3-0e8a-40f7-aaa3-382fc9d4f5d4/node-ca/0.log" Apr 22 14:17:47.697211 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:47.697181 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d8e714b0-eddd-43c6-9e24-c61be40fa7f2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sfp7r\" (UID: \"d8e714b0-eddd-43c6-9e24-c61be40fa7f2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sfp7r" Apr 22 14:17:47.697362 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:47.697324 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 14:17:47.697411 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:47.697399 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8e714b0-eddd-43c6-9e24-c61be40fa7f2-networking-console-plugin-cert podName:d8e714b0-eddd-43c6-9e24-c61be40fa7f2 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:49.697384413 +0000 UTC m=+144.628763299 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d8e714b0-eddd-43c6-9e24-c61be40fa7f2-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-sfp7r" (UID: "d8e714b0-eddd-43c6-9e24-c61be40fa7f2") : secret "networking-console-plugin-cert" not found Apr 22 14:17:48.907047 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:48.907009 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd3338c3-d102-49e4-905b-c457dea46629-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gcj4r\" (UID: \"bd3338c3-d102-49e4-905b-c457dea46629\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gcj4r" Apr 22 14:17:48.907404 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:48.907105 2575 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 22 14:17:48.907404 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:48.907159 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd3338c3-d102-49e4-905b-c457dea46629-samples-operator-tls podName:bd3338c3-d102-49e4-905b-c457dea46629 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:56.907146496 +0000 UTC m=+151.838525377 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/bd3338c3-d102-49e4-905b-c457dea46629-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-gcj4r" (UID: "bd3338c3-d102-49e4-905b-c457dea46629") : secret "samples-operator-tls" not found Apr 22 14:17:49.209183 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:49.209108 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-registry-tls\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:49.209314 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:49.209251 2575 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:17:49.209314 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:49.209273 2575 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b48cf5d46-hdtb2: secret "image-registry-tls" not found Apr 22 14:17:49.209386 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:49.209330 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-registry-tls podName:c1963949-20fe-4269-990d-7aa35e49f968 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:57.209311948 +0000 UTC m=+152.140690861 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-registry-tls") pod "image-registry-6b48cf5d46-hdtb2" (UID: "c1963949-20fe-4269-990d-7aa35e49f968") : secret "image-registry-tls" not found Apr 22 14:17:49.713498 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:49.713467 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d8e714b0-eddd-43c6-9e24-c61be40fa7f2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sfp7r\" (UID: \"d8e714b0-eddd-43c6-9e24-c61be40fa7f2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sfp7r" Apr 22 14:17:49.713647 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:49.713615 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 14:17:49.713708 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:49.713694 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8e714b0-eddd-43c6-9e24-c61be40fa7f2-networking-console-plugin-cert podName:d8e714b0-eddd-43c6-9e24-c61be40fa7f2 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:53.713674053 +0000 UTC m=+148.645052934 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d8e714b0-eddd-43c6-9e24-c61be40fa7f2-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-sfp7r" (UID: "d8e714b0-eddd-43c6-9e24-c61be40fa7f2") : secret "networking-console-plugin-cert" not found Apr 22 14:17:51.736601 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:51.736569 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2lphb" Apr 22 14:17:51.736601 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:51.736608 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-2lphb" Apr 22 14:17:51.736996 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:51.736924 2575 scope.go:117] "RemoveContainer" containerID="5261dd744ffd5e22c84a30180220af5e5421720565ca598661c0d3527ef08fe7" Apr 22 14:17:51.737077 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:51.737061 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-2lphb_openshift-console-operator(ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2lphb" podUID="ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa" Apr 22 14:17:53.738968 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:53.738936 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d8e714b0-eddd-43c6-9e24-c61be40fa7f2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sfp7r\" (UID: \"d8e714b0-eddd-43c6-9e24-c61be40fa7f2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sfp7r" Apr 22 14:17:53.739316 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:53.739076 2575 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" 
not found Apr 22 14:17:53.739316 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:17:53.739144 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d8e714b0-eddd-43c6-9e24-c61be40fa7f2-networking-console-plugin-cert podName:d8e714b0-eddd-43c6-9e24-c61be40fa7f2 nodeName:}" failed. No retries permitted until 2026-04-22 14:18:01.73912904 +0000 UTC m=+156.670507920 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/d8e714b0-eddd-43c6-9e24-c61be40fa7f2-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-sfp7r" (UID: "d8e714b0-eddd-43c6-9e24-c61be40fa7f2") : secret "networking-console-plugin-cert" not found Apr 22 14:17:56.963410 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:56.963367 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd3338c3-d102-49e4-905b-c457dea46629-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gcj4r\" (UID: \"bd3338c3-d102-49e4-905b-c457dea46629\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gcj4r" Apr 22 14:17:56.965818 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:56.965792 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/bd3338c3-d102-49e4-905b-c457dea46629-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-gcj4r\" (UID: \"bd3338c3-d102-49e4-905b-c457dea46629\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gcj4r" Apr 22 14:17:57.035745 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:57.035724 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gcj4r" Apr 22 14:17:57.151006 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:57.150981 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gcj4r"] Apr 22 14:17:57.266152 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:57.266099 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-registry-tls\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:57.268183 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:57.268156 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-registry-tls\") pod \"image-registry-6b48cf5d46-hdtb2\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:57.353806 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:57.353783 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:57.469394 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:57.469362 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6b48cf5d46-hdtb2"] Apr 22 14:17:57.472438 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:17:57.472413 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc1963949_20fe_4269_990d_7aa35e49f968.slice/crio-fda88d73f9964c226d4a987d15e4a6804fda16141f60b0f2d7da0a878f034a44 WatchSource:0}: Error finding container fda88d73f9964c226d4a987d15e4a6804fda16141f60b0f2d7da0a878f034a44: Status 404 returned error can't find the container with id fda88d73f9964c226d4a987d15e4a6804fda16141f60b0f2d7da0a878f034a44 Apr 22 14:17:58.008606 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:58.008564 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" event={"ID":"c1963949-20fe-4269-990d-7aa35e49f968","Type":"ContainerStarted","Data":"143909fe1856a11f3c29871729274ff75a41848f51f343d45a4fa1635660cbe2"} Apr 22 14:17:58.008606 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:58.008609 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" event={"ID":"c1963949-20fe-4269-990d-7aa35e49f968","Type":"ContainerStarted","Data":"fda88d73f9964c226d4a987d15e4a6804fda16141f60b0f2d7da0a878f034a44"} Apr 22 14:17:58.009111 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:58.008683 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:17:58.009656 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:58.009633 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gcj4r" event={"ID":"bd3338c3-d102-49e4-905b-c457dea46629","Type":"ContainerStarted","Data":"6d9c2bc92a54dfa45edb7396db870de873b4254e5ba7205dbc34e240a5bfa56a"} Apr 22 14:17:58.030648 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:58.030606 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" podStartSLOduration=17.030593774 podStartE2EDuration="17.030593774s" podCreationTimestamp="2026-04-22 14:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:17:58.029649359 +0000 UTC m=+152.961028273" watchObservedRunningTime="2026-04-22 14:17:58.030593774 +0000 UTC m=+152.961972678" Apr 22 14:17:59.013451 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:59.013412 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gcj4r" event={"ID":"bd3338c3-d102-49e4-905b-c457dea46629","Type":"ContainerStarted","Data":"556d4648372e5cca9db295fc366be8fbeefa9b36a304c0b8185dd1c766abef84"} Apr 22 14:17:59.013898 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:17:59.013458 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gcj4r" event={"ID":"bd3338c3-d102-49e4-905b-c457dea46629","Type":"ContainerStarted","Data":"29f36a6a64b26c7574a7b7195ea1e91436b83b07d0abd7853d61cbdeaddf3d1a"} Apr 22 14:17:59.033641 ip-10-0-130-98 kubenswrapper[2575]: I0422 
14:17:59.033603 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-gcj4r" podStartSLOduration=16.686904598 podStartE2EDuration="18.033591168s" podCreationTimestamp="2026-04-22 14:17:41 +0000 UTC" firstStartedPulling="2026-04-22 14:17:57.188982043 +0000 UTC m=+152.120360924" lastFinishedPulling="2026-04-22 14:17:58.535668603 +0000 UTC m=+153.467047494" observedRunningTime="2026-04-22 14:17:59.032444091 +0000 UTC m=+153.963822988" watchObservedRunningTime="2026-04-22 14:17:59.033591168 +0000 UTC m=+153.964970072" Apr 22 14:18:00.990581 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:18:00.990544 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-l4vdv" podUID="b515945c-4a63-4512-9132-79ffc9f58ef0" Apr 22 14:18:01.008920 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:18:01.008895 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-hq58r" podUID="9971a5a9-34ef-4f3c-9183-340e4c5fde1c" Apr 22 14:18:01.017651 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:01.017633 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-l4vdv" Apr 22 14:18:01.800809 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:01.800727 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d8e714b0-eddd-43c6-9e24-c61be40fa7f2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sfp7r\" (UID: \"d8e714b0-eddd-43c6-9e24-c61be40fa7f2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sfp7r" Apr 22 14:18:01.803088 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:01.803053 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/d8e714b0-eddd-43c6-9e24-c61be40fa7f2-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-sfp7r\" (UID: \"d8e714b0-eddd-43c6-9e24-c61be40fa7f2\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-sfp7r" Apr 22 14:18:01.904429 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:01.904390 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-sfp7r" Apr 22 14:18:02.022663 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:02.022631 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-sfp7r"] Apr 22 14:18:02.026365 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:18:02.026336 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd8e714b0_eddd_43c6_9e24_c61be40fa7f2.slice/crio-53edd1419e4dd6f89e7457f5fe51a70adf03671a9701975bc0899b16a2ad3781 WatchSource:0}: Error finding container 53edd1419e4dd6f89e7457f5fe51a70adf03671a9701975bc0899b16a2ad3781: Status 404 returned error can't find the container with id 53edd1419e4dd6f89e7457f5fe51a70adf03671a9701975bc0899b16a2ad3781 Apr 22 14:18:02.557868 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:02.557838 2575 scope.go:117] "RemoveContainer" containerID="5261dd744ffd5e22c84a30180220af5e5421720565ca598661c0d3527ef08fe7" Apr 22 14:18:02.570426 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:18:02.570400 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-swv2n" podUID="1faf2ada-1177-442f-9ee9-4ecd9697e349" Apr 22 14:18:03.023904 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:03.023883 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2lphb_ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa/console-operator/2.log" Apr 22 14:18:03.024244 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:03.024230 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2lphb_ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa/console-operator/1.log" Apr 22 14:18:03.024295 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:03.024265 2575 generic.go:358] "Generic (PLEG): container finished" podID="ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa" containerID="53d1abcef80803e2174399770ace0137620f424f08e9dd936bf811a00dddacc6" exitCode=255 Apr 22 14:18:03.024356 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:03.024337 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2lphb" event={"ID":"ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa","Type":"ContainerDied","Data":"53d1abcef80803e2174399770ace0137620f424f08e9dd936bf811a00dddacc6"} Apr 22 14:18:03.024409 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:03.024380 2575 scope.go:117] "RemoveContainer" containerID="5261dd744ffd5e22c84a30180220af5e5421720565ca598661c0d3527ef08fe7" Apr 22 14:18:03.024681 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:03.024664 2575 scope.go:117] "RemoveContainer" containerID="53d1abcef80803e2174399770ace0137620f424f08e9dd936bf811a00dddacc6" Apr 22 14:18:03.024891 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:18:03.024873 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-2lphb_openshift-console-operator(ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2lphb" podUID="ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa" Apr 22 14:18:03.028042 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:03.028021 2575 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-sfp7r" event={"ID":"d8e714b0-eddd-43c6-9e24-c61be40fa7f2","Type":"ContainerStarted","Data":"1272215ecd9db7558d3f87da84b918d85a046430b7d8489d488dac94dbae45d6"} Apr 22 14:18:03.028148 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:03.028047 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-sfp7r" event={"ID":"d8e714b0-eddd-43c6-9e24-c61be40fa7f2","Type":"ContainerStarted","Data":"53edd1419e4dd6f89e7457f5fe51a70adf03671a9701975bc0899b16a2ad3781"} Apr 22 14:18:03.075977 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:03.075932 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-sfp7r" podStartSLOduration=17.175427529 podStartE2EDuration="18.075921881s" podCreationTimestamp="2026-04-22 14:17:45 +0000 UTC" firstStartedPulling="2026-04-22 14:18:02.028071463 +0000 UTC m=+156.959450345" lastFinishedPulling="2026-04-22 14:18:02.928565803 +0000 UTC m=+157.859944697" observedRunningTime="2026-04-22 14:18:03.075644401 +0000 UTC m=+158.007023312" watchObservedRunningTime="2026-04-22 14:18:03.075921881 +0000 UTC m=+158.007300827" Apr 22 14:18:04.031892 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:04.031863 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2lphb_ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa/console-operator/2.log" Apr 22 14:18:05.927911 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:05.927824 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b515945c-4a63-4512-9132-79ffc9f58ef0-metrics-tls\") pod \"dns-default-l4vdv\" (UID: \"b515945c-4a63-4512-9132-79ffc9f58ef0\") " pod="openshift-dns/dns-default-l4vdv" Apr 22 14:18:05.927911 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:05.927878 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-cert\") pod \"ingress-canary-hq58r\" (UID: \"9971a5a9-34ef-4f3c-9183-340e4c5fde1c\") " pod="openshift-ingress-canary/ingress-canary-hq58r" Apr 22 14:18:05.930159 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:05.930132 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b515945c-4a63-4512-9132-79ffc9f58ef0-metrics-tls\") pod \"dns-default-l4vdv\" (UID: \"b515945c-4a63-4512-9132-79ffc9f58ef0\") " pod="openshift-dns/dns-default-l4vdv" Apr 22 14:18:05.930276 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:05.930186 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9971a5a9-34ef-4f3c-9183-340e4c5fde1c-cert\") pod \"ingress-canary-hq58r\" (UID: \"9971a5a9-34ef-4f3c-9183-340e4c5fde1c\") " pod="openshift-ingress-canary/ingress-canary-hq58r" Apr 22 14:18:06.121943 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:06.121916 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fpnmw\"" Apr 22 14:18:06.129677 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:06.129661 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-l4vdv" Apr 22 14:18:06.245929 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:06.245878 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-l4vdv"] Apr 22 14:18:06.249179 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:18:06.249153 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb515945c_4a63_4512_9132_79ffc9f58ef0.slice/crio-6872c6c9920ede075c576e9fdb78ad8dcb6b6b8a19c2724bdf761d1f368a85ab WatchSource:0}: Error finding container 6872c6c9920ede075c576e9fdb78ad8dcb6b6b8a19c2724bdf761d1f368a85ab: Status 404 returned error can't find the container with id 6872c6c9920ede075c576e9fdb78ad8dcb6b6b8a19c2724bdf761d1f368a85ab Apr 22 14:18:07.040073 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:07.040034 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l4vdv" event={"ID":"b515945c-4a63-4512-9132-79ffc9f58ef0","Type":"ContainerStarted","Data":"6872c6c9920ede075c576e9fdb78ad8dcb6b6b8a19c2724bdf761d1f368a85ab"} Apr 22 14:18:08.043898 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:08.043860 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l4vdv" event={"ID":"b515945c-4a63-4512-9132-79ffc9f58ef0","Type":"ContainerStarted","Data":"f4031898a90813eafba6de3cbfa8207572365ea99b63dd6b45e6fb44816a25c7"} Apr 22 14:18:08.043898 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:08.043894 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-l4vdv" event={"ID":"b515945c-4a63-4512-9132-79ffc9f58ef0","Type":"ContainerStarted","Data":"1c04980f4611e2a736b36ec57c8077ea37ead0bbb044bceb593ff21892e63444"} Apr 22 14:18:08.044373 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:08.044020 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-l4vdv" Apr 22 14:18:08.064519 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:08.064470 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-l4vdv" podStartSLOduration=129.857625463 podStartE2EDuration="2m11.064459504s" podCreationTimestamp="2026-04-22 14:15:57 +0000 UTC" firstStartedPulling="2026-04-22 14:18:06.250970439 +0000 UTC m=+161.182349320" lastFinishedPulling="2026-04-22 14:18:07.45780448 +0000 UTC m=+162.389183361" observedRunningTime="2026-04-22 14:18:08.063194138 +0000 UTC m=+162.994573050" watchObservedRunningTime="2026-04-22 14:18:08.064459504 +0000 UTC m=+162.995838439" Apr 22 14:18:10.924657 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:10.924623 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6b48cf5d46-hdtb2"] Apr 22 14:18:10.947374 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:10.947350 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-tmlr9"] Apr 22 14:18:10.952582 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:10.952561 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-tmlr9" Apr 22 14:18:10.956797 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:10.956735 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 14:18:10.956981 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:10.956964 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 14:18:10.957026 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:10.957013 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 14:18:10.957350 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:10.957313 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 14:18:10.957728 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:10.957707 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-7qqgh\"" Apr 22 14:18:10.975562 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:10.975543 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-tmlr9"] Apr 22 14:18:11.021368 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.021347 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-54d7ffcdcc-8rpqd"] Apr 22 14:18:11.024242 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.024226 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd" Apr 22 14:18:11.055974 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.055955 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-54d7ffcdcc-8rpqd"] Apr 22 14:18:11.064288 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.064261 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp9fs\" (UniqueName: \"kubernetes.io/projected/925cc614-5f91-4c68-af91-1ddc2bac16bc-kube-api-access-rp9fs\") pod \"insights-runtime-extractor-tmlr9\" (UID: \"925cc614-5f91-4c68-af91-1ddc2bac16bc\") " pod="openshift-insights/insights-runtime-extractor-tmlr9" Apr 22 14:18:11.064389 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.064297 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/925cc614-5f91-4c68-af91-1ddc2bac16bc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tmlr9\" (UID: \"925cc614-5f91-4c68-af91-1ddc2bac16bc\") " pod="openshift-insights/insights-runtime-extractor-tmlr9" Apr 22 14:18:11.064389 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.064333 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/925cc614-5f91-4c68-af91-1ddc2bac16bc-crio-socket\") pod \"insights-runtime-extractor-tmlr9\" (UID: \"925cc614-5f91-4c68-af91-1ddc2bac16bc\") " pod="openshift-insights/insights-runtime-extractor-tmlr9" Apr 22 14:18:11.064469 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.064384 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/925cc614-5f91-4c68-af91-1ddc2bac16bc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tmlr9\" (UID: \"925cc614-5f91-4c68-af91-1ddc2bac16bc\") " pod="openshift-insights/insights-runtime-extractor-tmlr9" Apr 22 14:18:11.064511 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.064468 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/925cc614-5f91-4c68-af91-1ddc2bac16bc-data-volume\") pod \"insights-runtime-extractor-tmlr9\" (UID: \"925cc614-5f91-4c68-af91-1ddc2bac16bc\") " pod="openshift-insights/insights-runtime-extractor-tmlr9" Apr 22 14:18:11.165196 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.165172 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/86a84579-b5c4-4078-a90a-5f4a6668d1a0-bound-sa-token\") pod \"image-registry-54d7ffcdcc-8rpqd\" (UID: \"86a84579-b5c4-4078-a90a-5f4a6668d1a0\") " pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd" Apr 22 14:18:11.165287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.165202 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/86a84579-b5c4-4078-a90a-5f4a6668d1a0-trusted-ca\") pod \"image-registry-54d7ffcdcc-8rpqd\" (UID: \"86a84579-b5c4-4078-a90a-5f4a6668d1a0\") " pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd" Apr 22 14:18:11.165287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.165221 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/86a84579-b5c4-4078-a90a-5f4a6668d1a0-registry-certificates\") pod \"image-registry-54d7ffcdcc-8rpqd\" (UID: \"86a84579-b5c4-4078-a90a-5f4a6668d1a0\") " pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd" Apr 22 14:18:11.165287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.165244 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/86a84579-b5c4-4078-a90a-5f4a6668d1a0-installation-pull-secrets\") pod \"image-registry-54d7ffcdcc-8rpqd\" (UID: \"86a84579-b5c4-4078-a90a-5f4a6668d1a0\") " pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd" Apr 22 14:18:11.165383 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.165312 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/86a84579-b5c4-4078-a90a-5f4a6668d1a0-registry-tls\") pod \"image-registry-54d7ffcdcc-8rpqd\" (UID: \"86a84579-b5c4-4078-a90a-5f4a6668d1a0\") " pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd" Apr 22 14:18:11.165383 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.165352 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rp9fs\" (UniqueName: \"kubernetes.io/projected/925cc614-5f91-4c68-af91-1ddc2bac16bc-kube-api-access-rp9fs\") pod \"insights-runtime-extractor-tmlr9\" (UID: \"925cc614-5f91-4c68-af91-1ddc2bac16bc\") " pod="openshift-insights/insights-runtime-extractor-tmlr9" Apr 22 14:18:11.165449 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.165383 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/86a84579-b5c4-4078-a90a-5f4a6668d1a0-image-registry-private-configuration\") pod \"image-registry-54d7ffcdcc-8rpqd\" (UID: \"86a84579-b5c4-4078-a90a-5f4a6668d1a0\") " pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd" Apr 22 14:18:11.165449 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.165401 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/925cc614-5f91-4c68-af91-1ddc2bac16bc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tmlr9\" (UID: \"925cc614-5f91-4c68-af91-1ddc2bac16bc\") " pod="openshift-insights/insights-runtime-extractor-tmlr9" Apr 22 14:18:11.165449 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.165439 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcwh5\" (UniqueName: \"kubernetes.io/projected/86a84579-b5c4-4078-a90a-5f4a6668d1a0-kube-api-access-fcwh5\") pod \"image-registry-54d7ffcdcc-8rpqd\" (UID: \"86a84579-b5c4-4078-a90a-5f4a6668d1a0\") " pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd" Apr 22 14:18:11.165543 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.165457 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/925cc614-5f91-4c68-af91-1ddc2bac16bc-crio-socket\") pod \"insights-runtime-extractor-tmlr9\" (UID: \"925cc614-5f91-4c68-af91-1ddc2bac16bc\") " pod="openshift-insights/insights-runtime-extractor-tmlr9" Apr 22 14:18:11.165543 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.165476 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/86a84579-b5c4-4078-a90a-5f4a6668d1a0-ca-trust-extracted\") pod \"image-registry-54d7ffcdcc-8rpqd\" (UID: \"86a84579-b5c4-4078-a90a-5f4a6668d1a0\") " pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd" Apr 22 14:18:11.165543 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.165497 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/925cc614-5f91-4c68-af91-1ddc2bac16bc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tmlr9\" (UID: \"925cc614-5f91-4c68-af91-1ddc2bac16bc\") " pod="openshift-insights/insights-runtime-extractor-tmlr9" Apr 22 14:18:11.165543 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.165532 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/925cc614-5f91-4c68-af91-1ddc2bac16bc-data-volume\") pod \"insights-runtime-extractor-tmlr9\" (UID: \"925cc614-5f91-4c68-af91-1ddc2bac16bc\") " pod="openshift-insights/insights-runtime-extractor-tmlr9" Apr 22 14:18:11.165667 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.165635 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/925cc614-5f91-4c68-af91-1ddc2bac16bc-crio-socket\") pod \"insights-runtime-extractor-tmlr9\" (UID: \"925cc614-5f91-4c68-af91-1ddc2bac16bc\") " pod="openshift-insights/insights-runtime-extractor-tmlr9" Apr 22 14:18:11.165847 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.165832 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/925cc614-5f91-4c68-af91-1ddc2bac16bc-data-volume\") pod \"insights-runtime-extractor-tmlr9\" (UID: \"925cc614-5f91-4c68-af91-1ddc2bac16bc\") " pod="openshift-insights/insights-runtime-extractor-tmlr9" Apr 22 14:18:11.165956 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.165942 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/925cc614-5f91-4c68-af91-1ddc2bac16bc-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-tmlr9\" (UID: \"925cc614-5f91-4c68-af91-1ddc2bac16bc\") " pod="openshift-insights/insights-runtime-extractor-tmlr9" Apr 22 14:18:11.167775 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.167761 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/925cc614-5f91-4c68-af91-1ddc2bac16bc-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-tmlr9\" (UID: \"925cc614-5f91-4c68-af91-1ddc2bac16bc\") " pod="openshift-insights/insights-runtime-extractor-tmlr9" Apr 22 14:18:11.185620 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.185568 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp9fs\" (UniqueName: \"kubernetes.io/projected/925cc614-5f91-4c68-af91-1ddc2bac16bc-kube-api-access-rp9fs\") pod \"insights-runtime-extractor-tmlr9\" (UID: \"925cc614-5f91-4c68-af91-1ddc2bac16bc\") " pod="openshift-insights/insights-runtime-extractor-tmlr9" Apr 22 14:18:11.261089 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.261064 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-tmlr9" Apr 22 14:18:11.266844 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.266822 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/86a84579-b5c4-4078-a90a-5f4a6668d1a0-image-registry-private-configuration\") pod \"image-registry-54d7ffcdcc-8rpqd\" (UID: \"86a84579-b5c4-4078-a90a-5f4a6668d1a0\") " pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd" Apr 22 14:18:11.266941 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.266859 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcwh5\" (UniqueName: \"kubernetes.io/projected/86a84579-b5c4-4078-a90a-5f4a6668d1a0-kube-api-access-fcwh5\") pod \"image-registry-54d7ffcdcc-8rpqd\" (UID: \"86a84579-b5c4-4078-a90a-5f4a6668d1a0\") " pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd" Apr 22 14:18:11.266941 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.266891 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/86a84579-b5c4-4078-a90a-5f4a6668d1a0-ca-trust-extracted\") pod \"image-registry-54d7ffcdcc-8rpqd\" (UID: \"86a84579-b5c4-4078-a90a-5f4a6668d1a0\") " pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd" Apr 22 14:18:11.266941 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.266933 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/86a84579-b5c4-4078-a90a-5f4a6668d1a0-bound-sa-token\") pod \"image-registry-54d7ffcdcc-8rpqd\" (UID: \"86a84579-b5c4-4078-a90a-5f4a6668d1a0\") " pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd" Apr 22 14:18:11.267097 ip-10-0-130-98 
kubenswrapper[2575]: I0422 14:18:11.266967 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/86a84579-b5c4-4078-a90a-5f4a6668d1a0-trusted-ca\") pod \"image-registry-54d7ffcdcc-8rpqd\" (UID: \"86a84579-b5c4-4078-a90a-5f4a6668d1a0\") " pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd"
Apr 22 14:18:11.267097 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.266994 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/86a84579-b5c4-4078-a90a-5f4a6668d1a0-registry-certificates\") pod \"image-registry-54d7ffcdcc-8rpqd\" (UID: \"86a84579-b5c4-4078-a90a-5f4a6668d1a0\") " pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd"
Apr 22 14:18:11.267097 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.267032 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/86a84579-b5c4-4078-a90a-5f4a6668d1a0-installation-pull-secrets\") pod \"image-registry-54d7ffcdcc-8rpqd\" (UID: \"86a84579-b5c4-4078-a90a-5f4a6668d1a0\") " pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd"
Apr 22 14:18:11.267097 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.267075 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/86a84579-b5c4-4078-a90a-5f4a6668d1a0-registry-tls\") pod \"image-registry-54d7ffcdcc-8rpqd\" (UID: \"86a84579-b5c4-4078-a90a-5f4a6668d1a0\") " pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd"
Apr 22 14:18:11.267385 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.267337 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/86a84579-b5c4-4078-a90a-5f4a6668d1a0-ca-trust-extracted\") pod \"image-registry-54d7ffcdcc-8rpqd\" (UID: \"86a84579-b5c4-4078-a90a-5f4a6668d1a0\") " pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd"
Apr 22 14:18:11.268078 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.268051 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/86a84579-b5c4-4078-a90a-5f4a6668d1a0-registry-certificates\") pod \"image-registry-54d7ffcdcc-8rpqd\" (UID: \"86a84579-b5c4-4078-a90a-5f4a6668d1a0\") " pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd"
Apr 22 14:18:11.268174 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.268099 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/86a84579-b5c4-4078-a90a-5f4a6668d1a0-trusted-ca\") pod \"image-registry-54d7ffcdcc-8rpqd\" (UID: \"86a84579-b5c4-4078-a90a-5f4a6668d1a0\") " pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd"
Apr 22 14:18:11.269684 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.269662 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/86a84579-b5c4-4078-a90a-5f4a6668d1a0-installation-pull-secrets\") pod \"image-registry-54d7ffcdcc-8rpqd\" (UID: \"86a84579-b5c4-4078-a90a-5f4a6668d1a0\") " pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd"
Apr 22 14:18:11.269875 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.269849 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/86a84579-b5c4-4078-a90a-5f4a6668d1a0-registry-tls\") pod \"image-registry-54d7ffcdcc-8rpqd\" (UID: \"86a84579-b5c4-4078-a90a-5f4a6668d1a0\") " pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd"
Apr 22 14:18:11.270088 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.270069 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/86a84579-b5c4-4078-a90a-5f4a6668d1a0-image-registry-private-configuration\") pod \"image-registry-54d7ffcdcc-8rpqd\" (UID: \"86a84579-b5c4-4078-a90a-5f4a6668d1a0\") " pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd"
Apr 22 14:18:11.277832 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.277793 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/86a84579-b5c4-4078-a90a-5f4a6668d1a0-bound-sa-token\") pod \"image-registry-54d7ffcdcc-8rpqd\" (UID: \"86a84579-b5c4-4078-a90a-5f4a6668d1a0\") " pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd"
Apr 22 14:18:11.277932 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.277916 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcwh5\" (UniqueName: \"kubernetes.io/projected/86a84579-b5c4-4078-a90a-5f4a6668d1a0-kube-api-access-fcwh5\") pod \"image-registry-54d7ffcdcc-8rpqd\" (UID: \"86a84579-b5c4-4078-a90a-5f4a6668d1a0\") " pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd"
Apr 22 14:18:11.333109 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.333053 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd"
Apr 22 14:18:11.379199 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.379171 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-tmlr9"]
Apr 22 14:18:11.382708 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:18:11.382676 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod925cc614_5f91_4c68_af91_1ddc2bac16bc.slice/crio-0fdf09ee918e778333d53f4717266abb754ce3be3c1ee9673161ffe177edb300 WatchSource:0}: Error finding container 0fdf09ee918e778333d53f4717266abb754ce3be3c1ee9673161ffe177edb300: Status 404 returned error can't find the container with id 0fdf09ee918e778333d53f4717266abb754ce3be3c1ee9673161ffe177edb300
Apr 22 14:18:11.464652 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.464624 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-54d7ffcdcc-8rpqd"]
Apr 22 14:18:11.468294 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:18:11.468260 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86a84579_b5c4_4078_a90a_5f4a6668d1a0.slice/crio-eeb1c6d0e3341743802e5775f62b6ede2f4b3fc5b9aa8b72f33f483c68a85487 WatchSource:0}: Error finding container eeb1c6d0e3341743802e5775f62b6ede2f4b3fc5b9aa8b72f33f483c68a85487: Status 404 returned error can't find the container with id eeb1c6d0e3341743802e5775f62b6ede2f4b3fc5b9aa8b72f33f483c68a85487
Apr 22 14:18:11.736446 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.736376 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2lphb"
Apr 22 14:18:11.736446 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.736419 2575 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-2lphb"
Apr 22 14:18:11.736878 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:11.736863 2575 scope.go:117] "RemoveContainer" containerID="53d1abcef80803e2174399770ace0137620f424f08e9dd936bf811a00dddacc6"
Apr 22 14:18:11.737117 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:18:11.737097 2575 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-2lphb_openshift-console-operator(ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa)\"" pod="openshift-console-operator/console-operator-9d4b6777b-2lphb" podUID="ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa"
Apr 22 14:18:12.054133 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:12.054104 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd" event={"ID":"86a84579-b5c4-4078-a90a-5f4a6668d1a0","Type":"ContainerStarted","Data":"47b88d896d37525ad635baed91ef5cd87686887930284ee4aba958c1158bdcae"}
Apr 22 14:18:12.054530 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:12.054135 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd" event={"ID":"86a84579-b5c4-4078-a90a-5f4a6668d1a0","Type":"ContainerStarted","Data":"eeb1c6d0e3341743802e5775f62b6ede2f4b3fc5b9aa8b72f33f483c68a85487"}
Apr 22 14:18:12.054530 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:12.054189 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd"
Apr 22 14:18:12.055599 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:12.055581 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tmlr9" event={"ID":"925cc614-5f91-4c68-af91-1ddc2bac16bc","Type":"ContainerStarted","Data":"9c5aff9104d79210b00c8fe1f127b8bd78eabff0b850343679f310eaf5c36788"}
Apr 22 14:18:12.055683 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:12.055603 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tmlr9" event={"ID":"925cc614-5f91-4c68-af91-1ddc2bac16bc","Type":"ContainerStarted","Data":"9ff858a26b98876c11e6ecbe0c7288ec923ad28315b81de46d69e2ff4aa86a45"}
Apr 22 14:18:12.055683 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:12.055613 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tmlr9" event={"ID":"925cc614-5f91-4c68-af91-1ddc2bac16bc","Type":"ContainerStarted","Data":"0fdf09ee918e778333d53f4717266abb754ce3be3c1ee9673161ffe177edb300"}
Apr 22 14:18:12.074603 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:12.074565 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd" podStartSLOduration=2.074553496 podStartE2EDuration="2.074553496s" podCreationTimestamp="2026-04-22 14:18:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:18:12.073503567 +0000 UTC m=+167.004882485" watchObservedRunningTime="2026-04-22 14:18:12.074553496 +0000 UTC m=+167.005932433"
Apr 22 14:18:12.556825 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:12.556786 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hq58r"
Apr 22 14:18:12.559869 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:12.559846 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-slf7l\""
Apr 22 14:18:12.567410 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:12.567385 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-hq58r"
Apr 22 14:18:12.709967 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:12.709939 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-hq58r"]
Apr 22 14:18:12.712902 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:18:12.712875 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9971a5a9_34ef_4f3c_9183_340e4c5fde1c.slice/crio-fb5984ef357fb11e0c70a2b8718b40a4c99c1c9ae325543d99f9b58d405776e6 WatchSource:0}: Error finding container fb5984ef357fb11e0c70a2b8718b40a4c99c1c9ae325543d99f9b58d405776e6: Status 404 returned error can't find the container with id fb5984ef357fb11e0c70a2b8718b40a4c99c1c9ae325543d99f9b58d405776e6
Apr 22 14:18:13.061269 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:13.061216 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hq58r" event={"ID":"9971a5a9-34ef-4f3c-9183-340e4c5fde1c","Type":"ContainerStarted","Data":"fb5984ef357fb11e0c70a2b8718b40a4c99c1c9ae325543d99f9b58d405776e6"}
Apr 22 14:18:13.557737 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:13.557701 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-swv2n"
Apr 22 14:18:14.066397 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:14.066363 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-tmlr9" event={"ID":"925cc614-5f91-4c68-af91-1ddc2bac16bc","Type":"ContainerStarted","Data":"95a8729c94f8602af7942c03ea68c1aa90695ef951382d4ceb6f6c47e3ad6d1c"}
Apr 22 14:18:15.070691 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:15.070659 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-hq58r" event={"ID":"9971a5a9-34ef-4f3c-9183-340e4c5fde1c","Type":"ContainerStarted","Data":"85898c1c8c4c84a6b6ebe1de1ef3fbcee898e32f3fa82e4add263741efb87376"}
Apr 22 14:18:15.094220 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:15.094178 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-hq58r" podStartSLOduration=136.407708165 podStartE2EDuration="2m18.094164181s" podCreationTimestamp="2026-04-22 14:15:57 +0000 UTC" firstStartedPulling="2026-04-22 14:18:12.714942854 +0000 UTC m=+167.646321735" lastFinishedPulling="2026-04-22 14:18:14.401398869 +0000 UTC m=+169.332777751" observedRunningTime="2026-04-22 14:18:15.093383817 +0000 UTC m=+170.024762717" watchObservedRunningTime="2026-04-22 14:18:15.094164181 +0000 UTC m=+170.025543083"
Apr 22 14:18:15.094674 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:15.094649 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-tmlr9" podStartSLOduration=3.299021583 podStartE2EDuration="5.094639298s" podCreationTimestamp="2026-04-22 14:18:10 +0000 UTC" firstStartedPulling="2026-04-22 14:18:11.461248059 +0000 UTC m=+166.392626940" lastFinishedPulling="2026-04-22 14:18:13.256865761 +0000 UTC m=+168.188244655" observedRunningTime="2026-04-22 14:18:14.093623639 +0000 UTC m=+169.025002539" watchObservedRunningTime="2026-04-22 14:18:15.094639298 +0000 UTC m=+170.026018204"
Apr 22 14:18:17.624682 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:17.624648 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-xvpq6"]
Apr 22 14:18:17.629374 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:17.629359 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-xvpq6"
Apr 22 14:18:17.633602 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:17.633579 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 22 14:18:17.633707 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:17.633614 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 22 14:18:17.635481 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:17.634509 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 22 14:18:17.635481 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:17.634932 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-4n4p5\""
Apr 22 14:18:17.635481 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:17.635354 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 22 14:18:17.636856 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:17.636831 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 22 14:18:17.640666 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:17.640640 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-xvpq6"]
Apr 22 14:18:17.814307 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:17.814272 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/193c0bd5-08c3-4fec-a639-8c49c3979a84-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-xvpq6\" (UID: \"193c0bd5-08c3-4fec-a639-8c49c3979a84\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-xvpq6"
Apr 22 14:18:17.814442 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:17.814316 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/193c0bd5-08c3-4fec-a639-8c49c3979a84-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-xvpq6\" (UID: \"193c0bd5-08c3-4fec-a639-8c49c3979a84\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-xvpq6"
Apr 22 14:18:17.814442 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:17.814351 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/193c0bd5-08c3-4fec-a639-8c49c3979a84-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-xvpq6\" (UID: \"193c0bd5-08c3-4fec-a639-8c49c3979a84\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-xvpq6"
Apr 22 14:18:17.814442 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:17.814400 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qg9j\" (UniqueName: \"kubernetes.io/projected/193c0bd5-08c3-4fec-a639-8c49c3979a84-kube-api-access-9qg9j\") pod \"prometheus-operator-5676c8c784-xvpq6\" (UID: \"193c0bd5-08c3-4fec-a639-8c49c3979a84\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-xvpq6"
Apr 22 14:18:17.914705 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:17.914673 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/193c0bd5-08c3-4fec-a639-8c49c3979a84-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-xvpq6\" (UID: \"193c0bd5-08c3-4fec-a639-8c49c3979a84\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-xvpq6"
Apr 22 14:18:17.914839 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:17.914713 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/193c0bd5-08c3-4fec-a639-8c49c3979a84-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-xvpq6\" (UID: \"193c0bd5-08c3-4fec-a639-8c49c3979a84\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-xvpq6"
Apr 22 14:18:17.914839 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:17.914765 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/193c0bd5-08c3-4fec-a639-8c49c3979a84-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-xvpq6\" (UID: \"193c0bd5-08c3-4fec-a639-8c49c3979a84\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-xvpq6"
Apr 22 14:18:17.914839 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:17.914795 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9qg9j\" (UniqueName: \"kubernetes.io/projected/193c0bd5-08c3-4fec-a639-8c49c3979a84-kube-api-access-9qg9j\") pod \"prometheus-operator-5676c8c784-xvpq6\" (UID: \"193c0bd5-08c3-4fec-a639-8c49c3979a84\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-xvpq6"
Apr 22 14:18:17.915335 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:17.915317 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/193c0bd5-08c3-4fec-a639-8c49c3979a84-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-xvpq6\" (UID: \"193c0bd5-08c3-4fec-a639-8c49c3979a84\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-xvpq6"
Apr 22 14:18:17.917060 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:17.917037 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/193c0bd5-08c3-4fec-a639-8c49c3979a84-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-xvpq6\" (UID: \"193c0bd5-08c3-4fec-a639-8c49c3979a84\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-xvpq6"
Apr 22 14:18:17.917148 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:17.917110 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/193c0bd5-08c3-4fec-a639-8c49c3979a84-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-xvpq6\" (UID: \"193c0bd5-08c3-4fec-a639-8c49c3979a84\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-xvpq6"
Apr 22 14:18:17.924431 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:17.924409 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qg9j\" (UniqueName: \"kubernetes.io/projected/193c0bd5-08c3-4fec-a639-8c49c3979a84-kube-api-access-9qg9j\") pod \"prometheus-operator-5676c8c784-xvpq6\" (UID: \"193c0bd5-08c3-4fec-a639-8c49c3979a84\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-xvpq6"
Apr 22 14:18:17.940228 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:17.940208 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-xvpq6"
Apr 22 14:18:18.047905 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:18.047879 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-l4vdv"
Apr 22 14:18:18.060110 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:18.060089 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-xvpq6"]
Apr 22 14:18:18.062819 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:18:18.062798 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod193c0bd5_08c3_4fec_a639_8c49c3979a84.slice/crio-770a0e4f410a32e202c767ca567caa96dc5996dd7734a0b3d73696566c1c46d8 WatchSource:0}: Error finding container 770a0e4f410a32e202c767ca567caa96dc5996dd7734a0b3d73696566c1c46d8: Status 404 returned error can't find the container with id 770a0e4f410a32e202c767ca567caa96dc5996dd7734a0b3d73696566c1c46d8
Apr 22 14:18:18.079172 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:18.079150 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-xvpq6" event={"ID":"193c0bd5-08c3-4fec-a639-8c49c3979a84","Type":"ContainerStarted","Data":"770a0e4f410a32e202c767ca567caa96dc5996dd7734a0b3d73696566c1c46d8"}
Apr 22 14:18:20.085888 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:20.085840 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-xvpq6" event={"ID":"193c0bd5-08c3-4fec-a639-8c49c3979a84","Type":"ContainerStarted","Data":"7bd409821558bd42d53c84e3eb92289bca077ac16ad64b9cbed26855781d8bfe"}
Apr 22 14:18:20.085888 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:20.085893 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-xvpq6" event={"ID":"193c0bd5-08c3-4fec-a639-8c49c3979a84","Type":"ContainerStarted","Data":"3fadf33752f7b9b183101d67f4bd8bbd57c69e15daf8ce9e221088bd598c28b7"}
Apr 22 14:18:20.103553 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:20.103511 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-xvpq6" podStartSLOduration=1.549946583 podStartE2EDuration="3.103498376s" podCreationTimestamp="2026-04-22 14:18:17 +0000 UTC" firstStartedPulling="2026-04-22 14:18:18.064610714 +0000 UTC m=+172.995989595" lastFinishedPulling="2026-04-22 14:18:19.618162507 +0000 UTC m=+174.549541388" observedRunningTime="2026-04-22 14:18:20.102569351 +0000 UTC m=+175.033948255" watchObservedRunningTime="2026-04-22 14:18:20.103498376 +0000 UTC m=+175.034877279"
Apr 22 14:18:20.929914 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:20.929888 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2"
Apr 22 14:18:22.008512 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.008479 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-hbd7w"]
Apr 22 14:18:22.011814 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.011795 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-hbd7w"
Apr 22 14:18:22.014831 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.014811 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 22 14:18:22.015896 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.015881 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-jcgfm\""
Apr 22 14:18:22.017594 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.017574 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 22 14:18:22.017772 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.017733 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 22 14:18:22.029619 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.029599 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-hbd7w"]
Apr 22 14:18:22.036111 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.036091 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-5sxqr"]
Apr 22 14:18:22.042897 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.042874 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9df1470-48cd-4fb5-9710-be943e19f26c-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-hbd7w\" (UID: \"d9df1470-48cd-4fb5-9710-be943e19f26c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hbd7w"
Apr 22 14:18:22.042991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.042917 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/d9df1470-48cd-4fb5-9710-be943e19f26c-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-hbd7w\" (UID: \"d9df1470-48cd-4fb5-9710-be943e19f26c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hbd7w"
Apr 22 14:18:22.042991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.042977 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d9df1470-48cd-4fb5-9710-be943e19f26c-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-hbd7w\" (UID: \"d9df1470-48cd-4fb5-9710-be943e19f26c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hbd7w"
Apr 22 14:18:22.043082 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.042985 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.043082 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.043006 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vp4tn\" (UniqueName: \"kubernetes.io/projected/d9df1470-48cd-4fb5-9710-be943e19f26c-kube-api-access-vp4tn\") pod \"kube-state-metrics-69db897b98-hbd7w\" (UID: \"d9df1470-48cd-4fb5-9710-be943e19f26c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hbd7w"
Apr 22 14:18:22.043082 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.043026 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/d9df1470-48cd-4fb5-9710-be943e19f26c-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-hbd7w\" (UID: \"d9df1470-48cd-4fb5-9710-be943e19f26c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hbd7w"
Apr 22 14:18:22.043082 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.043043 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d9df1470-48cd-4fb5-9710-be943e19f26c-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-hbd7w\" (UID: \"d9df1470-48cd-4fb5-9710-be943e19f26c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hbd7w"
Apr 22 14:18:22.045895 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.045873 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-wnzx7\""
Apr 22 14:18:22.045972 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.045872 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 22 14:18:22.045972 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.045938 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 22 14:18:22.046378 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.046363 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 22 14:18:22.144162 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.144141 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-node-exporter-tls\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.144274 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.144180 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d9df1470-48cd-4fb5-9710-be943e19f26c-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-hbd7w\" (UID: \"d9df1470-48cd-4fb5-9710-be943e19f26c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hbd7w"
Apr 22 14:18:22.144274 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.144206 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-sys\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.144274 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.144256 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vp4tn\" (UniqueName: \"kubernetes.io/projected/d9df1470-48cd-4fb5-9710-be943e19f26c-kube-api-access-vp4tn\") pod \"kube-state-metrics-69db897b98-hbd7w\" (UID: \"d9df1470-48cd-4fb5-9710-be943e19f26c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hbd7w"
Apr 22 14:18:22.144443 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.144330 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/d9df1470-48cd-4fb5-9710-be943e19f26c-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-hbd7w\" (UID: \"d9df1470-48cd-4fb5-9710-be943e19f26c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hbd7w"
Apr 22 14:18:22.144443 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.144357 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d9df1470-48cd-4fb5-9710-be943e19f26c-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-hbd7w\" (UID: \"d9df1470-48cd-4fb5-9710-be943e19f26c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hbd7w"
Apr 22 14:18:22.144443 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.144397 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-node-exporter-textfile\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.144443 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.144424 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-root\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.144658 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.144450 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7mww\" (UniqueName: \"kubernetes.io/projected/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-kube-api-access-x7mww\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.144658 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.144475 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-metrics-client-ca\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.144658 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.144527 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9df1470-48cd-4fb5-9710-be943e19f26c-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-hbd7w\" (UID: \"d9df1470-48cd-4fb5-9710-be943e19f26c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hbd7w"
Apr 22 14:18:22.144658 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.144588 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-node-exporter-accelerators-collector-config\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.144658 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.144624 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.144658 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.144651 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-node-exporter-wtmp\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.144972 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.144677 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/d9df1470-48cd-4fb5-9710-be943e19f26c-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-hbd7w\" (UID: \"d9df1470-48cd-4fb5-9710-be943e19f26c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hbd7w"
Apr 22 14:18:22.144972 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.144715 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/d9df1470-48cd-4fb5-9710-be943e19f26c-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-hbd7w\" (UID: \"d9df1470-48cd-4fb5-9710-be943e19f26c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hbd7w"
Apr 22 14:18:22.145174 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.145151 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d9df1470-48cd-4fb5-9710-be943e19f26c-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-hbd7w\" (UID: \"d9df1470-48cd-4fb5-9710-be943e19f26c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hbd7w"
Apr 22 14:18:22.145284 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.145268 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/d9df1470-48cd-4fb5-9710-be943e19f26c-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-hbd7w\" (UID: \"d9df1470-48cd-4fb5-9710-be943e19f26c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hbd7w"
Apr 22 14:18:22.146619 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.146599 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d9df1470-48cd-4fb5-9710-be943e19f26c-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-hbd7w\" (UID: \"d9df1470-48cd-4fb5-9710-be943e19f26c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hbd7w"
Apr 22 14:18:22.147215 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.147194 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/d9df1470-48cd-4fb5-9710-be943e19f26c-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-hbd7w\" (UID: \"d9df1470-48cd-4fb5-9710-be943e19f26c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hbd7w"
Apr 22 14:18:22.154842 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.154823 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vp4tn\" (UniqueName: \"kubernetes.io/projected/d9df1470-48cd-4fb5-9710-be943e19f26c-kube-api-access-vp4tn\") pod \"kube-state-metrics-69db897b98-hbd7w\" (UID: \"d9df1470-48cd-4fb5-9710-be943e19f26c\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-hbd7w"
Apr 22 14:18:22.245402 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.245380 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-node-exporter-textfile\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.245486 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.245405 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-root\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.245486 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.245421 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x7mww\" (UniqueName: \"kubernetes.io/projected/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-kube-api-access-x7mww\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.245486 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.245437 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-metrics-client-ca\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.245486 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.245462 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-node-exporter-accelerators-collector-config\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.245486 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.245483 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.245706 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.245485 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-root\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.245706 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.245506 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-node-exporter-wtmp\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.245706 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.245539 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-node-exporter-tls\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.245706 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.245565 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-sys\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.245706 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.245642 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-sys\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.245926 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:18:22.245713 2575 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 22 14:18:22.245926 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.245734 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-node-exporter-wtmp\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.245926 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.245787 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-node-exporter-textfile\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.245926 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:18:22.245805 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-node-exporter-tls podName:e0a1aa13-8dd5-4a73-abee-ccd132aef2c4 nodeName:}" failed. No retries permitted until 2026-04-22 14:18:22.745786847 +0000 UTC m=+177.677165738 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-node-exporter-tls") pod "node-exporter-5sxqr" (UID: "e0a1aa13-8dd5-4a73-abee-ccd132aef2c4") : secret "node-exporter-tls" not found
Apr 22 14:18:22.246084 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.246038 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-metrics-client-ca\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.246176 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.246159 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-node-exporter-accelerators-collector-config\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.247474 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.247459 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.254437 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.254412 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7mww\" (UniqueName: \"kubernetes.io/projected/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-kube-api-access-x7mww\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.320712 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.320637 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-hbd7w"
Apr 22 14:18:22.441950 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.441927 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-hbd7w"]
Apr 22 14:18:22.444270 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:18:22.444244 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9df1470_48cd_4fb5_9710_be943e19f26c.slice/crio-b0aa5598082e25aa15594970557733c701fa5002e4e74379a8121f9a2f8fef15 WatchSource:0}: Error finding container b0aa5598082e25aa15594970557733c701fa5002e4e74379a8121f9a2f8fef15: Status 404 returned error can't find the container with id b0aa5598082e25aa15594970557733c701fa5002e4e74379a8121f9a2f8fef15
Apr 22 14:18:22.749112 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.749082 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-node-exporter-tls\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.751215 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.751189 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/e0a1aa13-8dd5-4a73-abee-ccd132aef2c4-node-exporter-tls\") pod \"node-exporter-5sxqr\" (UID: \"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4\") " pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.951176 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:22.951145 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-5sxqr"
Apr 22 14:18:22.960449 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:18:22.960424 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0a1aa13_8dd5_4a73_abee_ccd132aef2c4.slice/crio-e25ff2efc7b810dd2cdb0f94aa1b6e0fc3d288e13efa748edc563be153cc7285 WatchSource:0}: Error finding container e25ff2efc7b810dd2cdb0f94aa1b6e0fc3d288e13efa748edc563be153cc7285: Status 404 returned error can't find the container with id e25ff2efc7b810dd2cdb0f94aa1b6e0fc3d288e13efa748edc563be153cc7285
Apr 22 14:18:23.095090 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:23.095011 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5sxqr" event={"ID":"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4","Type":"ContainerStarted","Data":"e25ff2efc7b810dd2cdb0f94aa1b6e0fc3d288e13efa748edc563be153cc7285"}
Apr 22 14:18:23.096223 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:23.096196 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-hbd7w" event={"ID":"d9df1470-48cd-4fb5-9710-be943e19f26c","Type":"ContainerStarted","Data":"b0aa5598082e25aa15594970557733c701fa5002e4e74379a8121f9a2f8fef15"}
Apr 22 14:18:23.556927 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:23.556894 2575 scope.go:117] "RemoveContainer" containerID="53d1abcef80803e2174399770ace0137620f424f08e9dd936bf811a00dddacc6"
Apr 22 14:18:24.100973 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:24.100952 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2lphb_ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa/console-operator/2.log"
Apr 22 14:18:24.101297 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:24.101064 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-2lphb" event={"ID":"ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa","Type":"ContainerStarted","Data":"2a82163026a81996cc9cb1abf95453a4bde78818791489fff49e987abbfea94f"}
Apr 22 14:18:24.101452 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:24.101416 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-2lphb"
Apr 22 14:18:24.102914 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:24.102876 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5sxqr" event={"ID":"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4","Type":"ContainerStarted","Data":"a9e710228b0d5ef6a081e2346b32491fb1cee7e6630616be49b89d45784f9890"}
Apr 22 14:18:24.104998 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:24.104973 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-hbd7w" event={"ID":"d9df1470-48cd-4fb5-9710-be943e19f26c","Type":"ContainerStarted","Data":"13000548a62c7e37ba8a58c30d8532a6c86b87a0022c001271829a98cbc88ace"}
Apr 22 14:18:24.105095 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:24.105002 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-hbd7w" event={"ID":"d9df1470-48cd-4fb5-9710-be943e19f26c","Type":"ContainerStarted","Data":"f9fef325927608d3d38e87c5cc8dd8fab66bd6c96cca5769a1a24dfb920075da"}
Apr 22 14:18:24.105095 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:24.105011 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-hbd7w" event={"ID":"d9df1470-48cd-4fb5-9710-be943e19f26c","Type":"ContainerStarted","Data":"900772a6080aaad2f5c3eb58681e9f66871541f716fcbd9847303107b943d592"}
Apr 22 14:18:24.126653 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:24.126607 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-2lphb" podStartSLOduration=40.961979899 podStartE2EDuration="43.126591719s" podCreationTimestamp="2026-04-22 14:17:41 +0000 UTC" firstStartedPulling="2026-04-22 14:17:41.858322082 +0000 UTC m=+136.789700962" lastFinishedPulling="2026-04-22 14:17:44.022933892 +0000 UTC m=+138.954312782" observedRunningTime="2026-04-22 14:18:24.125257564 +0000 UTC m=+179.056636465" watchObservedRunningTime="2026-04-22 14:18:24.126591719 +0000 UTC m=+179.057970623"
Apr 22 14:18:24.157923 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:24.157876 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-hbd7w" podStartSLOduration=2.03733829 podStartE2EDuration="3.157859354s" podCreationTimestamp="2026-04-22 14:18:21 +0000 UTC" firstStartedPulling="2026-04-22 14:18:22.446180373 +0000 UTC m=+177.377559259" lastFinishedPulling="2026-04-22 14:18:23.566701438 +0000 UTC m=+178.498080323" observedRunningTime="2026-04-22 14:18:24.156298348 +0000 UTC m=+179.087677275" watchObservedRunningTime="2026-04-22 14:18:24.157859354 +0000 UTC m=+179.089238258"
Apr 22 14:18:24.357928 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:24.357855 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-2lphb"
Apr 22 14:18:25.108887 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:25.108845 2575 generic.go:358] "Generic (PLEG): container finished" podID="e0a1aa13-8dd5-4a73-abee-ccd132aef2c4" containerID="a9e710228b0d5ef6a081e2346b32491fb1cee7e6630616be49b89d45784f9890" exitCode=0
Apr 22 14:18:25.109345 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:25.108962 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5sxqr" event={"ID":"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4","Type":"ContainerDied","Data":"a9e710228b0d5ef6a081e2346b32491fb1cee7e6630616be49b89d45784f9890"}
Apr 22 14:18:26.113769 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.113720 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5sxqr" event={"ID":"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4","Type":"ContainerStarted","Data":"91f21327e82b5458785047ad8101ea0d349834dd2a8ff0a39e84e3c20c5c7e51"}
Apr 22 14:18:26.113769 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.113774 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-5sxqr" event={"ID":"e0a1aa13-8dd5-4a73-abee-ccd132aef2c4","Type":"ContainerStarted","Data":"b3f1cf4f1a5ece5253b72f2d9357e0d13d340b5fb15c42302aaecf6983ee35d1"}
Apr 22 14:18:26.141178 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.141132 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-5sxqr" podStartSLOduration=3.084425865 podStartE2EDuration="4.141119543s" podCreationTimestamp="2026-04-22 14:18:22 +0000 UTC" firstStartedPulling="2026-04-22 14:18:22.962604728 +0000 UTC m=+177.893983614" lastFinishedPulling="2026-04-22 14:18:24.019298407 +0000 UTC m=+178.950677292" observedRunningTime="2026-04-22 14:18:26.140118404 +0000 UTC m=+181.071497297" watchObservedRunningTime="2026-04-22 14:18:26.141119543 +0000 UTC m=+181.072498493"
Apr 22 14:18:26.405374 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.405342 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm"]
Apr 22 14:18:26.410025 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.410000 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm"
Apr 22 14:18:26.413008 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.412984 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-9415th80tuf4u\""
Apr 22 14:18:26.413008 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.412999 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-sl5v6\""
Apr 22 14:18:26.413195 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.412989 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 22 14:18:26.413195 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.413076 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 22 14:18:26.413195 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.413089 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 22 14:18:26.413317 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.413219 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 22 14:18:26.419024 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.419006 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm"]
Apr 22 14:18:26.483663 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.483637 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk92m\" (UniqueName: \"kubernetes.io/projected/208bd1e8-feae-4ec2-8277-d633ae78860c-kube-api-access-dk92m\") pod \"metrics-server-54bcf9ffdd-qxdmm\" (UID: \"208bd1e8-feae-4ec2-8277-d633ae78860c\") " pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm"
Apr 22 14:18:26.483804 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.483685 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/208bd1e8-feae-4ec2-8277-d633ae78860c-audit-log\") pod \"metrics-server-54bcf9ffdd-qxdmm\" (UID: \"208bd1e8-feae-4ec2-8277-d633ae78860c\") " pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm"
Apr 22 14:18:26.483804 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.483707 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/208bd1e8-feae-4ec2-8277-d633ae78860c-secret-metrics-server-tls\") pod \"metrics-server-54bcf9ffdd-qxdmm\" (UID: \"208bd1e8-feae-4ec2-8277-d633ae78860c\") " pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm"
Apr 22 14:18:26.483804 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.483789 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/208bd1e8-feae-4ec2-8277-d633ae78860c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-54bcf9ffdd-qxdmm\" (UID: \"208bd1e8-feae-4ec2-8277-d633ae78860c\") " pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm"
Apr 22 14:18:26.483951 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.483834 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/208bd1e8-feae-4ec2-8277-d633ae78860c-metrics-server-audit-profiles\") pod \"metrics-server-54bcf9ffdd-qxdmm\" (UID: \"208bd1e8-feae-4ec2-8277-d633ae78860c\") " pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm"
Apr 22 14:18:26.483951 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.483856 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/208bd1e8-feae-4ec2-8277-d633ae78860c-secret-metrics-server-client-certs\") pod \"metrics-server-54bcf9ffdd-qxdmm\" (UID: \"208bd1e8-feae-4ec2-8277-d633ae78860c\") " pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm"
Apr 22 14:18:26.483951 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.483875 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/208bd1e8-feae-4ec2-8277-d633ae78860c-client-ca-bundle\") pod \"metrics-server-54bcf9ffdd-qxdmm\" (UID: \"208bd1e8-feae-4ec2-8277-d633ae78860c\") " pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm"
Apr 22 14:18:26.584162 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.584135 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/208bd1e8-feae-4ec2-8277-d633ae78860c-metrics-server-audit-profiles\") pod \"metrics-server-54bcf9ffdd-qxdmm\" (UID: \"208bd1e8-feae-4ec2-8277-d633ae78860c\") " pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm"
Apr 22 14:18:26.584281 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.584175 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/208bd1e8-feae-4ec2-8277-d633ae78860c-secret-metrics-server-client-certs\") pod \"metrics-server-54bcf9ffdd-qxdmm\" (UID: \"208bd1e8-feae-4ec2-8277-d633ae78860c\") " pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm"
Apr 22 14:18:26.584281 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.584203 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/208bd1e8-feae-4ec2-8277-d633ae78860c-client-ca-bundle\") pod \"metrics-server-54bcf9ffdd-qxdmm\" (UID: \"208bd1e8-feae-4ec2-8277-d633ae78860c\") " pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm"
Apr 22 14:18:26.584391 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.584316 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dk92m\" (UniqueName: \"kubernetes.io/projected/208bd1e8-feae-4ec2-8277-d633ae78860c-kube-api-access-dk92m\") pod \"metrics-server-54bcf9ffdd-qxdmm\" (UID: \"208bd1e8-feae-4ec2-8277-d633ae78860c\") " pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm"
Apr 22 14:18:26.584391 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.584377 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/208bd1e8-feae-4ec2-8277-d633ae78860c-audit-log\") pod \"metrics-server-54bcf9ffdd-qxdmm\" (UID: \"208bd1e8-feae-4ec2-8277-d633ae78860c\") " pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm"
Apr 22 14:18:26.584478 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.584411 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/208bd1e8-feae-4ec2-8277-d633ae78860c-secret-metrics-server-tls\") pod \"metrics-server-54bcf9ffdd-qxdmm\" (UID: \"208bd1e8-feae-4ec2-8277-d633ae78860c\") " pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm"
Apr 22 14:18:26.584478 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.584442 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/208bd1e8-feae-4ec2-8277-d633ae78860c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-54bcf9ffdd-qxdmm\" (UID: \"208bd1e8-feae-4ec2-8277-d633ae78860c\") " pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm"
Apr 22 14:18:26.584843 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.584816 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/208bd1e8-feae-4ec2-8277-d633ae78860c-audit-log\") pod \"metrics-server-54bcf9ffdd-qxdmm\" (UID: \"208bd1e8-feae-4ec2-8277-d633ae78860c\") " pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm"
Apr 22 14:18:26.585204 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.585183 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/208bd1e8-feae-4ec2-8277-d633ae78860c-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-54bcf9ffdd-qxdmm\" (UID: \"208bd1e8-feae-4ec2-8277-d633ae78860c\") " pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm"
Apr 22 14:18:26.585347 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.585325 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/208bd1e8-feae-4ec2-8277-d633ae78860c-metrics-server-audit-profiles\") pod \"metrics-server-54bcf9ffdd-qxdmm\" (UID: \"208bd1e8-feae-4ec2-8277-d633ae78860c\") " pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm"
Apr 22 14:18:26.586890 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.586862 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/208bd1e8-feae-4ec2-8277-d633ae78860c-client-ca-bundle\") pod \"metrics-server-54bcf9ffdd-qxdmm\" (UID: \"208bd1e8-feae-4ec2-8277-d633ae78860c\") " pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm"
Apr 22 14:18:26.586964 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.586921 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/208bd1e8-feae-4ec2-8277-d633ae78860c-secret-metrics-server-client-certs\") pod \"metrics-server-54bcf9ffdd-qxdmm\" (UID: \"208bd1e8-feae-4ec2-8277-d633ae78860c\") " pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm"
Apr 22 14:18:26.587102 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.587080 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/208bd1e8-feae-4ec2-8277-d633ae78860c-secret-metrics-server-tls\") pod \"metrics-server-54bcf9ffdd-qxdmm\" (UID: \"208bd1e8-feae-4ec2-8277-d633ae78860c\") " pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm"
Apr 22 14:18:26.592681 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.592657 2575 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk92m\" (UniqueName: \"kubernetes.io/projected/208bd1e8-feae-4ec2-8277-d633ae78860c-kube-api-access-dk92m\") pod \"metrics-server-54bcf9ffdd-qxdmm\" (UID: \"208bd1e8-feae-4ec2-8277-d633ae78860c\") " pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm" Apr 22 14:18:26.721930 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.721849 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm" Apr 22 14:18:26.883101 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:26.883067 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm"] Apr 22 14:18:26.885545 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:18:26.885514 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod208bd1e8_feae_4ec2_8277_d633ae78860c.slice/crio-0911db57b842aed0b36adf33c5c8f67a2cfabd1567875be73296f9a98a3172c9 WatchSource:0}: Error finding container 0911db57b842aed0b36adf33c5c8f67a2cfabd1567875be73296f9a98a3172c9: Status 404 returned error can't find the container with id 0911db57b842aed0b36adf33c5c8f67a2cfabd1567875be73296f9a98a3172c9 Apr 22 14:18:27.117947 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.117867 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm" event={"ID":"208bd1e8-feae-4ec2-8277-d633ae78860c","Type":"ContainerStarted","Data":"0911db57b842aed0b36adf33c5c8f67a2cfabd1567875be73296f9a98a3172c9"} Apr 22 14:18:27.168882 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.168853 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-855b7c769d-2gbh4"] Apr 22 14:18:27.173894 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.173876 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" Apr 22 14:18:27.176628 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.176607 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 22 14:18:27.176739 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.176652 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-qml5q\"" Apr 22 14:18:27.176739 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.176723 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 22 14:18:27.176739 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.176735 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 22 14:18:27.176911 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.176897 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 22 14:18:27.177175 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.177159 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 22 14:18:27.182631 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.182600 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 22 14:18:27.184369 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.184349 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-855b7c769d-2gbh4"] Apr 22 14:18:27.190591 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.189176 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ff041a88-ef61-414c-b1af-87cc9f897125-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-855b7c769d-2gbh4\" (UID: \"ff041a88-ef61-414c-b1af-87cc9f897125\") " pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" Apr 22 14:18:27.190591 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.189218 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff041a88-ef61-414c-b1af-87cc9f897125-telemeter-trusted-ca-bundle\") pod \"telemeter-client-855b7c769d-2gbh4\" (UID: \"ff041a88-ef61-414c-b1af-87cc9f897125\") " pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" Apr 22 14:18:27.190591 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.189251 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ff041a88-ef61-414c-b1af-87cc9f897125-metrics-client-ca\") pod \"telemeter-client-855b7c769d-2gbh4\" (UID: \"ff041a88-ef61-414c-b1af-87cc9f897125\") " pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" Apr 22 14:18:27.190591 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.189281 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: 
\"kubernetes.io/secret/ff041a88-ef61-414c-b1af-87cc9f897125-federate-client-tls\") pod \"telemeter-client-855b7c769d-2gbh4\" (UID: \"ff041a88-ef61-414c-b1af-87cc9f897125\") " pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" Apr 22 14:18:27.190591 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.189319 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ff041a88-ef61-414c-b1af-87cc9f897125-telemeter-client-tls\") pod \"telemeter-client-855b7c769d-2gbh4\" (UID: \"ff041a88-ef61-414c-b1af-87cc9f897125\") " pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" Apr 22 14:18:27.190591 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.189359 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ff041a88-ef61-414c-b1af-87cc9f897125-secret-telemeter-client\") pod \"telemeter-client-855b7c769d-2gbh4\" (UID: \"ff041a88-ef61-414c-b1af-87cc9f897125\") " pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" Apr 22 14:18:27.190591 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.189428 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfvm2\" (UniqueName: \"kubernetes.io/projected/ff041a88-ef61-414c-b1af-87cc9f897125-kube-api-access-vfvm2\") pod \"telemeter-client-855b7c769d-2gbh4\" (UID: \"ff041a88-ef61-414c-b1af-87cc9f897125\") " pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" Apr 22 14:18:27.190591 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.189496 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff041a88-ef61-414c-b1af-87cc9f897125-serving-certs-ca-bundle\") pod \"telemeter-client-855b7c769d-2gbh4\" (UID: \"ff041a88-ef61-414c-b1af-87cc9f897125\") " pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" Apr 22 14:18:27.289878 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.289847 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff041a88-ef61-414c-b1af-87cc9f897125-serving-certs-ca-bundle\") pod \"telemeter-client-855b7c769d-2gbh4\" (UID: \"ff041a88-ef61-414c-b1af-87cc9f897125\") " pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" Apr 22 14:18:27.289878 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.289890 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ff041a88-ef61-414c-b1af-87cc9f897125-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-855b7c769d-2gbh4\" (UID: \"ff041a88-ef61-414c-b1af-87cc9f897125\") " pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" Apr 22 14:18:27.290104 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.289923 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff041a88-ef61-414c-b1af-87cc9f897125-telemeter-trusted-ca-bundle\") pod \"telemeter-client-855b7c769d-2gbh4\" (UID: \"ff041a88-ef61-414c-b1af-87cc9f897125\") " pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" Apr 22 14:18:27.290104 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.289955 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ff041a88-ef61-414c-b1af-87cc9f897125-metrics-client-ca\") pod \"telemeter-client-855b7c769d-2gbh4\" (UID: \"ff041a88-ef61-414c-b1af-87cc9f897125\") " pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" Apr 22 14:18:27.290104 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.289981 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ff041a88-ef61-414c-b1af-87cc9f897125-federate-client-tls\") pod \"telemeter-client-855b7c769d-2gbh4\" (UID: \"ff041a88-ef61-414c-b1af-87cc9f897125\") " pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" Apr 22 14:18:27.290104 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.290020 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ff041a88-ef61-414c-b1af-87cc9f897125-telemeter-client-tls\") pod \"telemeter-client-855b7c769d-2gbh4\" (UID: \"ff041a88-ef61-414c-b1af-87cc9f897125\") " pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" Apr 22 14:18:27.290317 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.290173 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ff041a88-ef61-414c-b1af-87cc9f897125-secret-telemeter-client\") pod \"telemeter-client-855b7c769d-2gbh4\" (UID: \"ff041a88-ef61-414c-b1af-87cc9f897125\") " pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" Apr 22 14:18:27.290317 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.290233 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vfvm2\" (UniqueName: \"kubernetes.io/projected/ff041a88-ef61-414c-b1af-87cc9f897125-kube-api-access-vfvm2\") pod \"telemeter-client-855b7c769d-2gbh4\" (UID: \"ff041a88-ef61-414c-b1af-87cc9f897125\") " pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" Apr 22 14:18:27.290812 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.290681 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff041a88-ef61-414c-b1af-87cc9f897125-serving-certs-ca-bundle\") pod \"telemeter-client-855b7c769d-2gbh4\" (UID: \"ff041a88-ef61-414c-b1af-87cc9f897125\") " pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" Apr 22 14:18:27.290812 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.290778 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ff041a88-ef61-414c-b1af-87cc9f897125-metrics-client-ca\") pod \"telemeter-client-855b7c769d-2gbh4\" (UID: \"ff041a88-ef61-414c-b1af-87cc9f897125\") " pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" Apr 22 14:18:27.291067 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.291045 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff041a88-ef61-414c-b1af-87cc9f897125-telemeter-trusted-ca-bundle\") pod \"telemeter-client-855b7c769d-2gbh4\" (UID: \"ff041a88-ef61-414c-b1af-87cc9f897125\") " pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" Apr 22 14:18:27.292646 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.292624 2575 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/ff041a88-ef61-414c-b1af-87cc9f897125-federate-client-tls\") pod \"telemeter-client-855b7c769d-2gbh4\" (UID: \"ff041a88-ef61-414c-b1af-87cc9f897125\") " pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" Apr 22 14:18:27.292772 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.292731 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/ff041a88-ef61-414c-b1af-87cc9f897125-secret-telemeter-client\") pod \"telemeter-client-855b7c769d-2gbh4\" (UID: \"ff041a88-ef61-414c-b1af-87cc9f897125\") " pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" Apr 22 14:18:27.292906 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.292892 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ff041a88-ef61-414c-b1af-87cc9f897125-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-855b7c769d-2gbh4\" (UID: \"ff041a88-ef61-414c-b1af-87cc9f897125\") " pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" Apr 22 14:18:27.292940 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.292927 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/ff041a88-ef61-414c-b1af-87cc9f897125-telemeter-client-tls\") pod \"telemeter-client-855b7c769d-2gbh4\" (UID: \"ff041a88-ef61-414c-b1af-87cc9f897125\") " pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" Apr 22 14:18:27.298971 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.298954 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfvm2\" (UniqueName: \"kubernetes.io/projected/ff041a88-ef61-414c-b1af-87cc9f897125-kube-api-access-vfvm2\") pod \"telemeter-client-855b7c769d-2gbh4\" (UID: \"ff041a88-ef61-414c-b1af-87cc9f897125\") " pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" Apr 22 14:18:27.484605 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.484565 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" Apr 22 14:18:27.621891 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:27.621785 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-855b7c769d-2gbh4"] Apr 22 14:18:27.624162 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:18:27.624125 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff041a88_ef61_414c_b1af_87cc9f897125.slice/crio-e79ba31fe802b6ad4fd797889c30dec4f3866b3cbe6056859c13e427b79e6a89 WatchSource:0}: Error finding container e79ba31fe802b6ad4fd797889c30dec4f3866b3cbe6056859c13e427b79e6a89: Status 404 returned error can't find the container with id e79ba31fe802b6ad4fd797889c30dec4f3866b3cbe6056859c13e427b79e6a89 Apr 22 14:18:28.125097 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:28.125058 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" event={"ID":"ff041a88-ef61-414c-b1af-87cc9f897125","Type":"ContainerStarted","Data":"e79ba31fe802b6ad4fd797889c30dec4f3866b3cbe6056859c13e427b79e6a89"} Apr 22 14:18:29.129447 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:29.129413 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm" event={"ID":"208bd1e8-feae-4ec2-8277-d633ae78860c","Type":"ContainerStarted","Data":"b9eebf1e2b32ce9f1483e5278d8d2e8ac34c84855b815d2d84f3436abb87c653"} Apr 22 14:18:29.152903 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:29.152850 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm" podStartSLOduration=1.470990955 podStartE2EDuration="3.152834943s" podCreationTimestamp="2026-04-22 14:18:26 +0000 UTC" firstStartedPulling="2026-04-22 14:18:26.887333349 +0000 UTC m=+181.818712235" lastFinishedPulling="2026-04-22 14:18:28.569177336 +0000 UTC m=+183.500556223" observedRunningTime="2026-04-22 14:18:29.152041304 +0000 UTC m=+184.083420210" watchObservedRunningTime="2026-04-22 14:18:29.152834943 +0000 UTC m=+184.084213849" Apr 22 14:18:30.134110 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:30.134073 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" event={"ID":"ff041a88-ef61-414c-b1af-87cc9f897125","Type":"ContainerStarted","Data":"1f74433d48f9bd2f78ab709a5bfebde9400068ebd55c7f2ab2b1a16cd4b5ee81"} Apr 22 14:18:31.138704 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:31.138665 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" event={"ID":"ff041a88-ef61-414c-b1af-87cc9f897125","Type":"ContainerStarted","Data":"808b0b7c08823dcc1bb3138d63fb7a3d01674bf0bdb3759c4ab2bfc5d811922d"} Apr 22 14:18:31.138704 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:31.138707 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" event={"ID":"ff041a88-ef61-414c-b1af-87cc9f897125","Type":"ContainerStarted","Data":"ecb9ad38c950c8f97613c8b8d8a2cf2543a972bd1a6e3addf51cf6b374e2555a"} Apr 22 14:18:31.169042 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:31.168998 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-855b7c769d-2gbh4" podStartSLOduration=1.2736456760000001 podStartE2EDuration="4.16898108s" 
podCreationTimestamp="2026-04-22 14:18:27 +0000 UTC" firstStartedPulling="2026-04-22 14:18:27.626279045 +0000 UTC m=+182.557657938" lastFinishedPulling="2026-04-22 14:18:30.521614454 +0000 UTC m=+185.452993342" observedRunningTime="2026-04-22 14:18:31.167577703 +0000 UTC m=+186.098956632" watchObservedRunningTime="2026-04-22 14:18:31.16898108 +0000 UTC m=+186.100359983" Apr 22 14:18:33.065877 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:33.065849 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-54d7ffcdcc-8rpqd" Apr 22 14:18:35.942541 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:35.942480 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" podUID="c1963949-20fe-4269-990d-7aa35e49f968" containerName="registry" containerID="cri-o://143909fe1856a11f3c29871729274ff75a41848f51f343d45a4fa1635660cbe2" gracePeriod=30 Apr 22 14:18:36.156870 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:36.156837 2575 generic.go:358] "Generic (PLEG): container finished" podID="c1963949-20fe-4269-990d-7aa35e49f968" containerID="143909fe1856a11f3c29871729274ff75a41848f51f343d45a4fa1635660cbe2" exitCode=0 Apr 22 14:18:36.157030 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:36.156918 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" event={"ID":"c1963949-20fe-4269-990d-7aa35e49f968","Type":"ContainerDied","Data":"143909fe1856a11f3c29871729274ff75a41848f51f343d45a4fa1635660cbe2"} Apr 22 14:18:36.192147 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:36.192129 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:18:36.263338 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:36.263264 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c1963949-20fe-4269-990d-7aa35e49f968-image-registry-private-configuration\") pod \"c1963949-20fe-4269-990d-7aa35e49f968\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " Apr 22 14:18:36.263338 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:36.263311 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-registry-tls\") pod \"c1963949-20fe-4269-990d-7aa35e49f968\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " Apr 22 14:18:36.263537 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:36.263347 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n2kkj\" (UniqueName: \"kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-kube-api-access-n2kkj\") pod \"c1963949-20fe-4269-990d-7aa35e49f968\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " Apr 22 14:18:36.263537 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:36.263434 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c1963949-20fe-4269-990d-7aa35e49f968-registry-certificates\") pod \"c1963949-20fe-4269-990d-7aa35e49f968\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " Apr 22 14:18:36.263537 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:36.263500 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1963949-20fe-4269-990d-7aa35e49f968-trusted-ca\") pod \"c1963949-20fe-4269-990d-7aa35e49f968\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " Apr 22 14:18:36.263692 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:36.263540 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c1963949-20fe-4269-990d-7aa35e49f968-ca-trust-extracted\") pod \"c1963949-20fe-4269-990d-7aa35e49f968\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " Apr 22 14:18:36.263692 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:36.263586 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-bound-sa-token\") pod \"c1963949-20fe-4269-990d-7aa35e49f968\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " Apr 22 14:18:36.263692 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:36.263615 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c1963949-20fe-4269-990d-7aa35e49f968-installation-pull-secrets\") pod \"c1963949-20fe-4269-990d-7aa35e49f968\" (UID: \"c1963949-20fe-4269-990d-7aa35e49f968\") " Apr 22 14:18:36.263970 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:36.263904 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1963949-20fe-4269-990d-7aa35e49f968-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c1963949-20fe-4269-990d-7aa35e49f968" (UID: "c1963949-20fe-4269-990d-7aa35e49f968"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:18:36.263970 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:36.263925 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c1963949-20fe-4269-990d-7aa35e49f968-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c1963949-20fe-4269-990d-7aa35e49f968" (UID: "c1963949-20fe-4269-990d-7aa35e49f968"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:18:36.265731 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:36.265703 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-kube-api-access-n2kkj" (OuterVolumeSpecName: "kube-api-access-n2kkj") pod "c1963949-20fe-4269-990d-7aa35e49f968" (UID: "c1963949-20fe-4269-990d-7aa35e49f968"). InnerVolumeSpecName "kube-api-access-n2kkj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:18:36.266100 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:36.266063 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c1963949-20fe-4269-990d-7aa35e49f968" (UID: "c1963949-20fe-4269-990d-7aa35e49f968"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:18:36.268189 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:36.268165 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1963949-20fe-4269-990d-7aa35e49f968-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "c1963949-20fe-4269-990d-7aa35e49f968" (UID: "c1963949-20fe-4269-990d-7aa35e49f968"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:18:36.268310 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:36.268291 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "c1963949-20fe-4269-990d-7aa35e49f968" (UID: "c1963949-20fe-4269-990d-7aa35e49f968"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:18:36.268419 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:36.268395 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c1963949-20fe-4269-990d-7aa35e49f968-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c1963949-20fe-4269-990d-7aa35e49f968" (UID: "c1963949-20fe-4269-990d-7aa35e49f968"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:18:36.273914 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:36.273889 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c1963949-20fe-4269-990d-7aa35e49f968-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c1963949-20fe-4269-990d-7aa35e49f968" (UID: "c1963949-20fe-4269-990d-7aa35e49f968"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:18:36.365206 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:36.365174 2575 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c1963949-20fe-4269-990d-7aa35e49f968-image-registry-private-configuration\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 22 14:18:36.365206 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:36.365202 2575 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-registry-tls\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 22 14:18:36.365206 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:36.365212 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-n2kkj\" (UniqueName: \"kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-kube-api-access-n2kkj\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 22 14:18:36.365456 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:36.365222 2575 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c1963949-20fe-4269-990d-7aa35e49f968-registry-certificates\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 22 14:18:36.365456 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:36.365231 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c1963949-20fe-4269-990d-7aa35e49f968-trusted-ca\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 22 14:18:36.365456 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:36.365239 2575 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c1963949-20fe-4269-990d-7aa35e49f968-ca-trust-extracted\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 22 14:18:36.365456 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:36.365248 2575 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c1963949-20fe-4269-990d-7aa35e49f968-bound-sa-token\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 22 14:18:36.365456 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:36.365256 2575 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c1963949-20fe-4269-990d-7aa35e49f968-installation-pull-secrets\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 22 14:18:37.164797 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:37.164765 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" event={"ID":"c1963949-20fe-4269-990d-7aa35e49f968","Type":"ContainerDied","Data":"fda88d73f9964c226d4a987d15e4a6804fda16141f60b0f2d7da0a878f034a44"} Apr 22 14:18:37.165187 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:37.164807 2575 scope.go:117] "RemoveContainer" containerID="143909fe1856a11f3c29871729274ff75a41848f51f343d45a4fa1635660cbe2" Apr 22 14:18:37.165187 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:37.164776 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6b48cf5d46-hdtb2" Apr 22 14:18:37.186922 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:37.186900 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6b48cf5d46-hdtb2"] Apr 22 14:18:37.193745 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:37.193725 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6b48cf5d46-hdtb2"] Apr 22 14:18:37.561231 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:37.561151 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1963949-20fe-4269-990d-7aa35e49f968" path="/var/lib/kubelet/pods/c1963949-20fe-4269-990d-7aa35e49f968/volumes" Apr 22 14:18:38.451869 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.451843 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-c54bcd88d-mhntt"] Apr 22 14:18:38.452265 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.452100 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c1963949-20fe-4269-990d-7aa35e49f968" containerName="registry" Apr 22 14:18:38.452265 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.452110 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1963949-20fe-4269-990d-7aa35e49f968" containerName="registry" Apr 22 14:18:38.452265 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.452169 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="c1963949-20fe-4269-990d-7aa35e49f968" containerName="registry" Apr 22 14:18:38.456949 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.456927 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:18:38.459680 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.459655 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 14:18:38.460973 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.460888 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 22 14:18:38.461080 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.460984 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 14:18:38.461080 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.460992 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-nhxxm\"" Apr 22 14:18:38.461189 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.461089 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 14:18:38.461189 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.461094 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 22 14:18:38.461189 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.461094 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 14:18:38.461384 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.461364 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 14:18:38.465721 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.465700 2575 reflector.go:430] 
"Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 14:18:38.466695 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.466677 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c54bcd88d-mhntt"] Apr 22 14:18:38.480370 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.480347 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-console-oauth-config\") pod \"console-c54bcd88d-mhntt\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:18:38.480467 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.480386 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j99wc\" (UniqueName: \"kubernetes.io/projected/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-kube-api-access-j99wc\") pod \"console-c54bcd88d-mhntt\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:18:38.480467 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.480443 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-oauth-serving-cert\") pod \"console-c54bcd88d-mhntt\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:18:38.480600 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.480562 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-trusted-ca-bundle\") pod \"console-c54bcd88d-mhntt\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:18:38.480651 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.480594 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-console-serving-cert\") pod \"console-c54bcd88d-mhntt\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:18:38.480651 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.480634 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-console-config\") pod \"console-c54bcd88d-mhntt\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:18:38.480798 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.480769 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-service-ca\") pod \"console-c54bcd88d-mhntt\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:18:38.581670 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.581586 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-console-oauth-config\") pod \"console-c54bcd88d-mhntt\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:18:38.581802 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.581702 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j99wc\" (UniqueName: \"kubernetes.io/projected/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-kube-api-access-j99wc\") pod \"console-c54bcd88d-mhntt\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:18:38.581802 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.581781 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-oauth-serving-cert\") pod \"console-c54bcd88d-mhntt\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:18:38.581910 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.581816 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-trusted-ca-bundle\") pod \"console-c54bcd88d-mhntt\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:18:38.581910 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.581847 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-console-serving-cert\") pod \"console-c54bcd88d-mhntt\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:18:38.581910 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.581873 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-console-config\") pod \"console-c54bcd88d-mhntt\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:18:38.582054 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.581907 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-service-ca\") pod \"console-c54bcd88d-mhntt\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:18:38.582447 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.582427 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-oauth-serving-cert\") pod \"console-c54bcd88d-mhntt\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:18:38.582606 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.582587 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-console-config\") pod \"console-c54bcd88d-mhntt\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:18:38.582695 ip-10-0-130-98 kubenswrapper[2575]: I0422 
14:18:38.582674 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-trusted-ca-bundle\") pod \"console-c54bcd88d-mhntt\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:18:38.583179 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.583147 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-service-ca\") pod \"console-c54bcd88d-mhntt\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:18:38.584479 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.584454 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-console-oauth-config\") pod \"console-c54bcd88d-mhntt\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:18:38.584684 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.584658 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-console-serving-cert\") pod \"console-c54bcd88d-mhntt\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:18:38.590003 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.589974 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j99wc\" (UniqueName: \"kubernetes.io/projected/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-kube-api-access-j99wc\") pod \"console-c54bcd88d-mhntt\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:18:38.767481 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.767419 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:18:38.886272 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:38.886238 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-c54bcd88d-mhntt"] Apr 22 14:18:38.889302 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:18:38.889272 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode93cbaea_aa0a_40ba_aa49_0c76d3fdd6ba.slice/crio-7e4cdcccf416505384dd0df32ecb50f4a18829c5289d5be50c4f3c88f62c3fb4 WatchSource:0}: Error finding container 7e4cdcccf416505384dd0df32ecb50f4a18829c5289d5be50c4f3c88f62c3fb4: Status 404 returned error can't find the container with id 7e4cdcccf416505384dd0df32ecb50f4a18829c5289d5be50c4f3c88f62c3fb4 Apr 22 14:18:39.172281 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:39.172251 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c54bcd88d-mhntt" event={"ID":"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba","Type":"ContainerStarted","Data":"7e4cdcccf416505384dd0df32ecb50f4a18829c5289d5be50c4f3c88f62c3fb4"} Apr 22 14:18:42.182648 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:42.182610 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c54bcd88d-mhntt" event={"ID":"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba","Type":"ContainerStarted","Data":"b8a2d942d5ac5f90abcb375469180c121f0f977cfb96f6307eeb3cf9497f8f37"} Apr 22 14:18:42.203015 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:42.202969 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-c54bcd88d-mhntt" podStartSLOduration=1.627407973 podStartE2EDuration="4.202954687s" podCreationTimestamp="2026-04-22 14:18:38 +0000 UTC" firstStartedPulling="2026-04-22 14:18:38.891142564 +0000 UTC m=+193.822521445" lastFinishedPulling="2026-04-22 14:18:41.466689274 +0000 UTC m=+196.398068159" observedRunningTime="2026-04-22 14:18:42.201907406 +0000 UTC m=+197.133286326" watchObservedRunningTime="2026-04-22 14:18:42.202954687 +0000 UTC m=+197.134333589" Apr 22 14:18:46.722844 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:46.722802 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm" Apr 22 14:18:46.722844 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:46.722852 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm" Apr 22 14:18:48.768506 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:48.768472 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:18:48.768992 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:48.768515 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:18:48.772845 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:48.772822 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:18:49.206366 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:49.206339 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:18:55.221433 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:55.221400 2575 generic.go:358] "Generic (PLEG): container finished" 
podID="f4c9f80e-9e6b-4087-b460-c87423e02659" containerID="844bf22e6f996cc0b45d27b384fdd248ce977901a30f8131adf925fdba09f609" exitCode=0 Apr 22 14:18:55.221811 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:55.221468 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9zbk9" event={"ID":"f4c9f80e-9e6b-4087-b460-c87423e02659","Type":"ContainerDied","Data":"844bf22e6f996cc0b45d27b384fdd248ce977901a30f8131adf925fdba09f609"} Apr 22 14:18:55.221811 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:55.221738 2575 scope.go:117] "RemoveContainer" containerID="844bf22e6f996cc0b45d27b384fdd248ce977901a30f8131adf925fdba09f609" Apr 22 14:18:56.226332 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:18:56.226299 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-9zbk9" event={"ID":"f4c9f80e-9e6b-4087-b460-c87423e02659","Type":"ContainerStarted","Data":"b6ff76820a107488f2c8af17201219bb25d017118998ca4488768d49fd866abf"} Apr 22 14:19:06.727772 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:06.727730 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm" Apr 22 14:19:06.731543 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:06.731524 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-54bcf9ffdd-qxdmm" Apr 22 14:19:37.335576 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:37.335498 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs\") pod \"network-metrics-daemon-swv2n\" (UID: \"1faf2ada-1177-442f-9ee9-4ecd9697e349\") " pod="openshift-multus/network-metrics-daemon-swv2n" Apr 22 14:19:37.338162 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:37.338138 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1faf2ada-1177-442f-9ee9-4ecd9697e349-metrics-certs\") pod \"network-metrics-daemon-swv2n\" (UID: \"1faf2ada-1177-442f-9ee9-4ecd9697e349\") " pod="openshift-multus/network-metrics-daemon-swv2n" Apr 22 14:19:37.561082 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:37.561055 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-r76nx\"" Apr 22 14:19:37.568405 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:37.568384 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-swv2n" Apr 22 14:19:37.680775 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:37.680737 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-swv2n"] Apr 22 14:19:37.683116 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:19:37.683091 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1faf2ada_1177_442f_9ee9_4ecd9697e349.slice/crio-41dd664b6e67e33a642e1311223e4fe2d01acb909c2d763fd6e765a615f61919 WatchSource:0}: Error finding container 41dd664b6e67e33a642e1311223e4fe2d01acb909c2d763fd6e765a615f61919: Status 404 returned error can't find the container with id 41dd664b6e67e33a642e1311223e4fe2d01acb909c2d763fd6e765a615f61919 Apr 22 14:19:38.341330 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:38.341297 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-swv2n" event={"ID":"1faf2ada-1177-442f-9ee9-4ecd9697e349","Type":"ContainerStarted","Data":"41dd664b6e67e33a642e1311223e4fe2d01acb909c2d763fd6e765a615f61919"} Apr 22 14:19:39.345295 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:39.345255 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-swv2n" event={"ID":"1faf2ada-1177-442f-9ee9-4ecd9697e349","Type":"ContainerStarted","Data":"c0830d354fbda29a42dabd1e42e946db1b830d690c6c9ca590c7c9da22431593"} Apr 22 14:19:39.345295 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:39.345292 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-swv2n" event={"ID":"1faf2ada-1177-442f-9ee9-4ecd9697e349","Type":"ContainerStarted","Data":"a630a5f9dbd31ba395701bd3acf2a13ae73b58ca4a6a59e8bf63d9bf18a92e8f"} Apr 22 14:19:39.363244 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:39.363199 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-swv2n" podStartSLOduration=253.441965794 podStartE2EDuration="4m14.36318594s" podCreationTimestamp="2026-04-22 14:15:25 +0000 UTC" firstStartedPulling="2026-04-22 14:19:37.684831963 +0000 UTC m=+252.616210844" lastFinishedPulling="2026-04-22 14:19:38.606052106 +0000 UTC m=+253.537430990" observedRunningTime="2026-04-22 14:19:39.362200127 +0000 UTC m=+254.293579032" watchObservedRunningTime="2026-04-22 14:19:39.36318594 +0000 UTC m=+254.294564842" Apr 22 14:19:48.341101 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:48.341070 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-589f6fbd59-wpphn"] Apr 22 14:19:48.343159 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:48.343144 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:19:48.356804 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:48.356783 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-589f6fbd59-wpphn"] Apr 22 14:19:48.424565 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:48.424529 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e42c29c3-328c-4626-a9ab-e26b2e8f4036-console-config\") pod \"console-589f6fbd59-wpphn\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:19:48.424716 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:48.424573 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e42c29c3-328c-4626-a9ab-e26b2e8f4036-console-oauth-config\") pod \"console-589f6fbd59-wpphn\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:19:48.424716 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:48.424595 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e42c29c3-328c-4626-a9ab-e26b2e8f4036-service-ca\") pod \"console-589f6fbd59-wpphn\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:19:48.424716 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:48.424656 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e42c29c3-328c-4626-a9ab-e26b2e8f4036-console-serving-cert\") pod \"console-589f6fbd59-wpphn\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:19:48.424716 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:48.424681 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e42c29c3-328c-4626-a9ab-e26b2e8f4036-oauth-serving-cert\") pod \"console-589f6fbd59-wpphn\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:19:48.424716 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:48.424698 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q288v\" (UniqueName: \"kubernetes.io/projected/e42c29c3-328c-4626-a9ab-e26b2e8f4036-kube-api-access-q288v\") pod \"console-589f6fbd59-wpphn\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:19:48.424932 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:48.424727 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e42c29c3-328c-4626-a9ab-e26b2e8f4036-trusted-ca-bundle\") pod \"console-589f6fbd59-wpphn\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:19:48.526065 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:48.526038 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e42c29c3-328c-4626-a9ab-e26b2e8f4036-console-serving-cert\") pod \"console-589f6fbd59-wpphn\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:19:48.526065 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:48.526068 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e42c29c3-328c-4626-a9ab-e26b2e8f4036-oauth-serving-cert\") pod \"console-589f6fbd59-wpphn\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:19:48.526245 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:48.526084 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q288v\" (UniqueName: \"kubernetes.io/projected/e42c29c3-328c-4626-a9ab-e26b2e8f4036-kube-api-access-q288v\") pod \"console-589f6fbd59-wpphn\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:19:48.526245 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:48.526108 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e42c29c3-328c-4626-a9ab-e26b2e8f4036-trusted-ca-bundle\") pod \"console-589f6fbd59-wpphn\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:19:48.526245 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:48.526131 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e42c29c3-328c-4626-a9ab-e26b2e8f4036-console-config\") pod \"console-589f6fbd59-wpphn\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:19:48.526383 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:48.526305 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e42c29c3-328c-4626-a9ab-e26b2e8f4036-console-oauth-config\") pod \"console-589f6fbd59-wpphn\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:19:48.526383 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:48.526344 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e42c29c3-328c-4626-a9ab-e26b2e8f4036-service-ca\") pod \"console-589f6fbd59-wpphn\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:19:48.526822 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:48.526802 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e42c29c3-328c-4626-a9ab-e26b2e8f4036-console-config\") pod \"console-589f6fbd59-wpphn\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:19:48.526924 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:48.526862 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e42c29c3-328c-4626-a9ab-e26b2e8f4036-oauth-serving-cert\") pod \"console-589f6fbd59-wpphn\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:19:48.527009 ip-10-0-130-98 kubenswrapper[2575]: 
I0422 14:19:48.526989 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e42c29c3-328c-4626-a9ab-e26b2e8f4036-trusted-ca-bundle\") pod \"console-589f6fbd59-wpphn\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:19:48.527060 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:48.527007 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e42c29c3-328c-4626-a9ab-e26b2e8f4036-service-ca\") pod \"console-589f6fbd59-wpphn\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:19:48.528613 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:48.528585 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e42c29c3-328c-4626-a9ab-e26b2e8f4036-console-oauth-config\") pod \"console-589f6fbd59-wpphn\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:19:48.528716 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:48.528685 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e42c29c3-328c-4626-a9ab-e26b2e8f4036-console-serving-cert\") pod \"console-589f6fbd59-wpphn\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:19:48.535154 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:48.535135 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q288v\" (UniqueName: \"kubernetes.io/projected/e42c29c3-328c-4626-a9ab-e26b2e8f4036-kube-api-access-q288v\") pod \"console-589f6fbd59-wpphn\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:19:48.653542 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:48.653519 2575 util.go:30] "No sandbox for pod can be found. 
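Every volume entry above carries a UniqueName of the form kubernetes.io/<plugin>/<pod-UID>-<volume-name>, which is how the same seven console volumes are tracked unambiguously through the Verify, MountVolume, and SetUp phases. A trivial helper showing the pattern as it appears in these entries (the function is illustrative, not a kubelet API):

    package main

    import "fmt"

    // uniqueVolumeName rebuilds the UniqueName pattern visible in the
    // reconciler entries: kubernetes.io/<plugin>/<podUID>-<volumeName>.
    func uniqueVolumeName(plugin, podUID, volume string) string {
        return fmt.Sprintf("kubernetes.io/%s/%s-%s", plugin, podUID, volume)
    }

    func main() {
        fmt.Println(uniqueVolumeName("secret", "e42c29c3-328c-4626-a9ab-e26b2e8f4036", "console-serving-cert"))
    }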
Need to start a new one" pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:19:48.767411 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:48.767381 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-589f6fbd59-wpphn"] Apr 22 14:19:48.770994 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:19:48.770953 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode42c29c3_328c_4626_a9ab_e26b2e8f4036.slice/crio-1be676421cc9ea011da728dea403928efaf5fd5b7a9d06f37a7073eb74e3ab04 WatchSource:0}: Error finding container 1be676421cc9ea011da728dea403928efaf5fd5b7a9d06f37a7073eb74e3ab04: Status 404 returned error can't find the container with id 1be676421cc9ea011da728dea403928efaf5fd5b7a9d06f37a7073eb74e3ab04 Apr 22 14:19:49.374990 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:49.374957 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-589f6fbd59-wpphn" event={"ID":"e42c29c3-328c-4626-a9ab-e26b2e8f4036","Type":"ContainerStarted","Data":"d74bd2aba84dce3631dcda91812997001725278f2a25f8e2eb13ffa68367e0f2"} Apr 22 14:19:49.374990 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:49.374993 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-589f6fbd59-wpphn" event={"ID":"e42c29c3-328c-4626-a9ab-e26b2e8f4036","Type":"ContainerStarted","Data":"1be676421cc9ea011da728dea403928efaf5fd5b7a9d06f37a7073eb74e3ab04"} Apr 22 14:19:49.391985 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:49.391939 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-589f6fbd59-wpphn" podStartSLOduration=1.391921115 podStartE2EDuration="1.391921115s" podCreationTimestamp="2026-04-22 14:19:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:19:49.391656414 +0000 UTC m=+264.323035329" watchObservedRunningTime="2026-04-22 14:19:49.391921115 +0000 UTC m=+264.323300061" Apr 22 14:19:58.654122 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:58.654087 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:19:58.654122 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:58.654123 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:19:58.658906 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:58.658883 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:19:59.409742 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:59.409715 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:19:59.450577 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:19:59.450551 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c54bcd88d-mhntt"] Apr 22 14:20:24.470098 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:24.470045 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-c54bcd88d-mhntt" podUID="e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba" containerName="console" containerID="cri-o://b8a2d942d5ac5f90abcb375469180c121f0f977cfb96f6307eeb3cf9497f8f37" gracePeriod=15 Apr 22 14:20:24.718497 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:24.718470 2575 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c54bcd88d-mhntt_e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba/console/0.log" Apr 22 14:20:24.718599 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:24.718533 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:20:24.891849 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:24.891774 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-service-ca\") pod \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " Apr 22 14:20:24.891849 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:24.891827 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-console-config\") pod \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " Apr 22 14:20:24.892026 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:24.891862 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-trusted-ca-bundle\") pod \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " Apr 22 14:20:24.892026 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:24.891891 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-oauth-serving-cert\") pod \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " Apr 22 14:20:24.892026 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:24.891933 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j99wc\" (UniqueName: \"kubernetes.io/projected/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-kube-api-access-j99wc\") pod \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " Apr 22 14:20:24.892026 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:24.891959 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-console-oauth-config\") pod \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " Apr 22 14:20:24.892026 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:24.892010 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-console-serving-cert\") pod \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\" (UID: \"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba\") " Apr 22 14:20:24.892265 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:24.892241 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-service-ca" (OuterVolumeSpecName: "service-ca") pod "e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba" (UID: "e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:20:24.892324 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:24.892254 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-console-config" (OuterVolumeSpecName: "console-config") pod "e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba" (UID: "e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:20:24.892324 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:24.892283 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba" (UID: "e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:20:24.892407 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:24.892344 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba" (UID: "e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:20:24.894163 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:24.894138 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba" (UID: "e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:20:24.894267 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:24.894192 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-kube-api-access-j99wc" (OuterVolumeSpecName: "kube-api-access-j99wc") pod "e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba" (UID: "e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba"). InnerVolumeSpecName "kube-api-access-j99wc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:20:24.894267 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:24.894193 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba" (UID: "e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:20:24.993493 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:24.993467 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-console-config\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 22 14:20:24.993493 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:24.993488 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-trusted-ca-bundle\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 22 14:20:24.993618 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:24.993498 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-oauth-serving-cert\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 22 14:20:24.993618 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:24.993507 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j99wc\" (UniqueName: \"kubernetes.io/projected/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-kube-api-access-j99wc\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 22 14:20:24.993618 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:24.993516 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-console-oauth-config\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 22 14:20:24.993618 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:24.993525 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-console-serving-cert\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 22 14:20:24.993618 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:24.993533 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba-service-ca\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 22 14:20:25.430861 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:25.430835 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c54bcd88d-mhntt_e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba/console/0.log" Apr 22 14:20:25.436577 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:25.436555 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-c54bcd88d-mhntt_e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba/console/0.log" Apr 22 14:20:25.438886 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:25.438864 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2lphb_ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa/console-operator/2.log" Apr 22 14:20:25.441773 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:25.441736 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2lphb_ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa/console-operator/2.log" Apr 22 14:20:25.441857 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:25.441736 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j58wd_3403a015-2d45-42e8-bf6e-9a0bc6d91e99/ovn-acl-logging/0.log" Apr 22 14:20:25.445386 
ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:25.445367 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j58wd_3403a015-2d45-42e8-bf6e-9a0bc6d91e99/ovn-acl-logging/0.log" Apr 22 14:20:25.449230 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:25.449212 2575 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 14:20:25.479590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:25.479466 2575 generic.go:358] "Generic (PLEG): container finished" podID="e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba" containerID="b8a2d942d5ac5f90abcb375469180c121f0f977cfb96f6307eeb3cf9497f8f37" exitCode=2 Apr 22 14:20:25.479590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:25.479520 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-c54bcd88d-mhntt" Apr 22 14:20:25.479590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:25.479549 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c54bcd88d-mhntt" event={"ID":"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba","Type":"ContainerDied","Data":"b8a2d942d5ac5f90abcb375469180c121f0f977cfb96f6307eeb3cf9497f8f37"} Apr 22 14:20:25.479590 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:25.479583 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-c54bcd88d-mhntt" event={"ID":"e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba","Type":"ContainerDied","Data":"7e4cdcccf416505384dd0df32ecb50f4a18829c5289d5be50c4f3c88f62c3fb4"} Apr 22 14:20:25.494476 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:25.479598 2575 scope.go:117] "RemoveContainer" containerID="b8a2d942d5ac5f90abcb375469180c121f0f977cfb96f6307eeb3cf9497f8f37" Apr 22 14:20:25.494476 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:25.487569 2575 scope.go:117] "RemoveContainer" containerID="b8a2d942d5ac5f90abcb375469180c121f0f977cfb96f6307eeb3cf9497f8f37" Apr 22 14:20:25.494476 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:20:25.487848 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8a2d942d5ac5f90abcb375469180c121f0f977cfb96f6307eeb3cf9497f8f37\": container with ID starting with b8a2d942d5ac5f90abcb375469180c121f0f977cfb96f6307eeb3cf9497f8f37 not found: ID does not exist" containerID="b8a2d942d5ac5f90abcb375469180c121f0f977cfb96f6307eeb3cf9497f8f37" Apr 22 14:20:25.494476 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:25.487870 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8a2d942d5ac5f90abcb375469180c121f0f977cfb96f6307eeb3cf9497f8f37"} err="failed to get container status \"b8a2d942d5ac5f90abcb375469180c121f0f977cfb96f6307eeb3cf9497f8f37\": rpc error: code = NotFound desc = could not find container \"b8a2d942d5ac5f90abcb375469180c121f0f977cfb96f6307eeb3cf9497f8f37\": container with ID starting with b8a2d942d5ac5f90abcb375469180c121f0f977cfb96f6307eeb3cf9497f8f37 not found: ID does not exist" Apr 22 14:20:25.510860 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:25.510838 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-c54bcd88d-mhntt"] Apr 22 14:20:25.514413 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:25.514396 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-c54bcd88d-mhntt"] Apr 22 14:20:25.560430 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:25.560407 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
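The NotFound error above is the benign tail of a race: two "RemoveContainer" passes targeted the same container ID, so the second status lookup finds nothing left to delete. Treating a gRPC NotFound as success is the usual way to make such deletes idempotent; a sketch using the standard grpc status helpers (the function name is mine, not a kubelet API):

    package main

    import (
        "fmt"

        "google.golang.org/grpc/codes"
        "google.golang.org/grpc/status"
    )

    // ignoreNotFound treats rpc NotFound as success: if the container is
    // already gone, a remove or status call has nothing left to do.
    func ignoreNotFound(err error) error {
        if status.Code(err) == codes.NotFound {
            return nil
        }
        return err
    }

    func main() {
        err := status.Error(codes.NotFound, `could not find container "b8a2d942..."`)
        fmt.Println(ignoreNotFound(err)) // <nil>
    }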
podUID="e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba" path="/var/lib/kubelet/pods/e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba/volumes" Apr 22 14:20:27.903854 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:27.903817 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-hxhjd"] Apr 22 14:20:27.904220 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:27.904119 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba" containerName="console" Apr 22 14:20:27.904220 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:27.904132 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba" containerName="console" Apr 22 14:20:27.904220 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:27.904186 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e93cbaea-aa0a-40ba-aa49-0c76d3fdd6ba" containerName="console" Apr 22 14:20:27.907073 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:27.907057 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxhjd" Apr 22 14:20:27.913443 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:27.913386 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 14:20:27.923801 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:27.923777 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hxhjd"] Apr 22 14:20:28.016817 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:28.016781 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4d18f912-d276-4fce-b5f0-e7157304038e-original-pull-secret\") pod \"global-pull-secret-syncer-hxhjd\" (UID: \"4d18f912-d276-4fce-b5f0-e7157304038e\") " pod="kube-system/global-pull-secret-syncer-hxhjd" Apr 22 14:20:28.016924 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:28.016842 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4d18f912-d276-4fce-b5f0-e7157304038e-kubelet-config\") pod \"global-pull-secret-syncer-hxhjd\" (UID: \"4d18f912-d276-4fce-b5f0-e7157304038e\") " pod="kube-system/global-pull-secret-syncer-hxhjd" Apr 22 14:20:28.016968 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:28.016915 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4d18f912-d276-4fce-b5f0-e7157304038e-dbus\") pod \"global-pull-secret-syncer-hxhjd\" (UID: \"4d18f912-d276-4fce-b5f0-e7157304038e\") " pod="kube-system/global-pull-secret-syncer-hxhjd" Apr 22 14:20:28.117593 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:28.117569 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4d18f912-d276-4fce-b5f0-e7157304038e-dbus\") pod \"global-pull-secret-syncer-hxhjd\" (UID: \"4d18f912-d276-4fce-b5f0-e7157304038e\") " pod="kube-system/global-pull-secret-syncer-hxhjd" Apr 22 14:20:28.117684 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:28.117609 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4d18f912-d276-4fce-b5f0-e7157304038e-original-pull-secret\") pod \"global-pull-secret-syncer-hxhjd\" 
(UID: \"4d18f912-d276-4fce-b5f0-e7157304038e\") " pod="kube-system/global-pull-secret-syncer-hxhjd" Apr 22 14:20:28.117684 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:28.117650 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4d18f912-d276-4fce-b5f0-e7157304038e-kubelet-config\") pod \"global-pull-secret-syncer-hxhjd\" (UID: \"4d18f912-d276-4fce-b5f0-e7157304038e\") " pod="kube-system/global-pull-secret-syncer-hxhjd" Apr 22 14:20:28.117787 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:28.117719 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4d18f912-d276-4fce-b5f0-e7157304038e-kubelet-config\") pod \"global-pull-secret-syncer-hxhjd\" (UID: \"4d18f912-d276-4fce-b5f0-e7157304038e\") " pod="kube-system/global-pull-secret-syncer-hxhjd" Apr 22 14:20:28.117787 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:28.117781 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4d18f912-d276-4fce-b5f0-e7157304038e-dbus\") pod \"global-pull-secret-syncer-hxhjd\" (UID: \"4d18f912-d276-4fce-b5f0-e7157304038e\") " pod="kube-system/global-pull-secret-syncer-hxhjd" Apr 22 14:20:28.119857 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:28.119835 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4d18f912-d276-4fce-b5f0-e7157304038e-original-pull-secret\") pod \"global-pull-secret-syncer-hxhjd\" (UID: \"4d18f912-d276-4fce-b5f0-e7157304038e\") " pod="kube-system/global-pull-secret-syncer-hxhjd" Apr 22 14:20:28.215986 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:28.215934 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hxhjd" Apr 22 14:20:28.328230 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:28.328127 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hxhjd"] Apr 22 14:20:28.330634 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:20:28.330596 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4d18f912_d276_4fce_b5f0_e7157304038e.slice/crio-087ed1910b877e8c12c680fe8abaaa5ca9e717811d13278d0c92bed7ca7f522f WatchSource:0}: Error finding container 087ed1910b877e8c12c680fe8abaaa5ca9e717811d13278d0c92bed7ca7f522f: Status 404 returned error can't find the container with id 087ed1910b877e8c12c680fe8abaaa5ca9e717811d13278d0c92bed7ca7f522f Apr 22 14:20:28.332315 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:28.332298 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:20:28.488865 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:28.488805 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hxhjd" event={"ID":"4d18f912-d276-4fce-b5f0-e7157304038e","Type":"ContainerStarted","Data":"087ed1910b877e8c12c680fe8abaaa5ca9e717811d13278d0c92bed7ca7f522f"} Apr 22 14:20:32.503019 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:32.502979 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hxhjd" event={"ID":"4d18f912-d276-4fce-b5f0-e7157304038e","Type":"ContainerStarted","Data":"a5ca8e5ac33134eb82dd5934014eeec8f1b0d70de81c2a091f6c99f0601a4b41"} Apr 22 14:20:32.519801 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:20:32.519742 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-hxhjd" podStartSLOduration=1.558322939 podStartE2EDuration="5.519727358s" podCreationTimestamp="2026-04-22 14:20:27 +0000 UTC" firstStartedPulling="2026-04-22 14:20:28.332430852 +0000 UTC m=+303.263809733" lastFinishedPulling="2026-04-22 14:20:32.293835268 +0000 UTC m=+307.225214152" observedRunningTime="2026-04-22 14:20:32.518245256 +0000 UTC m=+307.449624186" watchObservedRunningTime="2026-04-22 14:20:32.519727358 +0000 UTC m=+307.451106254" Apr 22 14:21:29.478285 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:29.478243 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq"] Apr 22 14:21:29.483397 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:29.483375 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq" Apr 22 14:21:29.486532 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:29.486509 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-marketplace\"/\"default-dockercfg-9tk2l\"" Apr 22 14:21:29.487721 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:29.487700 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"kube-root-ca.crt\"" Apr 22 14:21:29.487829 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:29.487772 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-marketplace\"/\"openshift-service-ca.crt\"" Apr 22 14:21:29.496385 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:29.496361 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq"] Apr 22 14:21:29.546253 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:29.546230 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/80f1a0aa-1449-4d0c-9e29-a2372be638af-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq\" (UID: \"80f1a0aa-1449-4d0c-9e29-a2372be638af\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq" Apr 22 14:21:29.546370 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:29.546275 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4n6hb\" (UniqueName: \"kubernetes.io/projected/80f1a0aa-1449-4d0c-9e29-a2372be638af-kube-api-access-4n6hb\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq\" (UID: \"80f1a0aa-1449-4d0c-9e29-a2372be638af\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq" Apr 22 14:21:29.546370 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:29.546326 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/80f1a0aa-1449-4d0c-9e29-a2372be638af-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq\" (UID: \"80f1a0aa-1449-4d0c-9e29-a2372be638af\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq" Apr 22 14:21:29.647362 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:29.647335 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/80f1a0aa-1449-4d0c-9e29-a2372be638af-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq\" (UID: \"80f1a0aa-1449-4d0c-9e29-a2372be638af\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq" Apr 22 14:21:29.647492 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:29.647385 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4n6hb\" (UniqueName: \"kubernetes.io/projected/80f1a0aa-1449-4d0c-9e29-a2372be638af-kube-api-access-4n6hb\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq\" (UID: \"80f1a0aa-1449-4d0c-9e29-a2372be638af\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq" Apr 22 14:21:29.647554 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:29.647504 2575 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/80f1a0aa-1449-4d0c-9e29-a2372be638af-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq\" (UID: \"80f1a0aa-1449-4d0c-9e29-a2372be638af\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq" Apr 22 14:21:29.647767 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:29.647735 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/80f1a0aa-1449-4d0c-9e29-a2372be638af-bundle\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq\" (UID: \"80f1a0aa-1449-4d0c-9e29-a2372be638af\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq" Apr 22 14:21:29.647834 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:29.647788 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/80f1a0aa-1449-4d0c-9e29-a2372be638af-util\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq\" (UID: \"80f1a0aa-1449-4d0c-9e29-a2372be638af\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq" Apr 22 14:21:29.657516 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:29.657490 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4n6hb\" (UniqueName: \"kubernetes.io/projected/80f1a0aa-1449-4d0c-9e29-a2372be638af-kube-api-access-4n6hb\") pod \"59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq\" (UID: \"80f1a0aa-1449-4d0c-9e29-a2372be638af\") " pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq" Apr 22 14:21:29.792338 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:29.792288 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq" Apr 22 14:21:29.910939 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:29.910876 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq"] Apr 22 14:21:29.913641 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:21:29.913615 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80f1a0aa_1449_4d0c_9e29_a2372be638af.slice/crio-2595d6173389235a2c574ed11c78a95c28d061a64bd6a1de42037ec1ae6f7efe WatchSource:0}: Error finding container 2595d6173389235a2c574ed11c78a95c28d061a64bd6a1de42037ec1ae6f7efe: Status 404 returned error can't find the container with id 2595d6173389235a2c574ed11c78a95c28d061a64bd6a1de42037ec1ae6f7efe Apr 22 14:21:30.661057 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:30.661020 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq" event={"ID":"80f1a0aa-1449-4d0c-9e29-a2372be638af","Type":"ContainerStarted","Data":"2595d6173389235a2c574ed11c78a95c28d061a64bd6a1de42037ec1ae6f7efe"} Apr 22 14:21:35.677672 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:35.677632 2575 generic.go:358] "Generic (PLEG): container finished" podID="80f1a0aa-1449-4d0c-9e29-a2372be638af" containerID="0cf2d398c74cd27acea65b5fa81933061bb9aa722d64a7efa5e9e8b31a2dea16" exitCode=0 Apr 22 14:21:35.678041 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:35.677717 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq" event={"ID":"80f1a0aa-1449-4d0c-9e29-a2372be638af","Type":"ContainerDied","Data":"0cf2d398c74cd27acea65b5fa81933061bb9aa722d64a7efa5e9e8b31a2dea16"} Apr 22 14:21:37.684650 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:37.684618 2575 generic.go:358] "Generic (PLEG): container finished" podID="80f1a0aa-1449-4d0c-9e29-a2372be638af" containerID="73b34f1948a75c29040d9d776d813b6154183fdd7f778e8af58610c67acf8165" exitCode=0 Apr 22 14:21:37.684987 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:37.684700 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq" event={"ID":"80f1a0aa-1449-4d0c-9e29-a2372be638af","Type":"ContainerDied","Data":"73b34f1948a75c29040d9d776d813b6154183fdd7f778e8af58610c67acf8165"} Apr 22 14:21:44.708138 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:44.708109 2575 generic.go:358] "Generic (PLEG): container finished" podID="80f1a0aa-1449-4d0c-9e29-a2372be638af" containerID="9a18a7f36f98a56e507145ec3e427f7ed7a7dace765d88562cb49bc42530549e" exitCode=0 Apr 22 14:21:44.708138 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:44.708142 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq" event={"ID":"80f1a0aa-1449-4d0c-9e29-a2372be638af","Type":"ContainerDied","Data":"9a18a7f36f98a56e507145ec3e427f7ed7a7dace765d88562cb49bc42530549e"} Apr 22 14:21:45.834300 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:45.834279 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq" Apr 22 14:21:45.983789 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:45.983694 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/80f1a0aa-1449-4d0c-9e29-a2372be638af-bundle\") pod \"80f1a0aa-1449-4d0c-9e29-a2372be638af\" (UID: \"80f1a0aa-1449-4d0c-9e29-a2372be638af\") " Apr 22 14:21:45.983789 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:45.983768 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4n6hb\" (UniqueName: \"kubernetes.io/projected/80f1a0aa-1449-4d0c-9e29-a2372be638af-kube-api-access-4n6hb\") pod \"80f1a0aa-1449-4d0c-9e29-a2372be638af\" (UID: \"80f1a0aa-1449-4d0c-9e29-a2372be638af\") " Apr 22 14:21:45.984014 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:45.983798 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/80f1a0aa-1449-4d0c-9e29-a2372be638af-util\") pod \"80f1a0aa-1449-4d0c-9e29-a2372be638af\" (UID: \"80f1a0aa-1449-4d0c-9e29-a2372be638af\") " Apr 22 14:21:45.984275 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:45.984252 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80f1a0aa-1449-4d0c-9e29-a2372be638af-bundle" (OuterVolumeSpecName: "bundle") pod "80f1a0aa-1449-4d0c-9e29-a2372be638af" (UID: "80f1a0aa-1449-4d0c-9e29-a2372be638af"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:21:45.985947 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:45.985924 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80f1a0aa-1449-4d0c-9e29-a2372be638af-kube-api-access-4n6hb" (OuterVolumeSpecName: "kube-api-access-4n6hb") pod "80f1a0aa-1449-4d0c-9e29-a2372be638af" (UID: "80f1a0aa-1449-4d0c-9e29-a2372be638af"). InnerVolumeSpecName "kube-api-access-4n6hb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:21:45.988400 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:45.988382 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80f1a0aa-1449-4d0c-9e29-a2372be638af-util" (OuterVolumeSpecName: "util") pod "80f1a0aa-1449-4d0c-9e29-a2372be638af" (UID: "80f1a0aa-1449-4d0c-9e29-a2372be638af"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:21:46.084900 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:46.084876 2575 reconciler_common.go:299] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/80f1a0aa-1449-4d0c-9e29-a2372be638af-util\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 22 14:21:46.084900 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:46.084897 2575 reconciler_common.go:299] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/80f1a0aa-1449-4d0c-9e29-a2372be638af-bundle\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 22 14:21:46.085029 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:46.084906 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4n6hb\" (UniqueName: \"kubernetes.io/projected/80f1a0aa-1449-4d0c-9e29-a2372be638af-kube-api-access-4n6hb\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 22 14:21:46.719452 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:46.719417 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq" event={"ID":"80f1a0aa-1449-4d0c-9e29-a2372be638af","Type":"ContainerDied","Data":"2595d6173389235a2c574ed11c78a95c28d061a64bd6a1de42037ec1ae6f7efe"} Apr 22 14:21:46.719452 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:46.719450 2575 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2595d6173389235a2c574ed11c78a95c28d061a64bd6a1de42037ec1ae6f7efe" Apr 22 14:21:46.719452 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:46.719450 2575 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/59039e319e11338a40c6b6f1054d265f40bb50ceac6068d5c59955d29cfbssq" Apr 22 14:21:51.844641 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:51.844610 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2wkzb"] Apr 22 14:21:51.845014 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:51.844896 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80f1a0aa-1449-4d0c-9e29-a2372be638af" containerName="pull" Apr 22 14:21:51.845014 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:51.844907 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="80f1a0aa-1449-4d0c-9e29-a2372be638af" containerName="pull" Apr 22 14:21:51.845014 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:51.844915 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80f1a0aa-1449-4d0c-9e29-a2372be638af" containerName="extract" Apr 22 14:21:51.845014 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:51.844921 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="80f1a0aa-1449-4d0c-9e29-a2372be638af" containerName="extract" Apr 22 14:21:51.845014 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:51.844936 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80f1a0aa-1449-4d0c-9e29-a2372be638af" containerName="util" Apr 22 14:21:51.845014 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:51.844941 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="80f1a0aa-1449-4d0c-9e29-a2372be638af" containerName="util" Apr 22 14:21:51.845014 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:51.844993 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="80f1a0aa-1449-4d0c-9e29-a2372be638af" containerName="extract" Apr 22 14:21:51.851951 
Apr 22 14:21:51.851951 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:51.851928 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2wkzb"
Apr 22 14:21:51.856564 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:51.856545 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\""
Apr 22 14:21:51.856950 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:51.856928 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\""
Apr 22 14:21:51.857055 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:51.856933 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\""
Apr 22 14:21:51.857214 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:51.857198 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-p6nf2\""
Apr 22 14:21:51.867646 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:51.867626 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2wkzb"]
Apr 22 14:21:52.026581 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:52.026554 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nm96\" (UniqueName: \"kubernetes.io/projected/44cd70f6-4b5a-4342-9830-d208b6fc0863-kube-api-access-8nm96\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-2wkzb\" (UID: \"44cd70f6-4b5a-4342-9830-d208b6fc0863\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2wkzb"
Apr 22 14:21:52.026713 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:52.026594 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/44cd70f6-4b5a-4342-9830-d208b6fc0863-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-2wkzb\" (UID: \"44cd70f6-4b5a-4342-9830-d208b6fc0863\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2wkzb"
Apr 22 14:21:52.127500 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:52.127443 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8nm96\" (UniqueName: \"kubernetes.io/projected/44cd70f6-4b5a-4342-9830-d208b6fc0863-kube-api-access-8nm96\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-2wkzb\" (UID: \"44cd70f6-4b5a-4342-9830-d208b6fc0863\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2wkzb"
Apr 22 14:21:52.127500 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:52.127478 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/44cd70f6-4b5a-4342-9830-d208b6fc0863-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-2wkzb\" (UID: \"44cd70f6-4b5a-4342-9830-d208b6fc0863\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2wkzb"
Apr 22 14:21:52.129782 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:52.129745 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/44cd70f6-4b5a-4342-9830-d208b6fc0863-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-2wkzb\" (UID: \"44cd70f6-4b5a-4342-9830-d208b6fc0863\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2wkzb"
Apr 22 14:21:52.158013 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:52.157996 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nm96\" (UniqueName: \"kubernetes.io/projected/44cd70f6-4b5a-4342-9830-d208b6fc0863-kube-api-access-8nm96\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-2wkzb\" (UID: \"44cd70f6-4b5a-4342-9830-d208b6fc0863\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2wkzb"
Apr 22 14:21:52.161822 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:52.161806 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2wkzb"
Apr 22 14:21:52.290575 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:52.290552 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2wkzb"]
Apr 22 14:21:52.292818 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:21:52.292791 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod44cd70f6_4b5a_4342_9830_d208b6fc0863.slice/crio-f7ad481d2f44dc625e62aad771320132740700ece12f5a0859de245ce01ec6c1 WatchSource:0}: Error finding container f7ad481d2f44dc625e62aad771320132740700ece12f5a0859de245ce01ec6c1: Status 404 returned error can't find the container with id f7ad481d2f44dc625e62aad771320132740700ece12f5a0859de245ce01ec6c1
Apr 22 14:21:52.737399 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:52.737364 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2wkzb" event={"ID":"44cd70f6-4b5a-4342-9830-d208b6fc0863","Type":"ContainerStarted","Data":"f7ad481d2f44dc625e62aad771320132740700ece12f5a0859de245ce01ec6c1"}
Apr 22 14:21:57.757198 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:57.757169 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2wkzb" event={"ID":"44cd70f6-4b5a-4342-9830-d208b6fc0863","Type":"ContainerStarted","Data":"aaa69af8566816a8797e53c315f19c8822480671e72786ad6be5ffbc439e7ce1"}
Apr 22 14:21:57.757577 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:57.757354 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2wkzb"
Apr 22 14:21:57.784884 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:57.784839 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2wkzb" podStartSLOduration=1.851859938 podStartE2EDuration="6.784828076s" podCreationTimestamp="2026-04-22 14:21:51 +0000 UTC" firstStartedPulling="2026-04-22 14:21:52.294475732 +0000 UTC m=+387.225854613" lastFinishedPulling="2026-04-22 14:21:57.227443857 +0000 UTC m=+392.158822751" observedRunningTime="2026-04-22 14:21:57.781376026 +0000 UTC m=+392.712754930" watchObservedRunningTime="2026-04-22 14:21:57.784828076 +0000 UTC m=+392.716206979"
Apr 22 14:21:57.852043 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:57.852016 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-w7g6l"]
Apr 22 14:21:57.855124 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:57.855110 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-w7g6l"
Apr 22 14:21:57.860897 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:57.860874 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 22 14:21:57.861052 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:57.860881 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-mgmj2\""
Apr 22 14:21:57.861120 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:57.860881 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 22 14:21:57.868017 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:57.867998 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-w7g6l"]
Apr 22 14:21:57.977580 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:57.977554 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/2ead4b59-a2a0-4f81-bab0-bf4469320e1b-cabundle0\") pod \"keda-operator-ffbb595cb-w7g6l\" (UID: \"2ead4b59-a2a0-4f81-bab0-bf4469320e1b\") " pod="openshift-keda/keda-operator-ffbb595cb-w7g6l"
Apr 22 14:21:57.977683 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:57.977584 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2ead4b59-a2a0-4f81-bab0-bf4469320e1b-certificates\") pod \"keda-operator-ffbb595cb-w7g6l\" (UID: \"2ead4b59-a2a0-4f81-bab0-bf4469320e1b\") " pod="openshift-keda/keda-operator-ffbb595cb-w7g6l"
Apr 22 14:21:57.977683 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:57.977618 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpdms\" (UniqueName: \"kubernetes.io/projected/2ead4b59-a2a0-4f81-bab0-bf4469320e1b-kube-api-access-vpdms\") pod \"keda-operator-ffbb595cb-w7g6l\" (UID: \"2ead4b59-a2a0-4f81-bab0-bf4469320e1b\") " pod="openshift-keda/keda-operator-ffbb595cb-w7g6l"
Apr 22 14:21:58.078361 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.078300 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vpdms\" (UniqueName: \"kubernetes.io/projected/2ead4b59-a2a0-4f81-bab0-bf4469320e1b-kube-api-access-vpdms\") pod \"keda-operator-ffbb595cb-w7g6l\" (UID: \"2ead4b59-a2a0-4f81-bab0-bf4469320e1b\") " pod="openshift-keda/keda-operator-ffbb595cb-w7g6l"
Apr 22 14:21:58.078451 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.078376 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/2ead4b59-a2a0-4f81-bab0-bf4469320e1b-cabundle0\") pod \"keda-operator-ffbb595cb-w7g6l\" (UID: \"2ead4b59-a2a0-4f81-bab0-bf4469320e1b\") " pod="openshift-keda/keda-operator-ffbb595cb-w7g6l"
Apr 22 14:21:58.078451 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.078402 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2ead4b59-a2a0-4f81-bab0-bf4469320e1b-certificates\") pod \"keda-operator-ffbb595cb-w7g6l\" (UID: \"2ead4b59-a2a0-4f81-bab0-bf4469320e1b\") " pod="openshift-keda/keda-operator-ffbb595cb-w7g6l"
Apr 22 14:21:58.078599 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:21:58.078582 2575 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found
Apr 22 14:21:58.078634 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:21:58.078604 2575 secret.go:281] references non-existent secret key: ca.crt
Apr 22 14:21:58.078634 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:21:58.078614 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 14:21:58.078634 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:21:58.078629 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-w7g6l: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 22 14:21:58.078721 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:21:58.078689 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2ead4b59-a2a0-4f81-bab0-bf4469320e1b-certificates podName:2ead4b59-a2a0-4f81-bab0-bf4469320e1b nodeName:}" failed. No retries permitted until 2026-04-22 14:21:58.578675363 +0000 UTC m=+393.510054244 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2ead4b59-a2a0-4f81-bab0-bf4469320e1b-certificates") pod "keda-operator-ffbb595cb-w7g6l" (UID: "2ead4b59-a2a0-4f81-bab0-bf4469320e1b") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 22 14:21:58.078995 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.078979 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/2ead4b59-a2a0-4f81-bab0-bf4469320e1b-cabundle0\") pod \"keda-operator-ffbb595cb-w7g6l\" (UID: \"2ead4b59-a2a0-4f81-bab0-bf4469320e1b\") " pod="openshift-keda/keda-operator-ffbb595cb-w7g6l"
Apr 22 14:21:58.088915 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.088896 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpdms\" (UniqueName: \"kubernetes.io/projected/2ead4b59-a2a0-4f81-bab0-bf4469320e1b-kube-api-access-vpdms\") pod \"keda-operator-ffbb595cb-w7g6l\" (UID: \"2ead4b59-a2a0-4f81-bab0-bf4469320e1b\") " pod="openshift-keda/keda-operator-ffbb595cb-w7g6l"
Apr 22 14:21:58.208433 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.208405 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-fd9dj"]
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fd9dj" Apr 22 14:21:58.214394 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.214377 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 22 14:21:58.224246 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.224227 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-fd9dj"] Apr 22 14:21:58.380778 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.380697 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/d1a81135-d52c-4f9d-8384-3a24897244ca-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-fd9dj\" (UID: \"d1a81135-d52c-4f9d-8384-3a24897244ca\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fd9dj" Apr 22 14:21:58.380778 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.380727 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztgfh\" (UniqueName: \"kubernetes.io/projected/d1a81135-d52c-4f9d-8384-3a24897244ca-kube-api-access-ztgfh\") pod \"keda-metrics-apiserver-7c9f485588-fd9dj\" (UID: \"d1a81135-d52c-4f9d-8384-3a24897244ca\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fd9dj" Apr 22 14:21:58.380929 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.380854 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d1a81135-d52c-4f9d-8384-3a24897244ca-certificates\") pod \"keda-metrics-apiserver-7c9f485588-fd9dj\" (UID: \"d1a81135-d52c-4f9d-8384-3a24897244ca\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fd9dj" Apr 22 14:21:58.481678 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.481653 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d1a81135-d52c-4f9d-8384-3a24897244ca-certificates\") pod \"keda-metrics-apiserver-7c9f485588-fd9dj\" (UID: \"d1a81135-d52c-4f9d-8384-3a24897244ca\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fd9dj" Apr 22 14:21:58.481839 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.481695 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/d1a81135-d52c-4f9d-8384-3a24897244ca-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-fd9dj\" (UID: \"d1a81135-d52c-4f9d-8384-3a24897244ca\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fd9dj" Apr 22 14:21:58.481839 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.481714 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztgfh\" (UniqueName: \"kubernetes.io/projected/d1a81135-d52c-4f9d-8384-3a24897244ca-kube-api-access-ztgfh\") pod \"keda-metrics-apiserver-7c9f485588-fd9dj\" (UID: \"d1a81135-d52c-4f9d-8384-3a24897244ca\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fd9dj" Apr 22 14:21:58.481839 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:21:58.481819 2575 secret.go:281] references non-existent secret key: tls.crt Apr 22 14:21:58.481839 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:21:58.481838 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 22 14:21:58.482006 ip-10-0-130-98 
kubenswrapper[2575]: E0422 14:21:58.481858 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-fd9dj: references non-existent secret key: tls.crt Apr 22 14:21:58.482006 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:21:58.481925 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/d1a81135-d52c-4f9d-8384-3a24897244ca-certificates podName:d1a81135-d52c-4f9d-8384-3a24897244ca nodeName:}" failed. No retries permitted until 2026-04-22 14:21:58.981909357 +0000 UTC m=+393.913288239 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/d1a81135-d52c-4f9d-8384-3a24897244ca-certificates") pod "keda-metrics-apiserver-7c9f485588-fd9dj" (UID: "d1a81135-d52c-4f9d-8384-3a24897244ca") : references non-existent secret key: tls.crt Apr 22 14:21:58.482143 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.482126 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/d1a81135-d52c-4f9d-8384-3a24897244ca-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-fd9dj\" (UID: \"d1a81135-d52c-4f9d-8384-3a24897244ca\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fd9dj" Apr 22 14:21:58.496646 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.496620 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-bg5vw"] Apr 22 14:21:58.499815 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.499797 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-bg5vw" Apr 22 14:21:58.500184 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.500164 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztgfh\" (UniqueName: \"kubernetes.io/projected/d1a81135-d52c-4f9d-8384-3a24897244ca-kube-api-access-ztgfh\") pod \"keda-metrics-apiserver-7c9f485588-fd9dj\" (UID: \"d1a81135-d52c-4f9d-8384-3a24897244ca\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fd9dj" Apr 22 14:21:58.502534 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.502518 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 22 14:21:58.512138 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.512116 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-bg5vw"] Apr 22 14:21:58.583046 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.583021 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2ead4b59-a2a0-4f81-bab0-bf4469320e1b-certificates\") pod \"keda-operator-ffbb595cb-w7g6l\" (UID: \"2ead4b59-a2a0-4f81-bab0-bf4469320e1b\") " pod="openshift-keda/keda-operator-ffbb595cb-w7g6l" Apr 22 14:21:58.583180 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:21:58.583123 2575 secret.go:281] references non-existent secret key: ca.crt Apr 22 14:21:58.583180 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:21:58.583134 2575 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 22 14:21:58.583180 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:21:58.583141 2575 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-w7g6l: references 
non-existent secret key: ca.crt Apr 22 14:21:58.583274 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:21:58.583187 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2ead4b59-a2a0-4f81-bab0-bf4469320e1b-certificates podName:2ead4b59-a2a0-4f81-bab0-bf4469320e1b nodeName:}" failed. No retries permitted until 2026-04-22 14:21:59.583171942 +0000 UTC m=+394.514550827 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/2ead4b59-a2a0-4f81-bab0-bf4469320e1b-certificates") pod "keda-operator-ffbb595cb-w7g6l" (UID: "2ead4b59-a2a0-4f81-bab0-bf4469320e1b") : references non-existent secret key: ca.crt Apr 22 14:21:58.683817 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.683789 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mwp5v\" (UniqueName: \"kubernetes.io/projected/fd1c090c-26f3-4404-ad3a-4ef31bd8a367-kube-api-access-mwp5v\") pod \"keda-admission-cf49989db-bg5vw\" (UID: \"fd1c090c-26f3-4404-ad3a-4ef31bd8a367\") " pod="openshift-keda/keda-admission-cf49989db-bg5vw" Apr 22 14:21:58.683985 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.683827 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fd1c090c-26f3-4404-ad3a-4ef31bd8a367-certificates\") pod \"keda-admission-cf49989db-bg5vw\" (UID: \"fd1c090c-26f3-4404-ad3a-4ef31bd8a367\") " pod="openshift-keda/keda-admission-cf49989db-bg5vw" Apr 22 14:21:58.784851 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.784825 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fd1c090c-26f3-4404-ad3a-4ef31bd8a367-certificates\") pod \"keda-admission-cf49989db-bg5vw\" (UID: \"fd1c090c-26f3-4404-ad3a-4ef31bd8a367\") " pod="openshift-keda/keda-admission-cf49989db-bg5vw" Apr 22 14:21:58.785295 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.784958 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mwp5v\" (UniqueName: \"kubernetes.io/projected/fd1c090c-26f3-4404-ad3a-4ef31bd8a367-kube-api-access-mwp5v\") pod \"keda-admission-cf49989db-bg5vw\" (UID: \"fd1c090c-26f3-4404-ad3a-4ef31bd8a367\") " pod="openshift-keda/keda-admission-cf49989db-bg5vw" Apr 22 14:21:58.787562 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.787542 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/fd1c090c-26f3-4404-ad3a-4ef31bd8a367-certificates\") pod \"keda-admission-cf49989db-bg5vw\" (UID: \"fd1c090c-26f3-4404-ad3a-4ef31bd8a367\") " pod="openshift-keda/keda-admission-cf49989db-bg5vw" Apr 22 14:21:58.810175 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.810152 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mwp5v\" (UniqueName: \"kubernetes.io/projected/fd1c090c-26f3-4404-ad3a-4ef31bd8a367-kube-api-access-mwp5v\") pod \"keda-admission-cf49989db-bg5vw\" (UID: \"fd1c090c-26f3-4404-ad3a-4ef31bd8a367\") " pod="openshift-keda/keda-admission-cf49989db-bg5vw" Apr 22 14:21:58.817203 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.817184 2575 util.go:30] "No sandbox for pod can be found. 
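
The "certificates" failures above come from a single projected volume with more than one secret source: "keda-operator-certs" did not exist yet, and "kedaorg-certs" existed but was still missing the ca.crt key, so MountVolume.SetUp fails as a whole and is requeued. A minimal sketch of what such a volume looks like in a pod spec, built with the k8s.io/api types; the secret names and the key mapping are read off the log, but the operator's actual manifest is not shown here, so treat the exact shape as an assumption:

```go
package main

import (
	"fmt"

	corev1 "k8s.io/api/core/v1"
)

func main() {
	// A projected volume drawing from two secrets; SetUp succeeds only
	// when every source (and every referenced key) exists.
	vol := corev1.Volume{
		Name: "certificates",
		VolumeSource: corev1.VolumeSource{
			Projected: &corev1.ProjectedVolumeSource{
				Sources: []corev1.VolumeProjection{
					{Secret: &corev1.SecretProjection{
						LocalObjectReference: corev1.LocalObjectReference{Name: "keda-operator-certs"},
					}},
					{Secret: &corev1.SecretProjection{
						LocalObjectReference: corev1.LocalObjectReference{Name: "kedaorg-certs"},
						Items: []corev1.KeyToPath{{Key: "ca.crt", Path: "ca.crt"}},
					}},
				},
			},
		},
	}
	fmt.Printf("%+v\n", vol)
}
```

Once the operator populates the secrets, the same retry path succeeds, which is what the later "MountVolume.SetUp succeeded for volume \"certificates\"" entries show.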
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-bg5vw" Apr 22 14:21:58.968190 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.968115 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-bg5vw"] Apr 22 14:21:58.971262 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:21:58.971237 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd1c090c_26f3_4404_ad3a_4ef31bd8a367.slice/crio-b67cd12fff26ab91532577a9c55d88f2c934f1123547c0449509c4a6a0fd22f5 WatchSource:0}: Error finding container b67cd12fff26ab91532577a9c55d88f2c934f1123547c0449509c4a6a0fd22f5: Status 404 returned error can't find the container with id b67cd12fff26ab91532577a9c55d88f2c934f1123547c0449509c4a6a0fd22f5 Apr 22 14:21:58.987421 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.987394 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d1a81135-d52c-4f9d-8384-3a24897244ca-certificates\") pod \"keda-metrics-apiserver-7c9f485588-fd9dj\" (UID: \"d1a81135-d52c-4f9d-8384-3a24897244ca\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fd9dj" Apr 22 14:21:58.989705 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:58.989682 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/d1a81135-d52c-4f9d-8384-3a24897244ca-certificates\") pod \"keda-metrics-apiserver-7c9f485588-fd9dj\" (UID: \"d1a81135-d52c-4f9d-8384-3a24897244ca\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fd9dj" Apr 22 14:21:59.121879 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:59.121846 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fd9dj" Apr 22 14:21:59.249272 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:59.249246 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-fd9dj"] Apr 22 14:21:59.251384 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:21:59.251357 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1a81135_d52c_4f9d_8384_3a24897244ca.slice/crio-786cafe9ce76d1d7c01e7f1aa3318878cc8e4545bd0726cc7348b99b7a807cad WatchSource:0}: Error finding container 786cafe9ce76d1d7c01e7f1aa3318878cc8e4545bd0726cc7348b99b7a807cad: Status 404 returned error can't find the container with id 786cafe9ce76d1d7c01e7f1aa3318878cc8e4545bd0726cc7348b99b7a807cad Apr 22 14:21:59.591888 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:59.591566 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2ead4b59-a2a0-4f81-bab0-bf4469320e1b-certificates\") pod \"keda-operator-ffbb595cb-w7g6l\" (UID: \"2ead4b59-a2a0-4f81-bab0-bf4469320e1b\") " pod="openshift-keda/keda-operator-ffbb595cb-w7g6l" Apr 22 14:21:59.594347 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:59.594324 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/2ead4b59-a2a0-4f81-bab0-bf4469320e1b-certificates\") pod \"keda-operator-ffbb595cb-w7g6l\" (UID: \"2ead4b59-a2a0-4f81-bab0-bf4469320e1b\") " pod="openshift-keda/keda-operator-ffbb595cb-w7g6l" Apr 22 14:21:59.671139 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:59.671110 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-w7g6l" Apr 22 14:21:59.765437 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:59.765393 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fd9dj" event={"ID":"d1a81135-d52c-4f9d-8384-3a24897244ca","Type":"ContainerStarted","Data":"786cafe9ce76d1d7c01e7f1aa3318878cc8e4545bd0726cc7348b99b7a807cad"} Apr 22 14:21:59.766447 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:59.766405 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-bg5vw" event={"ID":"fd1c090c-26f3-4404-ad3a-4ef31bd8a367","Type":"ContainerStarted","Data":"b67cd12fff26ab91532577a9c55d88f2c934f1123547c0449509c4a6a0fd22f5"} Apr 22 14:21:59.830914 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:21:59.830879 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-w7g6l"] Apr 22 14:21:59.833703 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:21:59.833671 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ead4b59_a2a0_4f81_bab0_bf4469320e1b.slice/crio-48f11042e2aaa0ca5e768b5958a9aa41fb669b3b9574b9df9c98a41af17b7470 WatchSource:0}: Error finding container 48f11042e2aaa0ca5e768b5958a9aa41fb669b3b9574b9df9c98a41af17b7470: Status 404 returned error can't find the container with id 48f11042e2aaa0ca5e768b5958a9aa41fb669b3b9574b9df9c98a41af17b7470 Apr 22 14:22:00.772197 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:22:00.772159 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-w7g6l" event={"ID":"2ead4b59-a2a0-4f81-bab0-bf4469320e1b","Type":"ContainerStarted","Data":"48f11042e2aaa0ca5e768b5958a9aa41fb669b3b9574b9df9c98a41af17b7470"} Apr 22 14:22:00.774282 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:22:00.774249 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-bg5vw" event={"ID":"fd1c090c-26f3-4404-ad3a-4ef31bd8a367","Type":"ContainerStarted","Data":"b03e46a673ad4475241b93294aaf9dacbbbb806e4fb8a151f965a220527b025e"} Apr 22 14:22:00.774403 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:22:00.774380 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-bg5vw" Apr 22 14:22:00.811061 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:22:00.811018 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-bg5vw" podStartSLOduration=1.53190999 podStartE2EDuration="2.811004733s" podCreationTimestamp="2026-04-22 14:21:58 +0000 UTC" firstStartedPulling="2026-04-22 14:21:58.972583942 +0000 UTC m=+393.903962830" lastFinishedPulling="2026-04-22 14:22:00.251678692 +0000 UTC m=+395.183057573" observedRunningTime="2026-04-22 14:22:00.81019565 +0000 UTC m=+395.741574551" watchObservedRunningTime="2026-04-22 14:22:00.811004733 +0000 UTC m=+395.742383636" Apr 22 14:22:02.784430 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:22:02.784393 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fd9dj" event={"ID":"d1a81135-d52c-4f9d-8384-3a24897244ca","Type":"ContainerStarted","Data":"a1630e60ae548441c3e9a4e05f6866f323b314e8301d61387b1fba9651745819"} Apr 22 14:22:02.784900 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:22:02.784503 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fd9dj" Apr 22 14:22:02.807134 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:22:02.807073 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fd9dj" podStartSLOduration=1.9545186 podStartE2EDuration="4.807055922s" podCreationTimestamp="2026-04-22 14:21:58 +0000 UTC" firstStartedPulling="2026-04-22 14:21:59.252735533 +0000 UTC m=+394.184114416" lastFinishedPulling="2026-04-22 14:22:02.105272834 +0000 UTC m=+397.036651738" observedRunningTime="2026-04-22 14:22:02.80677506 +0000 UTC m=+397.738153964" watchObservedRunningTime="2026-04-22 14:22:02.807055922 +0000 UTC m=+397.738434826" Apr 22 14:22:03.789075 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:22:03.789038 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-w7g6l" event={"ID":"2ead4b59-a2a0-4f81-bab0-bf4469320e1b","Type":"ContainerStarted","Data":"c118f0c457ee4730b74296e153a14a1ffcdef8542b46f7d523b703450f22ecf8"} Apr 22 14:22:03.789486 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:22:03.789285 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-w7g6l" Apr 22 14:22:03.809145 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:22:03.809089 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-w7g6l" podStartSLOduration=3.340597661 podStartE2EDuration="6.809073894s" podCreationTimestamp="2026-04-22 14:21:57 +0000 UTC" firstStartedPulling="2026-04-22 14:21:59.835412193 +0000 UTC m=+394.766791073" lastFinishedPulling="2026-04-22 14:22:03.303888408 +0000 UTC m=+398.235267306" observedRunningTime="2026-04-22 14:22:03.806880804 +0000 UTC m=+398.738259718" watchObservedRunningTime="2026-04-22 14:22:03.809073894 +0000 UTC m=+398.740452796" Apr 22 14:22:13.793254 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:22:13.793224 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-fd9dj" Apr 22 14:22:18.762900 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:22:18.762874 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-2wkzb" Apr 22 14:22:21.781023 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:22:21.780990 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-bg5vw" Apr 22 14:22:24.794400 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:22:24.794368 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-w7g6l" Apr 22 14:23:11.859158 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:11.859122 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-dk7nv"] Apr 22 14:23:11.862317 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:11.862295 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-dk7nv" Apr 22 14:23:11.868291 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:11.868256 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-8c95w"] Apr 22 14:23:11.871209 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:11.871184 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 22 14:23:11.871335 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:11.871183 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 22 14:23:11.871696 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:11.871679 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8c95w" Apr 22 14:23:11.872094 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:11.872077 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-5ns6h\"" Apr 22 14:23:11.872509 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:11.872493 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 22 14:23:11.874708 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:11.874690 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-2hcgm\"" Apr 22 14:23:11.874944 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:11.874922 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 22 14:23:11.886422 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:11.886403 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-dk7nv"] Apr 22 14:23:11.906875 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:11.906853 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-8c95w"] Apr 22 14:23:11.959948 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:11.959922 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv798\" (UniqueName: \"kubernetes.io/projected/d9e7df5a-959a-480f-a726-48fec9e85ab0-kube-api-access-mv798\") pod \"llmisvc-controller-manager-68cc5db7c4-8c95w\" (UID: \"d9e7df5a-959a-480f-a726-48fec9e85ab0\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8c95w" Apr 22 14:23:11.960059 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:11.959967 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dzrx\" (UniqueName: \"kubernetes.io/projected/5573b29a-5135-48e8-94b9-90474b5003b9-kube-api-access-2dzrx\") pod \"kserve-controller-manager-66cf78b85b-dk7nv\" (UID: \"5573b29a-5135-48e8-94b9-90474b5003b9\") " pod="kserve/kserve-controller-manager-66cf78b85b-dk7nv" Apr 22 14:23:11.960059 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:11.959997 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5573b29a-5135-48e8-94b9-90474b5003b9-cert\") pod \"kserve-controller-manager-66cf78b85b-dk7nv\" (UID: \"5573b29a-5135-48e8-94b9-90474b5003b9\") " pod="kserve/kserve-controller-manager-66cf78b85b-dk7nv" Apr 22 14:23:11.960059 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:11.960039 
2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d9e7df5a-959a-480f-a726-48fec9e85ab0-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-8c95w\" (UID: \"d9e7df5a-959a-480f-a726-48fec9e85ab0\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8c95w" Apr 22 14:23:12.060600 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:12.060573 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mv798\" (UniqueName: \"kubernetes.io/projected/d9e7df5a-959a-480f-a726-48fec9e85ab0-kube-api-access-mv798\") pod \"llmisvc-controller-manager-68cc5db7c4-8c95w\" (UID: \"d9e7df5a-959a-480f-a726-48fec9e85ab0\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8c95w" Apr 22 14:23:12.060769 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:12.060607 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2dzrx\" (UniqueName: \"kubernetes.io/projected/5573b29a-5135-48e8-94b9-90474b5003b9-kube-api-access-2dzrx\") pod \"kserve-controller-manager-66cf78b85b-dk7nv\" (UID: \"5573b29a-5135-48e8-94b9-90474b5003b9\") " pod="kserve/kserve-controller-manager-66cf78b85b-dk7nv" Apr 22 14:23:12.060769 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:12.060635 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5573b29a-5135-48e8-94b9-90474b5003b9-cert\") pod \"kserve-controller-manager-66cf78b85b-dk7nv\" (UID: \"5573b29a-5135-48e8-94b9-90474b5003b9\") " pod="kserve/kserve-controller-manager-66cf78b85b-dk7nv" Apr 22 14:23:12.060769 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:12.060683 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d9e7df5a-959a-480f-a726-48fec9e85ab0-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-8c95w\" (UID: \"d9e7df5a-959a-480f-a726-48fec9e85ab0\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8c95w" Apr 22 14:23:12.060879 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:23:12.060809 2575 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 22 14:23:12.060912 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:23:12.060879 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5573b29a-5135-48e8-94b9-90474b5003b9-cert podName:5573b29a-5135-48e8-94b9-90474b5003b9 nodeName:}" failed. No retries permitted until 2026-04-22 14:23:12.560864337 +0000 UTC m=+467.492243223 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5573b29a-5135-48e8-94b9-90474b5003b9-cert") pod "kserve-controller-manager-66cf78b85b-dk7nv" (UID: "5573b29a-5135-48e8-94b9-90474b5003b9") : secret "kserve-webhook-server-cert" not found Apr 22 14:23:12.063016 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:12.062988 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d9e7df5a-959a-480f-a726-48fec9e85ab0-cert\") pod \"llmisvc-controller-manager-68cc5db7c4-8c95w\" (UID: \"d9e7df5a-959a-480f-a726-48fec9e85ab0\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8c95w" Apr 22 14:23:12.072234 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:12.072214 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dzrx\" (UniqueName: \"kubernetes.io/projected/5573b29a-5135-48e8-94b9-90474b5003b9-kube-api-access-2dzrx\") pod \"kserve-controller-manager-66cf78b85b-dk7nv\" (UID: \"5573b29a-5135-48e8-94b9-90474b5003b9\") " pod="kserve/kserve-controller-manager-66cf78b85b-dk7nv" Apr 22 14:23:12.073949 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:12.073928 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv798\" (UniqueName: \"kubernetes.io/projected/d9e7df5a-959a-480f-a726-48fec9e85ab0-kube-api-access-mv798\") pod \"llmisvc-controller-manager-68cc5db7c4-8c95w\" (UID: \"d9e7df5a-959a-480f-a726-48fec9e85ab0\") " pod="kserve/llmisvc-controller-manager-68cc5db7c4-8c95w" Apr 22 14:23:12.182519 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:12.182483 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8c95w" Apr 22 14:23:12.308928 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:12.308808 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-68cc5db7c4-8c95w"] Apr 22 14:23:12.311648 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:23:12.311620 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd9e7df5a_959a_480f_a726_48fec9e85ab0.slice/crio-68093e9a8396b21a6c3c92b8a9e0245fcd84d49b83a3001e1440f43ecb681aa2 WatchSource:0}: Error finding container 68093e9a8396b21a6c3c92b8a9e0245fcd84d49b83a3001e1440f43ecb681aa2: Status 404 returned error can't find the container with id 68093e9a8396b21a6c3c92b8a9e0245fcd84d49b83a3001e1440f43ecb681aa2 Apr 22 14:23:12.566468 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:12.566385 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5573b29a-5135-48e8-94b9-90474b5003b9-cert\") pod \"kserve-controller-manager-66cf78b85b-dk7nv\" (UID: \"5573b29a-5135-48e8-94b9-90474b5003b9\") " pod="kserve/kserve-controller-manager-66cf78b85b-dk7nv" Apr 22 14:23:12.566613 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:23:12.566499 2575 secret.go:189] Couldn't get secret kserve/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 22 14:23:12.566613 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:23:12.566561 2575 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5573b29a-5135-48e8-94b9-90474b5003b9-cert podName:5573b29a-5135-48e8-94b9-90474b5003b9 nodeName:}" failed. No retries permitted until 2026-04-22 14:23:13.566547538 +0000 UTC m=+468.497926418 (durationBeforeRetry 1s). 
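
Note the retry spacing for the kserve "cert" volume: the first failure is requeued with durationBeforeRetry 500ms, the second with 1s, the same 500ms-to-1s doubling seen for the keda "certificates" volume at 14:21:58. That is consistent with per-volume exponential backoff in nestedpendingoperations; a minimal sketch of the pattern, where the initial delay is taken from the log and the cap is an assumed placeholder:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 500 * time.Millisecond // first retry delay, as in the log
	const maxDelay = 2 * time.Minute // assumed cap, for illustration only
	for i := 0; i < 5; i++ {
		fmt.Println("retry in", delay) // 500ms, 1s, 2s, 4s, 8s
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```

Backoff keeps a missing secret from becoming a hot loop while still resolving quickly once the secret appears, as happens at 14:23:13.579816 below.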
Apr 22 14:23:13.006516 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:13.006474 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8c95w" event={"ID":"d9e7df5a-959a-480f-a726-48fec9e85ab0","Type":"ContainerStarted","Data":"68093e9a8396b21a6c3c92b8a9e0245fcd84d49b83a3001e1440f43ecb681aa2"}
Apr 22 14:23:13.577057 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:13.577018 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5573b29a-5135-48e8-94b9-90474b5003b9-cert\") pod \"kserve-controller-manager-66cf78b85b-dk7nv\" (UID: \"5573b29a-5135-48e8-94b9-90474b5003b9\") " pod="kserve/kserve-controller-manager-66cf78b85b-dk7nv"
Apr 22 14:23:13.579816 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:13.579788 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5573b29a-5135-48e8-94b9-90474b5003b9-cert\") pod \"kserve-controller-manager-66cf78b85b-dk7nv\" (UID: \"5573b29a-5135-48e8-94b9-90474b5003b9\") " pod="kserve/kserve-controller-manager-66cf78b85b-dk7nv"
Apr 22 14:23:13.674674 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:13.674637 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-dk7nv"
Apr 22 14:23:13.853713 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:13.853642 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-dk7nv"]
Apr 22 14:23:14.282623 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:23:14.282568 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5573b29a_5135_48e8_94b9_90474b5003b9.slice/crio-7f4153963c5cccf11079cfbe80e849938a349d6bc984e681e777c62e8fe82bf1 WatchSource:0}: Error finding container 7f4153963c5cccf11079cfbe80e849938a349d6bc984e681e777c62e8fe82bf1: Status 404 returned error can't find the container with id 7f4153963c5cccf11079cfbe80e849938a349d6bc984e681e777c62e8fe82bf1
Apr 22 14:23:15.014651 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:15.014618 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-dk7nv" event={"ID":"5573b29a-5135-48e8-94b9-90474b5003b9","Type":"ContainerStarted","Data":"7f4153963c5cccf11079cfbe80e849938a349d6bc984e681e777c62e8fe82bf1"}
Apr 22 14:23:15.016375 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:15.016336 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8c95w" event={"ID":"d9e7df5a-959a-480f-a726-48fec9e85ab0","Type":"ContainerStarted","Data":"c8d99ae372f4b1463209b91950159372e84fe67fc021138372057b3c852af834"}
Apr 22 14:23:15.016692 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:15.016654 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8c95w"
Apr 22 14:23:15.122000 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:15.121942 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8c95w" podStartSLOduration=2.1038752450000002 podStartE2EDuration="4.121928079s" podCreationTimestamp="2026-04-22 14:23:11 +0000 UTC" firstStartedPulling="2026-04-22 14:23:12.312852705 +0000 UTC m=+467.244231586" lastFinishedPulling="2026-04-22 14:23:14.330905538 +0000 UTC m=+469.262284420" observedRunningTime="2026-04-22 14:23:15.118647001 +0000 UTC m=+470.050025909" watchObservedRunningTime="2026-04-22 14:23:15.121928079 +0000 UTC m=+470.053306982"
Apr 22 14:23:17.024209 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:17.024123 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-dk7nv" event={"ID":"5573b29a-5135-48e8-94b9-90474b5003b9","Type":"ContainerStarted","Data":"9d1e4506466875aca37204188561df5b4515d664680150aa6c25cead49046685"}
Apr 22 14:23:17.024549 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:17.024218 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-66cf78b85b-dk7nv"
Apr 22 14:23:17.059744 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:17.059695 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-66cf78b85b-dk7nv" podStartSLOduration=3.691511187 podStartE2EDuration="6.059681317s" podCreationTimestamp="2026-04-22 14:23:11 +0000 UTC" firstStartedPulling="2026-04-22 14:23:14.283814902 +0000 UTC m=+469.215193785" lastFinishedPulling="2026-04-22 14:23:16.651985031 +0000 UTC m=+471.583363915" observedRunningTime="2026-04-22 14:23:17.059250031 +0000 UTC m=+471.990628933" watchObservedRunningTime="2026-04-22 14:23:17.059681317 +0000 UTC m=+471.991060259"
Apr 22 14:23:46.022532 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:46.022490 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-68cc5db7c4-8c95w"
Apr 22 14:23:46.184722 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:46.184691 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-b6df4bdb9-jmhmv"]
Apr 22 14:23:46.191466 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:46.191441 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b6df4bdb9-jmhmv"
Apr 22 14:23:46.202057 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:46.202035 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b6df4bdb9-jmhmv"]
Apr 22 14:23:46.319524 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:46.319446 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55b4ee4d-9c48-4383-a50b-33e5c7ab6532-service-ca\") pod \"console-b6df4bdb9-jmhmv\" (UID: \"55b4ee4d-9c48-4383-a50b-33e5c7ab6532\") " pod="openshift-console/console-b6df4bdb9-jmhmv"
Apr 22 14:23:46.319524 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:46.319489 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55b4ee4d-9c48-4383-a50b-33e5c7ab6532-trusted-ca-bundle\") pod \"console-b6df4bdb9-jmhmv\" (UID: \"55b4ee4d-9c48-4383-a50b-33e5c7ab6532\") " pod="openshift-console/console-b6df4bdb9-jmhmv"
Apr 22 14:23:46.319701 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:46.319547 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bngxh\" (UniqueName: \"kubernetes.io/projected/55b4ee4d-9c48-4383-a50b-33e5c7ab6532-kube-api-access-bngxh\") pod \"console-b6df4bdb9-jmhmv\" (UID: \"55b4ee4d-9c48-4383-a50b-33e5c7ab6532\") " pod="openshift-console/console-b6df4bdb9-jmhmv"
Apr 22 14:23:46.319701 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:46.319590 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55b4ee4d-9c48-4383-a50b-33e5c7ab6532-console-serving-cert\") pod \"console-b6df4bdb9-jmhmv\" (UID: \"55b4ee4d-9c48-4383-a50b-33e5c7ab6532\") " pod="openshift-console/console-b6df4bdb9-jmhmv"
Apr 22 14:23:46.319701 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:46.319618 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55b4ee4d-9c48-4383-a50b-33e5c7ab6532-console-config\") pod \"console-b6df4bdb9-jmhmv\" (UID: \"55b4ee4d-9c48-4383-a50b-33e5c7ab6532\") " pod="openshift-console/console-b6df4bdb9-jmhmv"
Apr 22 14:23:46.319701 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:46.319681 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55b4ee4d-9c48-4383-a50b-33e5c7ab6532-oauth-serving-cert\") pod \"console-b6df4bdb9-jmhmv\" (UID: \"55b4ee4d-9c48-4383-a50b-33e5c7ab6532\") " pod="openshift-console/console-b6df4bdb9-jmhmv"
Apr 22 14:23:46.319853 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:46.319705 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55b4ee4d-9c48-4383-a50b-33e5c7ab6532-console-oauth-config\") pod \"console-b6df4bdb9-jmhmv\" (UID: \"55b4ee4d-9c48-4383-a50b-33e5c7ab6532\") " pod="openshift-console/console-b6df4bdb9-jmhmv"
Apr 22 14:23:46.420671 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:46.420636 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55b4ee4d-9c48-4383-a50b-33e5c7ab6532-oauth-serving-cert\") pod \"console-b6df4bdb9-jmhmv\" (UID: \"55b4ee4d-9c48-4383-a50b-33e5c7ab6532\") " pod="openshift-console/console-b6df4bdb9-jmhmv"
Apr 22 14:23:46.420849 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:46.420678 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55b4ee4d-9c48-4383-a50b-33e5c7ab6532-console-oauth-config\") pod \"console-b6df4bdb9-jmhmv\" (UID: \"55b4ee4d-9c48-4383-a50b-33e5c7ab6532\") " pod="openshift-console/console-b6df4bdb9-jmhmv"
Apr 22 14:23:46.420849 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:46.420723 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55b4ee4d-9c48-4383-a50b-33e5c7ab6532-service-ca\") pod \"console-b6df4bdb9-jmhmv\" (UID: \"55b4ee4d-9c48-4383-a50b-33e5c7ab6532\") " pod="openshift-console/console-b6df4bdb9-jmhmv"
Apr 22 14:23:46.420958 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:46.420881 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55b4ee4d-9c48-4383-a50b-33e5c7ab6532-trusted-ca-bundle\") pod \"console-b6df4bdb9-jmhmv\" (UID: \"55b4ee4d-9c48-4383-a50b-33e5c7ab6532\") " pod="openshift-console/console-b6df4bdb9-jmhmv"
Apr 22 14:23:46.420958 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:46.420928 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bngxh\" (UniqueName: \"kubernetes.io/projected/55b4ee4d-9c48-4383-a50b-33e5c7ab6532-kube-api-access-bngxh\") pod \"console-b6df4bdb9-jmhmv\" (UID: \"55b4ee4d-9c48-4383-a50b-33e5c7ab6532\") " pod="openshift-console/console-b6df4bdb9-jmhmv"
Apr 22 14:23:46.421069 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:46.420964 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55b4ee4d-9c48-4383-a50b-33e5c7ab6532-console-serving-cert\") pod \"console-b6df4bdb9-jmhmv\" (UID: \"55b4ee4d-9c48-4383-a50b-33e5c7ab6532\") " pod="openshift-console/console-b6df4bdb9-jmhmv"
Apr 22 14:23:46.421069 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:46.421002 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55b4ee4d-9c48-4383-a50b-33e5c7ab6532-console-config\") pod \"console-b6df4bdb9-jmhmv\" (UID: \"55b4ee4d-9c48-4383-a50b-33e5c7ab6532\") " pod="openshift-console/console-b6df4bdb9-jmhmv"
Apr 22 14:23:46.421572 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:46.421544 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/55b4ee4d-9c48-4383-a50b-33e5c7ab6532-service-ca\") pod \"console-b6df4bdb9-jmhmv\" (UID: \"55b4ee4d-9c48-4383-a50b-33e5c7ab6532\") " pod="openshift-console/console-b6df4bdb9-jmhmv"
Apr 22 14:23:46.421572 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:46.421564 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/55b4ee4d-9c48-4383-a50b-33e5c7ab6532-oauth-serving-cert\") pod \"console-b6df4bdb9-jmhmv\" (UID: \"55b4ee4d-9c48-4383-a50b-33e5c7ab6532\") " pod="openshift-console/console-b6df4bdb9-jmhmv"
Apr 22 14:23:46.421743 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:46.421687 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/55b4ee4d-9c48-4383-a50b-33e5c7ab6532-console-config\") pod \"console-b6df4bdb9-jmhmv\" (UID: \"55b4ee4d-9c48-4383-a50b-33e5c7ab6532\") " pod="openshift-console/console-b6df4bdb9-jmhmv"
Apr 22 14:23:46.421743 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:46.421692 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/55b4ee4d-9c48-4383-a50b-33e5c7ab6532-trusted-ca-bundle\") pod \"console-b6df4bdb9-jmhmv\" (UID: \"55b4ee4d-9c48-4383-a50b-33e5c7ab6532\") " pod="openshift-console/console-b6df4bdb9-jmhmv"
Apr 22 14:23:46.423046 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:46.423026 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/55b4ee4d-9c48-4383-a50b-33e5c7ab6532-console-oauth-config\") pod \"console-b6df4bdb9-jmhmv\" (UID: \"55b4ee4d-9c48-4383-a50b-33e5c7ab6532\") " pod="openshift-console/console-b6df4bdb9-jmhmv"
Apr 22 14:23:46.423306 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:46.423287 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/55b4ee4d-9c48-4383-a50b-33e5c7ab6532-console-serving-cert\") pod \"console-b6df4bdb9-jmhmv\" (UID: \"55b4ee4d-9c48-4383-a50b-33e5c7ab6532\") " pod="openshift-console/console-b6df4bdb9-jmhmv"
Apr 22 14:23:46.430719 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:46.430696 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bngxh\" (UniqueName: \"kubernetes.io/projected/55b4ee4d-9c48-4383-a50b-33e5c7ab6532-kube-api-access-bngxh\") pod \"console-b6df4bdb9-jmhmv\" (UID: \"55b4ee4d-9c48-4383-a50b-33e5c7ab6532\") " pod="openshift-console/console-b6df4bdb9-jmhmv"
Apr 22 14:23:46.501478 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:46.501452 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b6df4bdb9-jmhmv"
Apr 22 14:23:46.623928 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:46.623893 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b6df4bdb9-jmhmv"]
Apr 22 14:23:46.626200 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:23:46.626170 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod55b4ee4d_9c48_4383_a50b_33e5c7ab6532.slice/crio-7f1f7e7e0528795d7ff977d42d89549d408ecc22fdfb627f70c144a6bece340e WatchSource:0}: Error finding container 7f1f7e7e0528795d7ff977d42d89549d408ecc22fdfb627f70c144a6bece340e: Status 404 returned error can't find the container with id 7f1f7e7e0528795d7ff977d42d89549d408ecc22fdfb627f70c144a6bece340e
Apr 22 14:23:47.123110 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:47.123078 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b6df4bdb9-jmhmv" event={"ID":"55b4ee4d-9c48-4383-a50b-33e5c7ab6532","Type":"ContainerStarted","Data":"4a69a873c56e1c8f0989da26d241705c68d6d15aa0fa2c55ce3cc3edfa8f3ed6"}
Apr 22 14:23:47.123110 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:47.123110 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b6df4bdb9-jmhmv" event={"ID":"55b4ee4d-9c48-4383-a50b-33e5c7ab6532","Type":"ContainerStarted","Data":"7f1f7e7e0528795d7ff977d42d89549d408ecc22fdfb627f70c144a6bece340e"}
Apr 22 14:23:47.155509 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:47.155470 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-b6df4bdb9-jmhmv" podStartSLOduration=1.155458352 podStartE2EDuration="1.155458352s" podCreationTimestamp="2026-04-22 14:23:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:23:47.155197831 +0000 UTC m=+502.086576747" watchObservedRunningTime="2026-04-22 14:23:47.155458352 +0000 UTC m=+502.086837254"
Apr 22 14:23:47.505972 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:47.505948 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-dk7nv"]
Apr 22 14:23:47.506197 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:47.506173 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-66cf78b85b-dk7nv" podUID="5573b29a-5135-48e8-94b9-90474b5003b9" containerName="manager" containerID="cri-o://9d1e4506466875aca37204188561df5b4515d664680150aa6c25cead49046685" gracePeriod=10
Apr 22 14:23:47.511300 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:47.511281 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-66cf78b85b-dk7nv"
Apr 22 14:23:47.536553 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:47.536531 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-pfht4"]
Apr 22 14:23:47.539845 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:47.539818 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-pfht4"
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-pfht4" Apr 22 14:23:47.549563 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:47.549537 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-pfht4"] Apr 22 14:23:47.630802 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:47.630775 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrj4j\" (UniqueName: \"kubernetes.io/projected/5d0c833d-c2c9-4b9d-bdcf-3d47aaf26f9a-kube-api-access-vrj4j\") pod \"kserve-controller-manager-66cf78b85b-pfht4\" (UID: \"5d0c833d-c2c9-4b9d-bdcf-3d47aaf26f9a\") " pod="kserve/kserve-controller-manager-66cf78b85b-pfht4" Apr 22 14:23:47.630901 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:47.630858 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d0c833d-c2c9-4b9d-bdcf-3d47aaf26f9a-cert\") pod \"kserve-controller-manager-66cf78b85b-pfht4\" (UID: \"5d0c833d-c2c9-4b9d-bdcf-3d47aaf26f9a\") " pod="kserve/kserve-controller-manager-66cf78b85b-pfht4" Apr 22 14:23:47.731677 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:47.731654 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d0c833d-c2c9-4b9d-bdcf-3d47aaf26f9a-cert\") pod \"kserve-controller-manager-66cf78b85b-pfht4\" (UID: \"5d0c833d-c2c9-4b9d-bdcf-3d47aaf26f9a\") " pod="kserve/kserve-controller-manager-66cf78b85b-pfht4" Apr 22 14:23:47.731788 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:47.731714 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vrj4j\" (UniqueName: \"kubernetes.io/projected/5d0c833d-c2c9-4b9d-bdcf-3d47aaf26f9a-kube-api-access-vrj4j\") pod \"kserve-controller-manager-66cf78b85b-pfht4\" (UID: \"5d0c833d-c2c9-4b9d-bdcf-3d47aaf26f9a\") " pod="kserve/kserve-controller-manager-66cf78b85b-pfht4" Apr 22 14:23:47.734003 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:47.733979 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5d0c833d-c2c9-4b9d-bdcf-3d47aaf26f9a-cert\") pod \"kserve-controller-manager-66cf78b85b-pfht4\" (UID: \"5d0c833d-c2c9-4b9d-bdcf-3d47aaf26f9a\") " pod="kserve/kserve-controller-manager-66cf78b85b-pfht4" Apr 22 14:23:47.740388 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:47.740367 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrj4j\" (UniqueName: \"kubernetes.io/projected/5d0c833d-c2c9-4b9d-bdcf-3d47aaf26f9a-kube-api-access-vrj4j\") pod \"kserve-controller-manager-66cf78b85b-pfht4\" (UID: \"5d0c833d-c2c9-4b9d-bdcf-3d47aaf26f9a\") " pod="kserve/kserve-controller-manager-66cf78b85b-pfht4" Apr 22 14:23:47.745029 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:47.745013 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-dk7nv" Apr 22 14:23:47.832817 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:47.832738 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5573b29a-5135-48e8-94b9-90474b5003b9-cert\") pod \"5573b29a-5135-48e8-94b9-90474b5003b9\" (UID: \"5573b29a-5135-48e8-94b9-90474b5003b9\") " Apr 22 14:23:47.832817 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:47.832798 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2dzrx\" (UniqueName: \"kubernetes.io/projected/5573b29a-5135-48e8-94b9-90474b5003b9-kube-api-access-2dzrx\") pod \"5573b29a-5135-48e8-94b9-90474b5003b9\" (UID: \"5573b29a-5135-48e8-94b9-90474b5003b9\") " Apr 22 14:23:47.834712 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:47.834691 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5573b29a-5135-48e8-94b9-90474b5003b9-cert" (OuterVolumeSpecName: "cert") pod "5573b29a-5135-48e8-94b9-90474b5003b9" (UID: "5573b29a-5135-48e8-94b9-90474b5003b9"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:23:47.834813 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:47.834713 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5573b29a-5135-48e8-94b9-90474b5003b9-kube-api-access-2dzrx" (OuterVolumeSpecName: "kube-api-access-2dzrx") pod "5573b29a-5135-48e8-94b9-90474b5003b9" (UID: "5573b29a-5135-48e8-94b9-90474b5003b9"). InnerVolumeSpecName "kube-api-access-2dzrx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:23:47.888287 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:47.888262 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-pfht4" Apr 22 14:23:47.934391 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:47.934365 2575 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5573b29a-5135-48e8-94b9-90474b5003b9-cert\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 22 14:23:47.934488 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:47.934400 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2dzrx\" (UniqueName: \"kubernetes.io/projected/5573b29a-5135-48e8-94b9-90474b5003b9-kube-api-access-2dzrx\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 22 14:23:48.009447 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:48.009421 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-pfht4"] Apr 22 14:23:48.011790 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:23:48.011766 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d0c833d_c2c9_4b9d_bdcf_3d47aaf26f9a.slice/crio-1c6433fd24af2ca6d508707838e40f9c30ef53d2b9dd43dcb025e232bb343916 WatchSource:0}: Error finding container 1c6433fd24af2ca6d508707838e40f9c30ef53d2b9dd43dcb025e232bb343916: Status 404 returned error can't find the container with id 1c6433fd24af2ca6d508707838e40f9c30ef53d2b9dd43dcb025e232bb343916 Apr 22 14:23:48.127576 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:48.127511 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-pfht4" event={"ID":"5d0c833d-c2c9-4b9d-bdcf-3d47aaf26f9a","Type":"ContainerStarted","Data":"1c6433fd24af2ca6d508707838e40f9c30ef53d2b9dd43dcb025e232bb343916"} Apr 22 14:23:48.128592 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:48.128557 2575 generic.go:358] "Generic (PLEG): container finished" podID="5573b29a-5135-48e8-94b9-90474b5003b9" containerID="9d1e4506466875aca37204188561df5b4515d664680150aa6c25cead49046685" exitCode=0 Apr 22 14:23:48.128681 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:48.128614 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-dk7nv" event={"ID":"5573b29a-5135-48e8-94b9-90474b5003b9","Type":"ContainerDied","Data":"9d1e4506466875aca37204188561df5b4515d664680150aa6c25cead49046685"} Apr 22 14:23:48.128681 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:48.128621 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-66cf78b85b-dk7nv" Apr 22 14:23:48.128681 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:48.128648 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-dk7nv" event={"ID":"5573b29a-5135-48e8-94b9-90474b5003b9","Type":"ContainerDied","Data":"7f4153963c5cccf11079cfbe80e849938a349d6bc984e681e777c62e8fe82bf1"} Apr 22 14:23:48.128681 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:48.128665 2575 scope.go:117] "RemoveContainer" containerID="9d1e4506466875aca37204188561df5b4515d664680150aa6c25cead49046685" Apr 22 14:23:48.136888 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:48.136869 2575 scope.go:117] "RemoveContainer" containerID="9d1e4506466875aca37204188561df5b4515d664680150aa6c25cead49046685" Apr 22 14:23:48.137114 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:23:48.137097 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d1e4506466875aca37204188561df5b4515d664680150aa6c25cead49046685\": container with ID starting with 9d1e4506466875aca37204188561df5b4515d664680150aa6c25cead49046685 not found: ID does not exist" containerID="9d1e4506466875aca37204188561df5b4515d664680150aa6c25cead49046685" Apr 22 14:23:48.137166 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:48.137121 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d1e4506466875aca37204188561df5b4515d664680150aa6c25cead49046685"} err="failed to get container status \"9d1e4506466875aca37204188561df5b4515d664680150aa6c25cead49046685\": rpc error: code = NotFound desc = could not find container \"9d1e4506466875aca37204188561df5b4515d664680150aa6c25cead49046685\": container with ID starting with 9d1e4506466875aca37204188561df5b4515d664680150aa6c25cead49046685 not found: ID does not exist" Apr 22 14:23:48.156924 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:48.156903 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-dk7nv"] Apr 22 14:23:48.162355 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:48.162333 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-66cf78b85b-dk7nv"] Apr 22 14:23:49.133071 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:49.133040 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-66cf78b85b-pfht4" event={"ID":"5d0c833d-c2c9-4b9d-bdcf-3d47aaf26f9a","Type":"ContainerStarted","Data":"9af175497aca7997197ffd9dbe9ee23e37be5684d52287f204d50391a9933eca"} Apr 22 14:23:49.133531 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:49.133108 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-66cf78b85b-pfht4" Apr 22 14:23:49.153806 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:49.153769 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-66cf78b85b-pfht4" podStartSLOduration=1.833152353 podStartE2EDuration="2.15374075s" podCreationTimestamp="2026-04-22 14:23:47 +0000 UTC" firstStartedPulling="2026-04-22 14:23:48.013096416 +0000 UTC m=+502.944475298" lastFinishedPulling="2026-04-22 14:23:48.333684813 +0000 UTC m=+503.265063695" observedRunningTime="2026-04-22 14:23:49.151038624 +0000 UTC m=+504.082417531" watchObservedRunningTime="2026-04-22 14:23:49.15374075 +0000 UTC m=+504.085119653" Apr 22 14:23:49.562998 ip-10-0-130-98 
kubenswrapper[2575]: I0422 14:23:49.562960 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5573b29a-5135-48e8-94b9-90474b5003b9" path="/var/lib/kubelet/pods/5573b29a-5135-48e8-94b9-90474b5003b9/volumes" Apr 22 14:23:56.502516 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:56.502478 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-b6df4bdb9-jmhmv" Apr 22 14:23:56.503004 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:56.502525 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-b6df4bdb9-jmhmv" Apr 22 14:23:56.506997 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:56.506974 2575 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-b6df4bdb9-jmhmv" Apr 22 14:23:57.164536 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:57.164507 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-b6df4bdb9-jmhmv" Apr 22 14:23:57.221885 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:23:57.221853 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-589f6fbd59-wpphn"] Apr 22 14:24:20.141945 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:20.141871 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-66cf78b85b-pfht4" Apr 22 14:24:22.246390 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:22.246350 2575 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-589f6fbd59-wpphn" podUID="e42c29c3-328c-4626-a9ab-e26b2e8f4036" containerName="console" containerID="cri-o://d74bd2aba84dce3631dcda91812997001725278f2a25f8e2eb13ffa68367e0f2" gracePeriod=15 Apr 22 14:24:22.489833 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:22.489813 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-589f6fbd59-wpphn_e42c29c3-328c-4626-a9ab-e26b2e8f4036/console/0.log" Apr 22 14:24:22.489938 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:22.489872 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:24:22.595195 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:22.595112 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e42c29c3-328c-4626-a9ab-e26b2e8f4036-trusted-ca-bundle\") pod \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " Apr 22 14:24:22.595195 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:22.595151 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e42c29c3-328c-4626-a9ab-e26b2e8f4036-console-config\") pod \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " Apr 22 14:24:22.595195 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:22.595193 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e42c29c3-328c-4626-a9ab-e26b2e8f4036-service-ca\") pod \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " Apr 22 14:24:22.595456 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:22.595215 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e42c29c3-328c-4626-a9ab-e26b2e8f4036-console-serving-cert\") pod \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " Apr 22 14:24:22.595456 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:22.595263 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q288v\" (UniqueName: \"kubernetes.io/projected/e42c29c3-328c-4626-a9ab-e26b2e8f4036-kube-api-access-q288v\") pod \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " Apr 22 14:24:22.595456 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:22.595293 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e42c29c3-328c-4626-a9ab-e26b2e8f4036-console-oauth-config\") pod \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " Apr 22 14:24:22.595456 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:22.595325 2575 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e42c29c3-328c-4626-a9ab-e26b2e8f4036-oauth-serving-cert\") pod \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\" (UID: \"e42c29c3-328c-4626-a9ab-e26b2e8f4036\") " Apr 22 14:24:22.595635 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:22.595530 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e42c29c3-328c-4626-a9ab-e26b2e8f4036-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e42c29c3-328c-4626-a9ab-e26b2e8f4036" (UID: "e42c29c3-328c-4626-a9ab-e26b2e8f4036"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:24:22.595635 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:22.595613 2575 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e42c29c3-328c-4626-a9ab-e26b2e8f4036-trusted-ca-bundle\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 22 14:24:22.595732 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:22.595636 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e42c29c3-328c-4626-a9ab-e26b2e8f4036-service-ca" (OuterVolumeSpecName: "service-ca") pod "e42c29c3-328c-4626-a9ab-e26b2e8f4036" (UID: "e42c29c3-328c-4626-a9ab-e26b2e8f4036"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:24:22.595732 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:22.595643 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e42c29c3-328c-4626-a9ab-e26b2e8f4036-console-config" (OuterVolumeSpecName: "console-config") pod "e42c29c3-328c-4626-a9ab-e26b2e8f4036" (UID: "e42c29c3-328c-4626-a9ab-e26b2e8f4036"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:24:22.595966 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:22.595941 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e42c29c3-328c-4626-a9ab-e26b2e8f4036-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e42c29c3-328c-4626-a9ab-e26b2e8f4036" (UID: "e42c29c3-328c-4626-a9ab-e26b2e8f4036"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:24:22.597644 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:22.597618 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42c29c3-328c-4626-a9ab-e26b2e8f4036-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e42c29c3-328c-4626-a9ab-e26b2e8f4036" (UID: "e42c29c3-328c-4626-a9ab-e26b2e8f4036"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:24:22.598028 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:22.598001 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e42c29c3-328c-4626-a9ab-e26b2e8f4036-kube-api-access-q288v" (OuterVolumeSpecName: "kube-api-access-q288v") pod "e42c29c3-328c-4626-a9ab-e26b2e8f4036" (UID: "e42c29c3-328c-4626-a9ab-e26b2e8f4036"). InnerVolumeSpecName "kube-api-access-q288v". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:24:22.598028 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:22.598017 2575 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e42c29c3-328c-4626-a9ab-e26b2e8f4036-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e42c29c3-328c-4626-a9ab-e26b2e8f4036" (UID: "e42c29c3-328c-4626-a9ab-e26b2e8f4036"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:24:22.696576 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:22.696547 2575 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e42c29c3-328c-4626-a9ab-e26b2e8f4036-console-oauth-config\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 22 14:24:22.696576 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:22.696573 2575 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e42c29c3-328c-4626-a9ab-e26b2e8f4036-oauth-serving-cert\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 22 14:24:22.696576 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:22.696583 2575 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e42c29c3-328c-4626-a9ab-e26b2e8f4036-console-config\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 22 14:24:22.696799 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:22.696593 2575 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e42c29c3-328c-4626-a9ab-e26b2e8f4036-service-ca\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 22 14:24:22.696799 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:22.696601 2575 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e42c29c3-328c-4626-a9ab-e26b2e8f4036-console-serving-cert\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 22 14:24:22.696799 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:22.696610 2575 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q288v\" (UniqueName: \"kubernetes.io/projected/e42c29c3-328c-4626-a9ab-e26b2e8f4036-kube-api-access-q288v\") on node \"ip-10-0-130-98.ec2.internal\" DevicePath \"\"" Apr 22 14:24:23.249369 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:23.249345 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-589f6fbd59-wpphn_e42c29c3-328c-4626-a9ab-e26b2e8f4036/console/0.log" Apr 22 14:24:23.249792 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:23.249384 2575 generic.go:358] "Generic (PLEG): container finished" podID="e42c29c3-328c-4626-a9ab-e26b2e8f4036" containerID="d74bd2aba84dce3631dcda91812997001725278f2a25f8e2eb13ffa68367e0f2" exitCode=2 Apr 22 14:24:23.249792 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:23.249419 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-589f6fbd59-wpphn" event={"ID":"e42c29c3-328c-4626-a9ab-e26b2e8f4036","Type":"ContainerDied","Data":"d74bd2aba84dce3631dcda91812997001725278f2a25f8e2eb13ffa68367e0f2"} Apr 22 14:24:23.249792 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:23.249458 2575 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-589f6fbd59-wpphn" Apr 22 14:24:23.249792 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:23.249469 2575 scope.go:117] "RemoveContainer" containerID="d74bd2aba84dce3631dcda91812997001725278f2a25f8e2eb13ffa68367e0f2" Apr 22 14:24:23.249792 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:23.249460 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-589f6fbd59-wpphn" event={"ID":"e42c29c3-328c-4626-a9ab-e26b2e8f4036","Type":"ContainerDied","Data":"1be676421cc9ea011da728dea403928efaf5fd5b7a9d06f37a7073eb74e3ab04"} Apr 22 14:24:23.261144 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:23.261126 2575 scope.go:117] "RemoveContainer" containerID="d74bd2aba84dce3631dcda91812997001725278f2a25f8e2eb13ffa68367e0f2" Apr 22 14:24:23.261415 ip-10-0-130-98 kubenswrapper[2575]: E0422 14:24:23.261395 2575 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d74bd2aba84dce3631dcda91812997001725278f2a25f8e2eb13ffa68367e0f2\": container with ID starting with d74bd2aba84dce3631dcda91812997001725278f2a25f8e2eb13ffa68367e0f2 not found: ID does not exist" containerID="d74bd2aba84dce3631dcda91812997001725278f2a25f8e2eb13ffa68367e0f2" Apr 22 14:24:23.261483 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:23.261426 2575 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d74bd2aba84dce3631dcda91812997001725278f2a25f8e2eb13ffa68367e0f2"} err="failed to get container status \"d74bd2aba84dce3631dcda91812997001725278f2a25f8e2eb13ffa68367e0f2\": rpc error: code = NotFound desc = could not find container \"d74bd2aba84dce3631dcda91812997001725278f2a25f8e2eb13ffa68367e0f2\": container with ID starting with d74bd2aba84dce3631dcda91812997001725278f2a25f8e2eb13ffa68367e0f2 not found: ID does not exist" Apr 22 14:24:23.273404 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:23.273380 2575 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-589f6fbd59-wpphn"] Apr 22 14:24:23.280170 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:23.280151 2575 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-589f6fbd59-wpphn"] Apr 22 14:24:23.561446 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:24:23.561371 2575 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e42c29c3-328c-4626-a9ab-e26b2e8f4036" path="/var/lib/kubelet/pods/e42c29c3-328c-4626-a9ab-e26b2e8f4036/volumes" Apr 22 14:25:25.463151 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:25:25.463122 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2lphb_ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa/console-operator/2.log" Apr 22 14:25:25.465032 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:25:25.465009 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2lphb_ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa/console-operator/2.log" Apr 22 14:25:25.466016 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:25:25.466000 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j58wd_3403a015-2d45-42e8-bf6e-9a0bc6d91e99/ovn-acl-logging/0.log" Apr 22 14:25:25.467853 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:25:25.467831 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j58wd_3403a015-2d45-42e8-bf6e-9a0bc6d91e99/ovn-acl-logging/0.log" Apr 22 14:30:25.486911 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:30:25.486883 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2lphb_ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa/console-operator/2.log" Apr 22 14:30:25.487407 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:30:25.486937 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2lphb_ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa/console-operator/2.log" Apr 22 14:30:25.489730 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:30:25.489714 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j58wd_3403a015-2d45-42e8-bf6e-9a0bc6d91e99/ovn-acl-logging/0.log" Apr 22 14:30:25.489857 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:30:25.489744 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j58wd_3403a015-2d45-42e8-bf6e-9a0bc6d91e99/ovn-acl-logging/0.log" Apr 22 14:35:25.514188 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:35:25.514159 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2lphb_ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa/console-operator/2.log" Apr 22 14:35:25.515971 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:35:25.515946 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2lphb_ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa/console-operator/2.log" Apr 22 14:35:25.517706 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:35:25.517682 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j58wd_3403a015-2d45-42e8-bf6e-9a0bc6d91e99/ovn-acl-logging/0.log" Apr 22 14:35:25.519219 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:35:25.519198 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j58wd_3403a015-2d45-42e8-bf6e-9a0bc6d91e99/ovn-acl-logging/0.log" Apr 22 14:38:53.315477 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:38:53.315447 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-hxhjd_4d18f912-d276-4fce-b5f0-e7157304038e/global-pull-secret-syncer/0.log" Apr 22 14:38:53.390118 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:38:53.390087 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-vj7vs_fc6918bb-f78e-49c2-a990-f709907bd409/konnectivity-agent/0.log" Apr 22 14:38:53.457945 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:38:53.457916 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-98.ec2.internal_148a376889eed68cdbc344a2d41f35d5/haproxy/0.log" Apr 22 14:38:57.187444 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:38:57.187415 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-hbd7w_d9df1470-48cd-4fb5-9710-be943e19f26c/kube-state-metrics/0.log" Apr 22 14:38:57.206587 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:38:57.206561 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-hbd7w_d9df1470-48cd-4fb5-9710-be943e19f26c/kube-rbac-proxy-main/0.log" Apr 22 14:38:57.228517 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:38:57.228480 
2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-hbd7w_d9df1470-48cd-4fb5-9710-be943e19f26c/kube-rbac-proxy-self/0.log" Apr 22 14:38:57.255489 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:38:57.255463 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-54bcf9ffdd-qxdmm_208bd1e8-feae-4ec2-8277-d633ae78860c/metrics-server/0.log" Apr 22 14:38:57.374718 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:38:57.374693 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5sxqr_e0a1aa13-8dd5-4a73-abee-ccd132aef2c4/node-exporter/0.log" Apr 22 14:38:57.405565 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:38:57.405509 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5sxqr_e0a1aa13-8dd5-4a73-abee-ccd132aef2c4/kube-rbac-proxy/0.log" Apr 22 14:38:57.464308 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:38:57.464230 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-5sxqr_e0a1aa13-8dd5-4a73-abee-ccd132aef2c4/init-textfile/0.log" Apr 22 14:38:57.775550 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:38:57.775473 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-xvpq6_193c0bd5-08c3-4fec-a639-8c49c3979a84/prometheus-operator/0.log" Apr 22 14:38:57.801053 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:38:57.801031 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-xvpq6_193c0bd5-08c3-4fec-a639-8c49c3979a84/kube-rbac-proxy/0.log" Apr 22 14:38:57.858821 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:38:57.858795 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-855b7c769d-2gbh4_ff041a88-ef61-414c-b1af-87cc9f897125/telemeter-client/0.log" Apr 22 14:38:57.884472 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:38:57.884448 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-855b7c769d-2gbh4_ff041a88-ef61-414c-b1af-87cc9f897125/reload/0.log" Apr 22 14:38:57.914636 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:38:57.914614 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-855b7c769d-2gbh4_ff041a88-ef61-414c-b1af-87cc9f897125/kube-rbac-proxy/0.log" Apr 22 14:38:59.112419 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:38:59.112389 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-sfp7r_d8e714b0-eddd-43c6-9e24-c61be40fa7f2/networking-console-plugin/0.log" Apr 22 14:38:59.483635 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:38:59.483610 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2lphb_ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa/console-operator/2.log" Apr 22 14:38:59.487495 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:38:59.487479 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-2lphb_ffe149ac-1ad0-48e9-9e0c-461c55ebc4fa/console-operator/3.log" Apr 22 14:38:59.864735 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:38:59.864664 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b6df4bdb9-jmhmv_55b4ee4d-9c48-4383-a50b-33e5c7ab6532/console/0.log" Apr 22 
14:39:00.255297 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.255272 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-lsmjj_fa3ac17e-df93-4459-ba78-9173887dc2e3/volume-data-source-validator/0.log" Apr 22 14:39:00.408110 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.408081 2575 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-bgl78/perf-node-gather-daemonset-65wsv"] Apr 22 14:39:00.408408 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.408397 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5573b29a-5135-48e8-94b9-90474b5003b9" containerName="manager" Apr 22 14:39:00.408452 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.408410 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="5573b29a-5135-48e8-94b9-90474b5003b9" containerName="manager" Apr 22 14:39:00.408452 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.408429 2575 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e42c29c3-328c-4626-a9ab-e26b2e8f4036" containerName="console" Apr 22 14:39:00.408452 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.408435 2575 state_mem.go:107] "Deleted CPUSet assignment" podUID="e42c29c3-328c-4626-a9ab-e26b2e8f4036" containerName="console" Apr 22 14:39:00.408544 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.408484 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="5573b29a-5135-48e8-94b9-90474b5003b9" containerName="manager" Apr 22 14:39:00.408544 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.408493 2575 memory_manager.go:356] "RemoveStaleState removing state" podUID="e42c29c3-328c-4626-a9ab-e26b2e8f4036" containerName="console" Apr 22 14:39:00.411481 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.411466 2575 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-65wsv" Apr 22 14:39:00.414463 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.414447 2575 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-bgl78\"/\"default-dockercfg-qvmhq\"" Apr 22 14:39:00.415806 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.415793 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-bgl78\"/\"kube-root-ca.crt\"" Apr 22 14:39:00.415872 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.415805 2575 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-bgl78\"/\"openshift-service-ca.crt\"" Apr 22 14:39:00.425153 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.425133 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bgl78/perf-node-gather-daemonset-65wsv"] Apr 22 14:39:00.484947 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.484926 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e2cea33b-e5a7-413f-ad68-756b4aed0c7c-proc\") pod \"perf-node-gather-daemonset-65wsv\" (UID: \"e2cea33b-e5a7-413f-ad68-756b4aed0c7c\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-65wsv" Apr 22 14:39:00.485049 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.484955 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e2cea33b-e5a7-413f-ad68-756b4aed0c7c-sys\") pod \"perf-node-gather-daemonset-65wsv\" (UID: \"e2cea33b-e5a7-413f-ad68-756b4aed0c7c\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-65wsv" Apr 22 14:39:00.485049 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.484976 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67bk2\" (UniqueName: \"kubernetes.io/projected/e2cea33b-e5a7-413f-ad68-756b4aed0c7c-kube-api-access-67bk2\") pod \"perf-node-gather-daemonset-65wsv\" (UID: \"e2cea33b-e5a7-413f-ad68-756b4aed0c7c\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-65wsv" Apr 22 14:39:00.485049 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.485025 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e2cea33b-e5a7-413f-ad68-756b4aed0c7c-lib-modules\") pod \"perf-node-gather-daemonset-65wsv\" (UID: \"e2cea33b-e5a7-413f-ad68-756b4aed0c7c\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-65wsv" Apr 22 14:39:00.485197 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.485103 2575 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e2cea33b-e5a7-413f-ad68-756b4aed0c7c-podres\") pod \"perf-node-gather-daemonset-65wsv\" (UID: \"e2cea33b-e5a7-413f-ad68-756b4aed0c7c\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-65wsv" Apr 22 14:39:00.585570 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.585513 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e2cea33b-e5a7-413f-ad68-756b4aed0c7c-lib-modules\") pod \"perf-node-gather-daemonset-65wsv\" (UID: \"e2cea33b-e5a7-413f-ad68-756b4aed0c7c\") " 
pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-65wsv" Apr 22 14:39:00.585570 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.585561 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e2cea33b-e5a7-413f-ad68-756b4aed0c7c-podres\") pod \"perf-node-gather-daemonset-65wsv\" (UID: \"e2cea33b-e5a7-413f-ad68-756b4aed0c7c\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-65wsv" Apr 22 14:39:00.585719 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.585675 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/e2cea33b-e5a7-413f-ad68-756b4aed0c7c-podres\") pod \"perf-node-gather-daemonset-65wsv\" (UID: \"e2cea33b-e5a7-413f-ad68-756b4aed0c7c\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-65wsv" Apr 22 14:39:00.585719 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.585676 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e2cea33b-e5a7-413f-ad68-756b4aed0c7c-proc\") pod \"perf-node-gather-daemonset-65wsv\" (UID: \"e2cea33b-e5a7-413f-ad68-756b4aed0c7c\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-65wsv" Apr 22 14:39:00.585719 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.585695 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e2cea33b-e5a7-413f-ad68-756b4aed0c7c-lib-modules\") pod \"perf-node-gather-daemonset-65wsv\" (UID: \"e2cea33b-e5a7-413f-ad68-756b4aed0c7c\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-65wsv" Apr 22 14:39:00.585719 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.585715 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e2cea33b-e5a7-413f-ad68-756b4aed0c7c-sys\") pod \"perf-node-gather-daemonset-65wsv\" (UID: \"e2cea33b-e5a7-413f-ad68-756b4aed0c7c\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-65wsv" Apr 22 14:39:00.585910 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.585717 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/e2cea33b-e5a7-413f-ad68-756b4aed0c7c-proc\") pod \"perf-node-gather-daemonset-65wsv\" (UID: \"e2cea33b-e5a7-413f-ad68-756b4aed0c7c\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-65wsv" Apr 22 14:39:00.585910 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.585743 2575 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67bk2\" (UniqueName: \"kubernetes.io/projected/e2cea33b-e5a7-413f-ad68-756b4aed0c7c-kube-api-access-67bk2\") pod \"perf-node-gather-daemonset-65wsv\" (UID: \"e2cea33b-e5a7-413f-ad68-756b4aed0c7c\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-65wsv" Apr 22 14:39:00.585910 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.585801 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e2cea33b-e5a7-413f-ad68-756b4aed0c7c-sys\") pod \"perf-node-gather-daemonset-65wsv\" (UID: \"e2cea33b-e5a7-413f-ad68-756b4aed0c7c\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-65wsv" Apr 22 14:39:00.594222 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.594203 2575 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-67bk2\" (UniqueName: \"kubernetes.io/projected/e2cea33b-e5a7-413f-ad68-756b4aed0c7c-kube-api-access-67bk2\") pod \"perf-node-gather-daemonset-65wsv\" (UID: \"e2cea33b-e5a7-413f-ad68-756b4aed0c7c\") " pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-65wsv" Apr 22 14:39:00.720644 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.720626 2575 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-65wsv" Apr 22 14:39:00.841225 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.841199 2575 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-bgl78/perf-node-gather-daemonset-65wsv"] Apr 22 14:39:00.843375 ip-10-0-130-98 kubenswrapper[2575]: W0422 14:39:00.843353 2575 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode2cea33b_e5a7_413f_ad68_756b4aed0c7c.slice/crio-cd4f55fee850e670b762f76c83e73d22911b27f87290745ed24e25f70ee8cc2a WatchSource:0}: Error finding container cd4f55fee850e670b762f76c83e73d22911b27f87290745ed24e25f70ee8cc2a: Status 404 returned error can't find the container with id cd4f55fee850e670b762f76c83e73d22911b27f87290745ed24e25f70ee8cc2a Apr 22 14:39:00.845264 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.845248 2575 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:39:00.945312 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.945288 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-l4vdv_b515945c-4a63-4512-9132-79ffc9f58ef0/dns/0.log" Apr 22 14:39:00.965554 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:00.965532 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-l4vdv_b515945c-4a63-4512-9132-79ffc9f58ef0/kube-rbac-proxy/0.log" Apr 22 14:39:01.045906 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:01.045884 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-9x8pf_1dfd7d57-a9b2-4910-82a6-1e9bf8576804/dns-node-resolver/0.log" Apr 22 14:39:01.104686 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:01.104636 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-65wsv" event={"ID":"e2cea33b-e5a7-413f-ad68-756b4aed0c7c","Type":"ContainerStarted","Data":"cccb6c68d790d5f11597f38883ca4caef693a0e09837eb069e983b24d0c382c1"} Apr 22 14:39:01.104686 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:01.104663 2575 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-65wsv" event={"ID":"e2cea33b-e5a7-413f-ad68-756b4aed0c7c","Type":"ContainerStarted","Data":"cd4f55fee850e670b762f76c83e73d22911b27f87290745ed24e25f70ee8cc2a"} Apr 22 14:39:01.104850 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:01.104724 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-65wsv" Apr 22 14:39:01.121799 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:01.121763 2575 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-65wsv" podStartSLOduration=1.121739454 podStartE2EDuration="1.121739454s" podCreationTimestamp="2026-04-22 14:39:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:39:01.120840608 
+0000 UTC m=+1416.052219511" watchObservedRunningTime="2026-04-22 14:39:01.121739454 +0000 UTC m=+1416.053118357" Apr 22 14:39:01.437148 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:01.437122 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-54d7ffcdcc-8rpqd_86a84579-b5c4-4078-a90a-5f4a6668d1a0/registry/0.log" Apr 22 14:39:01.474438 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:01.474419 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-97p4n_058163d3-0e8a-40f7-aaa3-382fc9d4f5d4/node-ca/0.log" Apr 22 14:39:02.455168 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:02.455136 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-hq58r_9971a5a9-34ef-4f3c-9183-340e4c5fde1c/serve-healthcheck-canary/0.log" Apr 22 14:39:02.972048 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:02.972021 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tmlr9_925cc614-5f91-4c68-af91-1ddc2bac16bc/kube-rbac-proxy/0.log" Apr 22 14:39:02.992372 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:02.992350 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tmlr9_925cc614-5f91-4c68-af91-1ddc2bac16bc/exporter/0.log" Apr 22 14:39:03.017288 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:03.017268 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-tmlr9_925cc614-5f91-4c68-af91-1ddc2bac16bc/extractor/0.log" Apr 22 14:39:04.911078 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:04.911048 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-66cf78b85b-pfht4_5d0c833d-c2c9-4b9d-bdcf-3d47aaf26f9a/manager/0.log" Apr 22 14:39:04.929158 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:04.929137 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-68cc5db7c4-8c95w_d9e7df5a-959a-480f-a726-48fec9e85ab0/manager/0.log" Apr 22 14:39:07.118034 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:07.118008 2575 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-bgl78/perf-node-gather-daemonset-65wsv" Apr 22 14:39:09.746648 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:09.746615 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-24sl8_c9b4b54a-3a19-409c-818f-f465ef373376/kube-multus/0.log" Apr 22 14:39:10.071187 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:10.071120 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tn9p9_766b2141-267e-41ed-bc88-fc000f360c08/kube-multus-additional-cni-plugins/0.log" Apr 22 14:39:10.091322 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:10.091302 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tn9p9_766b2141-267e-41ed-bc88-fc000f360c08/egress-router-binary-copy/0.log" Apr 22 14:39:10.113069 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:10.113050 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tn9p9_766b2141-267e-41ed-bc88-fc000f360c08/cni-plugins/0.log" Apr 22 14:39:10.134746 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:10.134724 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tn9p9_766b2141-267e-41ed-bc88-fc000f360c08/bond-cni-plugin/0.log" Apr 22 14:39:10.159821 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:10.159792 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tn9p9_766b2141-267e-41ed-bc88-fc000f360c08/routeoverride-cni/0.log" Apr 22 14:39:10.182991 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:10.182971 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tn9p9_766b2141-267e-41ed-bc88-fc000f360c08/whereabouts-cni-bincopy/0.log" Apr 22 14:39:10.203735 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:10.203712 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-tn9p9_766b2141-267e-41ed-bc88-fc000f360c08/whereabouts-cni/0.log" Apr 22 14:39:10.389669 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:10.389585 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-swv2n_1faf2ada-1177-442f-9ee9-4ecd9697e349/network-metrics-daemon/0.log" Apr 22 14:39:10.429251 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:10.429232 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-swv2n_1faf2ada-1177-442f-9ee9-4ecd9697e349/kube-rbac-proxy/0.log" Apr 22 14:39:11.950573 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:11.950547 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j58wd_3403a015-2d45-42e8-bf6e-9a0bc6d91e99/ovn-controller/0.log" Apr 22 14:39:11.968103 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:11.968079 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j58wd_3403a015-2d45-42e8-bf6e-9a0bc6d91e99/ovn-acl-logging/0.log" Apr 22 14:39:11.974732 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:11.974715 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j58wd_3403a015-2d45-42e8-bf6e-9a0bc6d91e99/ovn-acl-logging/1.log" Apr 22 14:39:11.994620 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:11.994598 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j58wd_3403a015-2d45-42e8-bf6e-9a0bc6d91e99/kube-rbac-proxy-node/0.log" Apr 22 14:39:12.017355 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:12.017334 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j58wd_3403a015-2d45-42e8-bf6e-9a0bc6d91e99/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 14:39:12.034951 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:12.034934 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j58wd_3403a015-2d45-42e8-bf6e-9a0bc6d91e99/northd/0.log" Apr 22 14:39:12.057956 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:12.057940 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j58wd_3403a015-2d45-42e8-bf6e-9a0bc6d91e99/nbdb/0.log" Apr 22 14:39:12.079007 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:12.078992 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j58wd_3403a015-2d45-42e8-bf6e-9a0bc6d91e99/sbdb/0.log" Apr 22 14:39:12.178942 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:12.178909 2575 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-j58wd_3403a015-2d45-42e8-bf6e-9a0bc6d91e99/ovnkube-controller/0.log" Apr 22 14:39:13.123385 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:13.123312 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-source-8894fc9bd-4vkql_fc27862a-fe8f-4d00-8591-85e1878bef5a/check-endpoints/0.log" Apr 22 14:39:13.187162 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:13.187134 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-zvxdk_25952960-59a7-4c77-9fc4-71e746c78539/network-check-target-container/0.log" Apr 22 14:39:13.985835 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:13.985807 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-tc7f7_8c828a22-de6c-4a15-a273-d749ea26c601/iptables-alerter/0.log" Apr 22 14:39:14.618135 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:14.618108 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-2b68j_7dfdfa3e-08a0-4ac8-89e1-2cbf687b5329/tuned/0.log" Apr 22 14:39:16.257959 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:16.257929 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-gcj4r_bd3338c3-d102-49e4-905b-c457dea46629/cluster-samples-operator/0.log" Apr 22 14:39:16.272600 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:16.272575 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-6dc5bdb6b4-gcj4r_bd3338c3-d102-49e4-905b-c457dea46629/cluster-samples-operator-watch/0.log" Apr 22 14:39:17.248689 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:17.248613 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-9zbk9_f4c9f80e-9e6b-4087-b460-c87423e02659/service-ca-operator/1.log" Apr 22 14:39:17.249560 ip-10-0-130-98 kubenswrapper[2575]: I0422 14:39:17.249545 2575 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-service-ca-operator_service-ca-operator-d6fc45fc5-9zbk9_f4c9f80e-9e6b-4087-b460-c87423e02659/service-ca-operator/0.log"