Apr 20 19:20:36.279250 ip-10-0-129-98 systemd[1]: Starting Kubernetes Kubelet... Apr 20 19:20:36.652269 ip-10-0-129-98 kubenswrapper[2564]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 20 19:20:36.652269 ip-10-0-129-98 kubenswrapper[2564]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Apr 20 19:20:36.652269 ip-10-0-129-98 kubenswrapper[2564]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 20 19:20:36.652269 ip-10-0-129-98 kubenswrapper[2564]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Apr 20 19:20:36.652269 ip-10-0-129-98 kubenswrapper[2564]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 20 19:20:36.655195 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.655127 2564 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 20 19:20:36.658766 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658751 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 19:20:36.658766 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658766 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 19:20:36.658831 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658770 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 19:20:36.658831 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658774 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 19:20:36.658831 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658777 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 19:20:36.658831 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658780 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 19:20:36.658831 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658783 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 19:20:36.658831 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658786 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 19:20:36.658831 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658790 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 19:20:36.658831 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658792 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 19:20:36.658831 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658795 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 19:20:36.658831 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658798 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 19:20:36.658831 ip-10-0-129-98 
kubenswrapper[2564]: W0420 19:20:36.658801 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 19:20:36.658831 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658803 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 19:20:36.658831 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658806 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 19:20:36.658831 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658822 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 19:20:36.658831 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658825 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 19:20:36.658831 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658828 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 19:20:36.658831 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658830 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 19:20:36.658831 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658833 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 19:20:36.658831 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658836 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 19:20:36.658831 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658839 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 19:20:36.659323 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658842 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 19:20:36.659323 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658845 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 19:20:36.659323 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658848 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 19:20:36.659323 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658851 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 19:20:36.659323 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658853 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 19:20:36.659323 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658856 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 19:20:36.659323 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658860 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 19:20:36.659323 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658863 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 19:20:36.659323 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658867 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 19:20:36.659323 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658869 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 19:20:36.659323 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658872 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 19:20:36.659323 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658877 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 19:20:36.659323 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658881 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 19:20:36.659323 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658885 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 19:20:36.659323 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658888 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 19:20:36.659323 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658890 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 19:20:36.659323 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658893 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 19:20:36.659323 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658895 2564 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 19:20:36.659323 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658897 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 19:20:36.659774 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658900 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 19:20:36.659774 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658902 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 19:20:36.659774 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658905 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 19:20:36.659774 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658907 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 19:20:36.659774 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658909 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 19:20:36.659774 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658912 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 19:20:36.659774 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658915 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 19:20:36.659774 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658922 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 19:20:36.659774 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658925 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 19:20:36.659774 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658928 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 19:20:36.659774 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658930 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 19:20:36.659774 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658932 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 19:20:36.659774 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658935 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 19:20:36.659774 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658937 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 19:20:36.659774 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658941 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 19:20:36.659774 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658944 
2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 19:20:36.659774 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658946 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 19:20:36.659774 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658949 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 19:20:36.659774 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658951 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 19:20:36.659774 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658954 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 19:20:36.660318 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658957 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 19:20:36.660318 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658959 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 19:20:36.660318 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658962 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 19:20:36.660318 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658964 2564 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 19:20:36.660318 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658967 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 19:20:36.660318 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658970 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 19:20:36.660318 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658972 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 19:20:36.660318 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658974 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 19:20:36.660318 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658977 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 19:20:36.660318 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658979 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 19:20:36.660318 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658982 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 19:20:36.660318 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658986 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 19:20:36.660318 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.658988 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 19:20:36.660318 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659002 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 19:20:36.660318 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659006 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 19:20:36.660318 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659011 2564 feature_gate.go:328] unrecognized feature gate: Example Apr 20 19:20:36.660318 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659015 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 19:20:36.660318 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659018 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 19:20:36.660318 ip-10-0-129-98 
kubenswrapper[2564]: W0420 19:20:36.659023 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 19:20:36.660318 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659027 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 19:20:36.660797 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659037 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 19:20:36.660797 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659040 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 19:20:36.660797 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659043 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 19:20:36.660797 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659045 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 19:20:36.660797 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659048 2564 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 19:20:36.660797 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659465 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 19:20:36.660797 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659470 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 19:20:36.660797 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659473 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 19:20:36.660797 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659475 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 19:20:36.660797 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659478 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 19:20:36.660797 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659480 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 19:20:36.660797 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659483 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 19:20:36.660797 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659485 2564 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 19:20:36.660797 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659488 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 19:20:36.660797 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659490 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 19:20:36.660797 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659493 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 19:20:36.660797 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659497 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 19:20:36.660797 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659500 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 19:20:36.660797 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659503 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 19:20:36.660797 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659506 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 19:20:36.661415 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659508 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 19:20:36.661415 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659511 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 19:20:36.661415 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659513 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 19:20:36.661415 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659516 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 19:20:36.661415 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659519 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 19:20:36.661415 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659521 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 19:20:36.661415 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659524 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 19:20:36.661415 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659526 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 19:20:36.661415 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659529 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 19:20:36.661415 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659532 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 19:20:36.661415 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659534 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 19:20:36.661415 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659537 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 19:20:36.661415 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659545 2564 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 19:20:36.661415 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659548 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 19:20:36.661415 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659551 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 19:20:36.661415 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659554 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 19:20:36.661415 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659556 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 19:20:36.661415 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659558 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 19:20:36.661415 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659561 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 19:20:36.661415 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659564 2564 feature_gate.go:328] 
unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 19:20:36.662210 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659566 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 19:20:36.662210 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659569 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 19:20:36.662210 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659571 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 19:20:36.662210 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659574 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 19:20:36.662210 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659576 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 19:20:36.662210 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659579 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 19:20:36.662210 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659581 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 19:20:36.662210 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659583 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 19:20:36.662210 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659586 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 19:20:36.662210 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659588 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 19:20:36.662210 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659591 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 19:20:36.662210 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659593 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 19:20:36.662210 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659596 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 19:20:36.662210 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659598 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 19:20:36.662210 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659601 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 19:20:36.662210 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659604 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 19:20:36.662210 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659606 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 19:20:36.662210 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659608 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 19:20:36.662210 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659611 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 19:20:36.662210 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659613 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 19:20:36.662706 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659616 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 19:20:36.662706 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659618 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 19:20:36.662706 
ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659621 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 19:20:36.662706 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659623 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 19:20:36.662706 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659625 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 19:20:36.662706 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659633 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 19:20:36.662706 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659637 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 19:20:36.662706 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659641 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 19:20:36.662706 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659645 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 19:20:36.662706 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659648 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 19:20:36.662706 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659650 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 19:20:36.662706 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659653 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 19:20:36.662706 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659656 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 19:20:36.662706 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659658 2564 feature_gate.go:328] unrecognized feature gate: Example Apr 20 19:20:36.662706 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659661 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 19:20:36.662706 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659664 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 19:20:36.662706 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659666 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 19:20:36.662706 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659668 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 19:20:36.662706 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659671 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 19:20:36.663198 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659673 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 19:20:36.663198 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659679 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 19:20:36.663198 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659682 2564 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 19:20:36.663198 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659684 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 19:20:36.663198 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659687 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 19:20:36.663198 ip-10-0-129-98 
kubenswrapper[2564]: W0420 19:20:36.659689 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 19:20:36.663198 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659692 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 19:20:36.663198 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659694 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 19:20:36.663198 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659697 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 19:20:36.663198 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659700 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 19:20:36.663198 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659702 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 19:20:36.663198 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.659704 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 19:20:36.663198 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660365 2564 flags.go:64] FLAG: --address="0.0.0.0" Apr 20 19:20:36.663198 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660374 2564 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 20 19:20:36.663198 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660384 2564 flags.go:64] FLAG: --anonymous-auth="true" Apr 20 19:20:36.663198 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660389 2564 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 20 19:20:36.663198 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660393 2564 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 20 19:20:36.663198 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660396 2564 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 20 19:20:36.663198 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660401 2564 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 20 19:20:36.663198 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660412 2564 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 20 19:20:36.663198 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660416 2564 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 20 19:20:36.663717 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660419 2564 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 20 19:20:36.663717 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660423 2564 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 20 19:20:36.663717 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660426 2564 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 20 19:20:36.663717 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660429 2564 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 20 19:20:36.663717 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660432 2564 flags.go:64] FLAG: --cgroup-root="" Apr 20 19:20:36.663717 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660435 2564 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 20 19:20:36.663717 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660438 2564 flags.go:64] FLAG: --client-ca-file="" Apr 20 19:20:36.663717 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660441 2564 flags.go:64] FLAG: --cloud-config="" Apr 20 19:20:36.663717 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660444 2564 flags.go:64] FLAG: --cloud-provider="external" Apr 20 19:20:36.663717 
ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660447 2564 flags.go:64] FLAG: --cluster-dns="[]" Apr 20 19:20:36.663717 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660453 2564 flags.go:64] FLAG: --cluster-domain="" Apr 20 19:20:36.663717 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660456 2564 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 20 19:20:36.663717 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660459 2564 flags.go:64] FLAG: --config-dir="" Apr 20 19:20:36.663717 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660462 2564 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 20 19:20:36.663717 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660465 2564 flags.go:64] FLAG: --container-log-max-files="5" Apr 20 19:20:36.663717 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660469 2564 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 20 19:20:36.663717 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660472 2564 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 20 19:20:36.663717 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660475 2564 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 20 19:20:36.663717 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660478 2564 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 20 19:20:36.663717 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660481 2564 flags.go:64] FLAG: --contention-profiling="false" Apr 20 19:20:36.663717 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660484 2564 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 20 19:20:36.663717 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660487 2564 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 20 19:20:36.663717 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660490 2564 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 20 19:20:36.663717 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660493 2564 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 20 19:20:36.663717 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660497 2564 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 20 19:20:36.664344 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660500 2564 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 20 19:20:36.664344 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660503 2564 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 20 19:20:36.664344 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660506 2564 flags.go:64] FLAG: --enable-load-reader="false" Apr 20 19:20:36.664344 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660509 2564 flags.go:64] FLAG: --enable-server="true" Apr 20 19:20:36.664344 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660512 2564 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 20 19:20:36.664344 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660520 2564 flags.go:64] FLAG: --event-burst="100" Apr 20 19:20:36.664344 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660530 2564 flags.go:64] FLAG: --event-qps="50" Apr 20 19:20:36.664344 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660533 2564 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 20 19:20:36.664344 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660537 2564 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 20 19:20:36.664344 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660540 2564 flags.go:64] FLAG: --eviction-hard="" Apr 20 
19:20:36.664344 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660544 2564 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 20 19:20:36.664344 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660547 2564 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 20 19:20:36.664344 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660550 2564 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 20 19:20:36.664344 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660553 2564 flags.go:64] FLAG: --eviction-soft="" Apr 20 19:20:36.664344 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660556 2564 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 20 19:20:36.664344 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660559 2564 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 20 19:20:36.664344 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660561 2564 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 20 19:20:36.664344 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660564 2564 flags.go:64] FLAG: --experimental-mounter-path="" Apr 20 19:20:36.664344 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660567 2564 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 19:20:36.664344 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660570 2564 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 19:20:36.664344 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660573 2564 flags.go:64] FLAG: --feature-gates="" Apr 20 19:20:36.664344 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660577 2564 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 19:20:36.664344 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660580 2564 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 19:20:36.664344 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660583 2564 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 19:20:36.664344 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660587 2564 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 19:20:36.664958 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660590 2564 flags.go:64] FLAG: --healthz-port="10248" Apr 20 19:20:36.664958 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660606 2564 flags.go:64] FLAG: --help="false" Apr 20 19:20:36.664958 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660609 2564 flags.go:64] FLAG: --hostname-override="ip-10-0-129-98.ec2.internal" Apr 20 19:20:36.664958 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660613 2564 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 19:20:36.664958 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660616 2564 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 19:20:36.664958 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660619 2564 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 19:20:36.664958 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660622 2564 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 19:20:36.664958 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660626 2564 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 19:20:36.664958 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660629 2564 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 19:20:36.664958 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660632 2564 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 19:20:36.664958 
ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660635 2564 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 19:20:36.664958 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660637 2564 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 19:20:36.664958 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660640 2564 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 19:20:36.664958 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660643 2564 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 19:20:36.664958 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660651 2564 flags.go:64] FLAG: --kube-reserved="" Apr 20 19:20:36.664958 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660655 2564 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 19:20:36.664958 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660658 2564 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 19:20:36.664958 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660661 2564 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 19:20:36.664958 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660664 2564 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 19:20:36.664958 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660667 2564 flags.go:64] FLAG: --lock-file="" Apr 20 19:20:36.664958 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660670 2564 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 19:20:36.664958 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660672 2564 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 19:20:36.664958 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660675 2564 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 19:20:36.664958 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660685 2564 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 19:20:36.665604 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660688 2564 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 19:20:36.665604 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660691 2564 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 19:20:36.665604 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660694 2564 flags.go:64] FLAG: --logging-format="text" Apr 20 19:20:36.665604 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660697 2564 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 19:20:36.665604 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660700 2564 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 19:20:36.665604 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660703 2564 flags.go:64] FLAG: --manifest-url="" Apr 20 19:20:36.665604 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660706 2564 flags.go:64] FLAG: --manifest-url-header="" Apr 20 19:20:36.665604 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660711 2564 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 19:20:36.665604 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660714 2564 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 19:20:36.665604 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660718 2564 flags.go:64] FLAG: --max-pods="110" Apr 20 19:20:36.665604 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660721 2564 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 19:20:36.665604 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660724 2564 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 19:20:36.665604 ip-10-0-129-98 
kubenswrapper[2564]: I0420 19:20:36.660727 2564 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 19:20:36.665604 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660730 2564 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 19:20:36.665604 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660733 2564 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 19:20:36.665604 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660736 2564 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 19:20:36.665604 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660738 2564 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 19:20:36.665604 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660746 2564 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 19:20:36.665604 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660749 2564 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 19:20:36.665604 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660752 2564 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 19:20:36.665604 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660755 2564 flags.go:64] FLAG: --pod-cidr="" Apr 20 19:20:36.665604 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660758 2564 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 19:20:36.665604 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660764 2564 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 19:20:36.666165 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660772 2564 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 19:20:36.666165 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660776 2564 flags.go:64] FLAG: --pods-per-core="0" Apr 20 19:20:36.666165 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660779 2564 flags.go:64] FLAG: --port="10250" Apr 20 19:20:36.666165 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660782 2564 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 19:20:36.666165 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660785 2564 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e079c5cdd7f9835c" Apr 20 19:20:36.666165 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660788 2564 flags.go:64] FLAG: --qos-reserved="" Apr 20 19:20:36.666165 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660791 2564 flags.go:64] FLAG: --read-only-port="10255" Apr 20 19:20:36.666165 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660794 2564 flags.go:64] FLAG: --register-node="true" Apr 20 19:20:36.666165 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660797 2564 flags.go:64] FLAG: --register-schedulable="true" Apr 20 19:20:36.666165 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660800 2564 flags.go:64] FLAG: --register-with-taints="" Apr 20 19:20:36.666165 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660804 2564 flags.go:64] FLAG: --registry-burst="10" Apr 20 19:20:36.666165 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660807 2564 flags.go:64] FLAG: --registry-qps="5" Apr 20 19:20:36.666165 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660810 2564 flags.go:64] FLAG: --reserved-cpus="" Apr 20 19:20:36.666165 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660813 2564 flags.go:64] FLAG: --reserved-memory="" Apr 20 19:20:36.666165 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660817 2564 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 
20 19:20:36.666165 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660820 2564 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 19:20:36.666165 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660823 2564 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 19:20:36.666165 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660826 2564 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 19:20:36.666165 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660828 2564 flags.go:64] FLAG: --runonce="false" Apr 20 19:20:36.666165 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660831 2564 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 19:20:36.666165 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660834 2564 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 19:20:36.666165 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660837 2564 flags.go:64] FLAG: --seccomp-default="false" Apr 20 19:20:36.666165 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660840 2564 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 19:20:36.666165 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660843 2564 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 19:20:36.666165 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660846 2564 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 19:20:36.666165 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660849 2564 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 19:20:36.666781 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660852 2564 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 19:20:36.666781 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660854 2564 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 19:20:36.666781 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660857 2564 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 19:20:36.666781 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660860 2564 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 19:20:36.666781 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660863 2564 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 19:20:36.666781 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660866 2564 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 19:20:36.666781 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660870 2564 flags.go:64] FLAG: --system-cgroups="" Apr 20 19:20:36.666781 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660879 2564 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 19:20:36.666781 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660884 2564 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 19:20:36.666781 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660887 2564 flags.go:64] FLAG: --tls-cert-file="" Apr 20 19:20:36.666781 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660890 2564 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 19:20:36.666781 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660896 2564 flags.go:64] FLAG: --tls-min-version="" Apr 20 19:20:36.666781 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660899 2564 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 19:20:36.666781 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660902 2564 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 19:20:36.666781 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660905 2564 flags.go:64] FLAG: --topology-manager-policy-options="" 
Apr 20 19:20:36.666781 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660918 2564 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 19:20:36.666781 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660921 2564 flags.go:64] FLAG: --v="2" Apr 20 19:20:36.666781 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660925 2564 flags.go:64] FLAG: --version="false" Apr 20 19:20:36.666781 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660929 2564 flags.go:64] FLAG: --vmodule="" Apr 20 19:20:36.666781 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660938 2564 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 19:20:36.666781 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.660942 2564 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 19:20:36.666781 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661068 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 19:20:36.666781 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661072 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 19:20:36.666781 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661076 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 19:20:36.667374 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661080 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 19:20:36.667374 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661083 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 19:20:36.667374 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661086 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 19:20:36.667374 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661092 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 19:20:36.667374 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661095 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 19:20:36.667374 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661098 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 19:20:36.667374 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661100 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 19:20:36.667374 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661103 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 19:20:36.667374 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661106 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 19:20:36.667374 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661109 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 19:20:36.667374 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661111 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 19:20:36.667374 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661114 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 19:20:36.667374 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661117 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 19:20:36.667374 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661119 2564 feature_gate.go:328] unrecognized feature gate: 
CPMSMachineNamePrefix Apr 20 19:20:36.667374 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661123 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 19:20:36.667374 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661126 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 19:20:36.667374 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661134 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 19:20:36.667374 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661137 2564 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 19:20:36.667374 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661140 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 19:20:36.667864 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661142 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 19:20:36.667864 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661145 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 19:20:36.667864 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661147 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 19:20:36.667864 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661150 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 19:20:36.667864 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661153 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 19:20:36.667864 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661155 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 19:20:36.667864 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661158 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 19:20:36.667864 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661161 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 19:20:36.667864 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661163 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 19:20:36.667864 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661165 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 19:20:36.667864 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661168 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 19:20:36.667864 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661170 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 19:20:36.667864 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661173 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 19:20:36.667864 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661176 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 19:20:36.667864 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661178 2564 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 19:20:36.667864 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661182 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 19:20:36.667864 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661187 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 19:20:36.667864 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661190 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 19:20:36.667864 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661193 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 19:20:36.667864 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661196 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 19:20:36.668367 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661198 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 19:20:36.668367 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661201 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 19:20:36.668367 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661203 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 19:20:36.668367 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661206 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 19:20:36.668367 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661209 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 19:20:36.668367 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661211 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 19:20:36.668367 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661213 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 19:20:36.668367 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661218 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 19:20:36.668367 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661220 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 19:20:36.668367 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661223 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 19:20:36.668367 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661232 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 19:20:36.668367 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661234 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 19:20:36.668367 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661237 2564 feature_gate.go:328] unrecognized feature gate: Example Apr 20 19:20:36.668367 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661239 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 19:20:36.668367 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661242 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 19:20:36.668367 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661244 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 19:20:36.668367 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661247 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 19:20:36.668367 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661249 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 19:20:36.668367 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661252 2564 
feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 19:20:36.668367 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661254 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 19:20:36.668892 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661257 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 19:20:36.668892 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661259 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 19:20:36.668892 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661262 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 19:20:36.668892 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661264 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 19:20:36.668892 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661267 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 19:20:36.668892 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661269 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 19:20:36.668892 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661272 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 19:20:36.668892 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661274 2564 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 19:20:36.668892 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661278 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 19:20:36.668892 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661280 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 19:20:36.668892 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661283 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 19:20:36.668892 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661286 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 19:20:36.668892 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661288 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 19:20:36.668892 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661291 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 19:20:36.668892 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661293 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 19:20:36.668892 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661296 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 19:20:36.668892 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661299 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 19:20:36.668892 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661301 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 19:20:36.668892 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661304 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 19:20:36.668892 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661310 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 19:20:36.669398 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661313 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 19:20:36.669398 ip-10-0-129-98 
kubenswrapper[2564]: W0420 19:20:36.661316 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 19:20:36.669398 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661318 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 19:20:36.669398 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.661326 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 19:20:36.669398 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.661335 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 19:20:36.670300 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.670279 2564 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 20 19:20:36.670300 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.670300 2564 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 20 19:20:36.670364 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670349 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 19:20:36.670364 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670354 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 19:20:36.670364 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670357 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 19:20:36.670364 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670361 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 19:20:36.670364 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670363 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 19:20:36.670492 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670368 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
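The `feature_gate.go:384` line above is the authoritative record of what this kubelet actually resolved: a flat `map[Name:bool ...]` of the Kubernetes-level gates it recognizes. When triaging a node, it can be handy to pull that dump apart mechanically rather than by eye. A minimal sketch follows; `parseGateDump` is a hypothetical helper written for this note (not a kubelet API), and the input string is abbreviated from the log line above.

```go
// parse_gates.go - parse a kubelet "feature gates: {map[...]}" dump line.
package main

import (
	"fmt"
	"strconv"
	"strings"
)

// parseGateDump extracts Name:bool pairs from the map dump format that
// feature_gate.go:384 logs, e.g. "{map[ImageVolume:true KMSv1:true]}".
func parseGateDump(s string) (map[string]bool, error) {
	s = strings.TrimSpace(s)
	s = strings.TrimPrefix(s, "{map[")
	s = strings.TrimSuffix(s, "]}")
	gates := make(map[string]bool)
	for _, field := range strings.Fields(s) {
		name, val, ok := strings.Cut(field, ":")
		if !ok {
			return nil, fmt.Errorf("malformed entry %q", field)
		}
		b, err := strconv.ParseBool(val)
		if err != nil {
			return nil, fmt.Errorf("entry %q: %w", field, err)
		}
		gates[name] = b
	}
	return gates, nil
}

func main() {
	// Abbreviated from the feature_gate.go:384 line above.
	dump := "{map[DynamicResourceAllocation:false ImageVolume:true KMSv1:true NodeSwap:false UserNamespacesSupport:true]}"
	gates, err := parseGateDump(dump)
	if err != nil {
		panic(err)
	}
	fmt.Println("KMSv1 enabled:", gates["KMSv1"])
	fmt.Println("total gates parsed:", len(gates))
}
```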
Apr 20 19:20:36.670492 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670372 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 19:20:36.670492 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670374 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 19:20:36.670492 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670377 2564 feature_gate.go:328] unrecognized feature gate: Example Apr 20 19:20:36.670492 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670380 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 19:20:36.670492 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670383 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 19:20:36.670492 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670385 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 19:20:36.670492 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670388 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 19:20:36.670492 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670391 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 19:20:36.670492 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670393 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 19:20:36.670492 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670396 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 19:20:36.670492 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670398 2564 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 19:20:36.670492 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670401 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 19:20:36.670492 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670404 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 19:20:36.670492 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670406 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 19:20:36.670492 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670408 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 19:20:36.670492 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670411 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 19:20:36.670492 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670413 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 19:20:36.670492 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670416 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 19:20:36.670492 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670419 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 19:20:36.670978 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670421 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 19:20:36.670978 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670424 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 19:20:36.670978 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670427 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 19:20:36.670978 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670429 2564 feature_gate.go:328] unrecognized feature 
gate: GCPCustomAPIEndpoints Apr 20 19:20:36.670978 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670432 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 19:20:36.670978 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670435 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 19:20:36.670978 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670438 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 19:20:36.670978 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670440 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 19:20:36.670978 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670443 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 19:20:36.670978 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670445 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 19:20:36.670978 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670447 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 19:20:36.670978 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670450 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 19:20:36.670978 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670453 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 19:20:36.670978 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670456 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 19:20:36.670978 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670459 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 19:20:36.670978 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670462 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 19:20:36.670978 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670464 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 19:20:36.670978 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670467 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 19:20:36.670978 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670469 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 19:20:36.671455 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670472 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 19:20:36.671455 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670474 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 19:20:36.671455 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670477 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 19:20:36.671455 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670480 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 19:20:36.671455 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670482 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 19:20:36.671455 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670485 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 19:20:36.671455 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670487 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 
19:20:36.671455 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670489 2564 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 19:20:36.671455 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670492 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 19:20:36.671455 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670495 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 19:20:36.671455 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670497 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 19:20:36.671455 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670500 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 19:20:36.671455 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670503 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 19:20:36.671455 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670505 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 19:20:36.671455 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670508 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 19:20:36.671455 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670510 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 19:20:36.671455 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670512 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 19:20:36.671455 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670515 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 19:20:36.671455 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670517 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 19:20:36.671455 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670520 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 19:20:36.671931 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670524 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
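The long runs of `unrecognized feature gate` warnings above are OpenShift cluster-level gates (GatewayAPI, PinnedImages, ClusterAPIInstall, and so on) being handed to the kubelet's component gate set, which only registers Kubernetes gates. As the log shows, each unknown name produces a warning and is skipped; startup continues. The sketch below only illustrates that observed warn-and-skip behavior; it is not the k8s.io/component-base implementation, and the gate names and defaults are placeholders.

```go
package main

import "fmt"

// known mirrors a component's registered gates; anything else is warned
// about and ignored, which is why the log shows long runs of
// "unrecognized feature gate" without the kubelet failing to start.
var known = map[string]bool{
	"ImageVolume": true,
	"NodeSwap":    false,
}

func setFromMap(m map[string]bool) {
	for name, val := range m {
		if _, ok := known[name]; !ok {
			fmt.Printf("W] unrecognized feature gate: %s\n", name)
			continue // skipped, not fatal
		}
		known[name] = val
	}
}

func main() {
	setFromMap(map[string]bool{
		"GatewayAPI":  true, // OpenShift-level gate: unknown here
		"ImageVolume": true, // Kubernetes gate: applied
	})
	fmt.Println(known)
}
```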
Apr 20 19:20:36.671931 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670529 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 19:20:36.671931 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670532 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 19:20:36.671931 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670535 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 19:20:36.671931 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670539 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 19:20:36.671931 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670542 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 19:20:36.671931 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670545 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 19:20:36.671931 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670548 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 19:20:36.671931 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670550 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 19:20:36.671931 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670553 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 19:20:36.671931 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670555 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 19:20:36.671931 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670558 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 19:20:36.671931 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670560 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 19:20:36.671931 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670563 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 19:20:36.671931 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670565 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 19:20:36.671931 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670568 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 19:20:36.671931 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670570 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 19:20:36.671931 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670573 2564 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 19:20:36.671931 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670576 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 19:20:36.671931 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670578 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 19:20:36.672436 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670580 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 19:20:36.672436 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670583 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 19:20:36.672436 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.670588 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false 
MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 19:20:36.672436 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670678 2564 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 19:20:36.672436 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670683 2564 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 19:20:36.672436 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670686 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 19:20:36.672436 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670689 2564 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 19:20:36.672436 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670691 2564 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 19:20:36.672436 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670694 2564 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 19:20:36.672436 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670697 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 19:20:36.672436 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670700 2564 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 19:20:36.672436 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670703 2564 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 19:20:36.672436 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670705 2564 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 19:20:36.672436 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670708 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 19:20:36.672436 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670711 2564 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 19:20:36.672802 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670713 2564 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 19:20:36.672802 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670716 2564 feature_gate.go:328] unrecognized feature gate: Example Apr 20 19:20:36.672802 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670718 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 19:20:36.672802 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670721 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 19:20:36.672802 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670724 2564 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 19:20:36.672802 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670727 2564 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 19:20:36.672802 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670729 2564 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 19:20:36.672802 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670732 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 19:20:36.672802 ip-10-0-129-98 
kubenswrapper[2564]: W0420 19:20:36.670734 2564 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 19:20:36.672802 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670737 2564 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 19:20:36.672802 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670740 2564 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 19:20:36.672802 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670742 2564 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 19:20:36.672802 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670745 2564 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 19:20:36.672802 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670747 2564 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 19:20:36.672802 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670750 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 19:20:36.672802 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670752 2564 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 19:20:36.672802 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670755 2564 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 19:20:36.672802 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670757 2564 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 19:20:36.672802 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670760 2564 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 19:20:36.672802 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670762 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 19:20:36.673314 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670765 2564 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 19:20:36.673314 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670767 2564 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 19:20:36.673314 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670770 2564 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 19:20:36.673314 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670772 2564 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 19:20:36.673314 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670774 2564 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 19:20:36.673314 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670777 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 19:20:36.673314 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670779 2564 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 19:20:36.673314 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670782 2564 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 19:20:36.673314 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670784 2564 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 19:20:36.673314 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670786 2564 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 19:20:36.673314 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670789 2564 feature_gate.go:328] unrecognized 
feature gate: InsightsConfigAPI Apr 20 19:20:36.673314 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670791 2564 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 19:20:36.673314 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670794 2564 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 19:20:36.673314 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670797 2564 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 19:20:36.673314 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670800 2564 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 19:20:36.673314 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670802 2564 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 19:20:36.673314 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670805 2564 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 19:20:36.673314 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670808 2564 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 19:20:36.673314 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670810 2564 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 19:20:36.673314 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670813 2564 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 19:20:36.673806 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670815 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 19:20:36.673806 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670818 2564 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 19:20:36.673806 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670821 2564 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 19:20:36.673806 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670824 2564 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 19:20:36.673806 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670826 2564 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 19:20:36.673806 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670829 2564 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 19:20:36.673806 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670831 2564 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 19:20:36.673806 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670834 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 19:20:36.673806 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670836 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 19:20:36.673806 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670839 2564 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 19:20:36.673806 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670841 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 19:20:36.673806 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670844 2564 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 19:20:36.673806 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670846 2564 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 19:20:36.673806 ip-10-0-129-98 kubenswrapper[2564]: 
W0420 19:20:36.670849 2564 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 19:20:36.673806 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670851 2564 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 19:20:36.673806 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670854 2564 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 19:20:36.673806 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670856 2564 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 19:20:36.673806 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670859 2564 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 19:20:36.673806 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670862 2564 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 19:20:36.673806 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670864 2564 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 19:20:36.674312 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670867 2564 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 19:20:36.674312 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670869 2564 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 19:20:36.674312 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670872 2564 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 19:20:36.674312 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670874 2564 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 19:20:36.674312 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670877 2564 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 19:20:36.674312 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670881 2564 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 19:20:36.674312 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670884 2564 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 19:20:36.674312 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670887 2564 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 19:20:36.674312 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670890 2564 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 19:20:36.674312 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670893 2564 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 19:20:36.674312 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670896 2564 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
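Two other warning shapes recur in this run: `feature_gate.go:349` (setting the deprecated gate KMSv1=true) and `feature_gate.go:351` (setting the GA gate ServiceAccountTokenNodeBinding=true). Unlike the unrecognized-gate case, these names are known; the gate is applied but flagged for removal in a future release. A stage-aware sketch of that distinction, loosely modelled on the log lines above (the stage assignments here are illustrative, not the real registry):

```go
package main

import "fmt"

type stage int

const (
	alpha stage = iota
	beta
	ga
	deprecated
)

// specs pairs a gate with its lifecycle stage; placeholder data only.
var specs = map[string]stage{
	"KMSv1":                          deprecated,
	"ServiceAccountTokenNodeBinding": ga,
	"NodeSwap":                       beta,
}

func set(name string, value bool) {
	switch specs[name] {
	case deprecated:
		fmt.Printf("W] Setting deprecated feature gate %s=%v. It will be removed in a future release.\n", name, value)
	case ga:
		fmt.Printf("W] Setting GA feature gate %s=%v. It will be removed in a future release.\n", name, value)
	}
	// the value would be recorded here in every case
}

func main() {
	set("KMSv1", true)
	set("ServiceAccountTokenNodeBinding", true)
	set("NodeSwap", false) // beta: set silently
}
```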
Apr 20 19:20:36.674312 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670900 2564 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:20:36.674312 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670903 2564 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:20:36.674312 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:36.670906 2564 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:20:36.674312 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.670911 2564 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 19:20:36.674671 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.671551 2564 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 19:20:36.674671 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.674274 2564 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 19:20:36.675093 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.675082 2564 server.go:1019] "Starting client certificate rotation"
Apr 20 19:20:36.675220 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.675203 2564 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 19:20:36.675252 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.675243 2564 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 19:20:36.697021 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.696985 2564 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 19:20:36.698663 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.698643 2564 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 19:20:36.710231 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.710212 2564 log.go:25] "Validated CRI v1 runtime API"
Apr 20 19:20:36.715977 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.715961 2564 log.go:25] "Validated CRI v1 image API"
Apr 20 19:20:36.717242 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.717209 2564 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 19:20:36.719438 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.719420 2564 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 886029f6-4b4d-41d6-a794-678e07945204:/dev/nvme0n1p4 8ca925a9-54bb-47b6-9071-c4f1de33392d:/dev/nvme0n1p3]
Apr 20 19:20:36.719507 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.719437 2564 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 19:20:36.724162 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.724049 2564 manager.go:217] Machine: {Timestamp:2026-04-20 19:20:36.723047644 +0000 UTC m=+0.342729922 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3102623 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2f51897e0fe887ed4daf03e91c6d27 SystemUUID:ec2f5189-7e0f-e887-ed4d-af03e91c6d27 BootID:444f83dd-484f-4196-b1ea-8410599910cf Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:63:86:5b:de:d1 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:63:86:5b:de:d1 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:32:fe:23:6d:83:6a Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 19:20:36.724803 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.724793 2564 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 19:20:36.724883 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.724871 2564 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 19:20:36.725863 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.725840 2564 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 19:20:36.726004 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.725866 2564 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-129-98.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 19:20:36.726057 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.726015 2564 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 19:20:36.726057 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.726025 2564 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 19:20:36.726057 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.726038 2564 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 19:20:36.726057 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.726053 2564 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 19:20:36.727369 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.727359 2564 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 19:20:36.727600 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.727590 2564 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 19:20:36.729048 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.729032 2564 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
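The nodeConfig entry above pins down the resource math for this node: MemoryCapacity is 33164488704 bytes (from the Machine line), SystemReserved memory is 1Gi, KubeReserved is null, and the memory.available hard eviction threshold is 100Mi. Allocatable memory is capacity minus the reservations and the hard eviction threshold, per the standard node-allocatable formula. A quick sketch reproducing the arithmetic with those logged values:

```go
package main

import "fmt"

// Values taken from the Machine and nodeConfig log lines above.
// KubeReserved is null in this config, so it contributes nothing.
const (
	capacityBytes       = 33164488704 // MemoryCapacity (Machine line)
	systemReservedBytes = 1 << 30     // SystemReserved "memory":"1Gi"
	evictionHardBytes   = 100 << 20   // memory.available "100Mi"
)

func main() {
	allocatable := capacityBytes - systemReservedBytes - evictionHardBytes
	fmt.Printf("allocatable memory: %d bytes (%.2f GiB)\n",
		allocatable, float64(allocatable)/float64(1<<30))
	// prints roughly 29.79 GiB for this node
}
```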
Apr 20 19:20:36.729829 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.729818 2564 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 19:20:36.729869 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.729833 2564 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 19:20:36.730429 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.730420 2564 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 19:20:36.730461 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.730433 2564 kubelet.go:397] "Adding apiserver pod source"
Apr 20 19:20:36.730461 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.730442 2564 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 19:20:36.731415 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.731404 2564 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 19:20:36.731462 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.731422 2564 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 19:20:36.734248 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.734232 2564 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 19:20:36.735529 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.735516 2564 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 20 19:20:36.736769 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.736757 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 20 19:20:36.736819 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.736774 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 20 19:20:36.736819 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.736781 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 20 19:20:36.736819 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.736787 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 20 19:20:36.736819 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.736793 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 20 19:20:36.736819 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.736799 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 20 19:20:36.736819 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.736804 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 20 19:20:36.736819 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.736810 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 20 19:20:36.736819 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.736818 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 20 19:20:36.737047 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.736824 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 20 19:20:36.737047 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.736841 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 20 19:20:36.737047 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.736850 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 20 19:20:36.737577 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.737566 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 20 19:20:36.737577 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.737576 2564 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 20 19:20:36.740848 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.740834 2564 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 20 19:20:36.740931 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.740869 2564 server.go:1295] "Started kubelet"
Apr 20 19:20:36.741528 ip-10-0-129-98 systemd[1]: Started Kubernetes Kubelet.
Apr 20 19:20:36.741644 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.741511 2564 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 20 19:20:36.741644 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.741591 2564 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 20 19:20:36.741644 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.741538 2564 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 20 19:20:36.742730 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.742710 2564 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 20 19:20:36.743140 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:36.743117 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-129-98.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 20 19:20:36.743233 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:36.743214 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 20 19:20:36.743233 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.743225 2564 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-129-98.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 19:20:36.744219 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.744204 2564 server.go:317] "Adding debug handlers to kubelet server"
Apr 20 19:20:36.748642 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.748620 2564 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 20 19:20:36.749048 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.749032 2564 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 19:20:36.750765 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.750625 2564 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 19:20:36.750765 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.750649 2564 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 19:20:36.750765 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.750732 2564 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 19:20:36.750935 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.750775 2564 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 19:20:36.750935 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.750783 2564 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 19:20:36.751032 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:36.750952 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-98.ec2.internal\" not found"
Apr 20 19:20:36.751076 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:36.751059 2564 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 20 19:20:36.751359 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.751344 2564 factory.go:153] Registering CRI-O factory
Apr 20 19:20:36.751455 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.751369 2564 factory.go:223] Registration of the crio container factory successfully
Apr 20 19:20:36.751455 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.751419 2564 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 19:20:36.751455 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.751428 2564 factory.go:55] Registering systemd factory
Apr 20 19:20:36.751455 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.751433 2564 factory.go:223] Registration of the systemd container factory successfully
Apr 20 19:20:36.751455 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.751449 2564 factory.go:103] Registering Raw factory
Apr 20 19:20:36.751650 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.751461 2564 manager.go:1196] Started watching for new ooms in manager
Apr 20 19:20:36.751650 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:36.750477 2564 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-98.ec2.internal.18a826e95dd09191 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-98.ec2.internal,UID:ip-10-0-129-98.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-129-98.ec2.internal,},FirstTimestamp:2026-04-20 19:20:36.740845969 +0000 UTC m=+0.360528250,LastTimestamp:2026-04-20 19:20:36.740845969 +0000 UTC m=+0.360528250,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-98.ec2.internal,}"
Apr 20 19:20:36.752262 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.752248 2564 manager.go:319] Starting recovery of all containers
Apr 20 19:20:36.752777 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:36.752743 2564 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 19:20:36.752900 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:36.752879 2564 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-129-98.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 20 19:20:36.761105 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.760962 2564 manager.go:324] Recovery completed
Apr 20 19:20:36.765074 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.765060 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 19:20:36.767454 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.767431 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 19:20:36.767530 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.767465 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 19:20:36.767530 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.767475 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasSufficientPID"
Apr 20 19:20:36.767916 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.767904 2564 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 19:20:36.767916 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.767916 2564 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 19:20:36.767987 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.767930 2564 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 19:20:36.769541 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:36.769482 2564 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-98.ec2.internal.18a826e95f66902d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-98.ec2.internal,UID:ip-10-0-129-98.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-129-98.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-129-98.ec2.internal,},FirstTimestamp:2026-04-20 19:20:36.767453229 +0000 UTC m=+0.387135506,LastTimestamp:2026-04-20 19:20:36.767453229 +0000 UTC m=+0.387135506,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-98.ec2.internal,}"
Apr 20 19:20:36.769911 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.769900 2564 policy_none.go:49] "None policy: Start"
Apr 20 19:20:36.769953 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.769924 2564 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 19:20:36.769953 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.769934 2564 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 19:20:36.778478 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:36.778419 2564 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-98.ec2.internal.18a826e95f66d1f9 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-98.ec2.internal,UID:ip-10-0-129-98.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node ip-10-0-129-98.ec2.internal status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:ip-10-0-129-98.ec2.internal,},FirstTimestamp:2026-04-20 19:20:36.767470073 +0000 UTC m=+0.387152351,LastTimestamp:2026-04-20 19:20:36.767470073 +0000 UTC m=+0.387152351,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-98.ec2.internal,}"
Apr 20 19:20:36.786400 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:36.786342 2564 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-98.ec2.internal.18a826e95f66f458 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-98.ec2.internal,UID:ip-10-0-129-98.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node ip-10-0-129-98.ec2.internal status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:ip-10-0-129-98.ec2.internal,},FirstTimestamp:2026-04-20 19:20:36.767478872 +0000 UTC m=+0.387161150,LastTimestamp:2026-04-20 19:20:36.767478872 +0000 UTC m=+0.387161150,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-98.ec2.internal,}"
Apr 20 19:20:36.805536 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.805520 2564 manager.go:341] "Starting Device Plugin manager"
Apr 20 19:20:36.805602 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:36.805556 2564 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 19:20:36.805602 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.805568 2564 server.go:85] "Starting device plugin registration server"
Apr 20 19:20:36.805813 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.805798 2564 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 19:20:36.805855 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.805819 2564 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 19:20:36.805903 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.805891 2564 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 19:20:36.806291 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.805971 2564 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 19:20:36.806291 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.805984 2564 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 19:20:36.820986 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:36.806399 2564 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 19:20:36.820986 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:36.806438 2564 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-129-98.ec2.internal\" not found"
Apr 20 19:20:36.820986 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:36.819296 2564 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-129-98.ec2.internal.18a826e961cb64d8 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-129-98.ec2.internal,UID:ip-10-0-129-98.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:ip-10-0-129-98.ec2.internal,},FirstTimestamp:2026-04-20 19:20:36.807615704 +0000 UTC m=+0.427297972,LastTimestamp:2026-04-20 19:20:36.807615704 +0000 UTC m=+0.427297972,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-129-98.ec2.internal,}"
Apr 20 19:20:36.847011 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.846977 2564 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-b2ztv"
Apr 20 19:20:36.852936 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.852917 2564 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-b2ztv"
Apr 20 19:20:36.878574 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.878550 2564 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 19:20:36.879816 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.879792 2564 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 19:20:36.880372 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.880361 2564 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 19:20:36.880450 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.880387 2564 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 19:20:36.880450 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.880397 2564 kubelet.go:2451] "Starting kubelet main sync loop" Apr 20 19:20:36.880450 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:36.880432 2564 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 20 19:20:36.891133 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.891109 2564 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:20:36.906061 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.906023 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:20:36.906754 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.906733 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:20:36.906818 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.906757 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:20:36.906818 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.906767 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:20:36.906818 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.906787 2564 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-129-98.ec2.internal" Apr 20 19:20:36.918921 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.918904 2564 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-129-98.ec2.internal" Apr 20 19:20:36.918971 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:36.918922 2564 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-129-98.ec2.internal\": node \"ip-10-0-129-98.ec2.internal\" not found" Apr 20 19:20:36.936646 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:36.936625 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-98.ec2.internal\" not found" Apr 20 19:20:36.980841 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.980816 2564 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-129-98.ec2.internal"] Apr 20 19:20:36.980910 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.980875 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:20:36.987368 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.987351 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:20:36.987440 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.987376 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:20:36.987440 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.987390 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:20:36.988726 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.988715 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:20:36.988859 ip-10-0-129-98 kubenswrapper[2564]: I0420 
19:20:36.988846 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal" Apr 20 19:20:36.988897 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.988872 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:20:36.991252 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.991235 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:20:36.991252 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.991250 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:20:36.991403 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.991265 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:20:36.991403 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.991273 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:20:36.991403 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.991278 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:20:36.991403 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.991283 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:20:36.992459 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.992443 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-98.ec2.internal" Apr 20 19:20:36.992551 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.992464 2564 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 20 19:20:36.993595 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.993581 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasSufficientMemory" Apr 20 19:20:36.993682 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.993606 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasNoDiskPressure" Apr 20 19:20:36.993682 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:36.993617 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeHasSufficientPID" Apr 20 19:20:37.012589 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:37.012570 2564 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-98.ec2.internal\" not found" node="ip-10-0-129-98.ec2.internal" Apr 20 19:20:37.016908 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:37.016893 2564 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-129-98.ec2.internal\" not found" node="ip-10-0-129-98.ec2.internal" Apr 20 19:20:37.036692 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:37.036664 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-98.ec2.internal\" not found" Apr 20 19:20:37.051988 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:37.051963 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/32cc11a6fe1288d8e923d33bdeaf02c1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal\" (UID: \"32cc11a6fe1288d8e923d33bdeaf02c1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal" Apr 20 19:20:37.052054 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:37.052008 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/32cc11a6fe1288d8e923d33bdeaf02c1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal\" (UID: \"32cc11a6fe1288d8e923d33bdeaf02c1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal" Apr 20 19:20:37.052054 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:37.052033 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/87dc53c55f73620bf5df44e2826c141e-config\") pod \"kube-apiserver-proxy-ip-10-0-129-98.ec2.internal\" (UID: \"87dc53c55f73620bf5df44e2826c141e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-98.ec2.internal" Apr 20 19:20:37.137712 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:37.137693 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-98.ec2.internal\" not found" Apr 20 19:20:37.152588 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:37.152569 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/32cc11a6fe1288d8e923d33bdeaf02c1-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal\" (UID: \"32cc11a6fe1288d8e923d33bdeaf02c1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal" Apr 20 19:20:37.152644 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:37.152596 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/32cc11a6fe1288d8e923d33bdeaf02c1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal\" (UID: \"32cc11a6fe1288d8e923d33bdeaf02c1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal" Apr 20 19:20:37.152644 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:37.152612 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/87dc53c55f73620bf5df44e2826c141e-config\") pod \"kube-apiserver-proxy-ip-10-0-129-98.ec2.internal\" (UID: \"87dc53c55f73620bf5df44e2826c141e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-98.ec2.internal" Apr 20 19:20:37.152644 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:37.152643 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/87dc53c55f73620bf5df44e2826c141e-config\") pod \"kube-apiserver-proxy-ip-10-0-129-98.ec2.internal\" (UID: \"87dc53c55f73620bf5df44e2826c141e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-129-98.ec2.internal" Apr 20 19:20:37.152751 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:37.152659 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/32cc11a6fe1288d8e923d33bdeaf02c1-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal\" (UID: \"32cc11a6fe1288d8e923d33bdeaf02c1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal" Apr 20 19:20:37.152751 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:37.152666 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/32cc11a6fe1288d8e923d33bdeaf02c1-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal\" (UID: \"32cc11a6fe1288d8e923d33bdeaf02c1\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal" Apr 20 19:20:37.238361 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:37.238306 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-98.ec2.internal\" not found" Apr 20 19:20:37.315841 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:37.315813 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal" Apr 20 19:20:37.319227 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:37.319211 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-98.ec2.internal" Apr 20 19:20:37.338802 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:37.338778 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-98.ec2.internal\" not found" Apr 20 19:20:37.439338 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:37.439313 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-98.ec2.internal\" not found" Apr 20 19:20:37.539929 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:37.539880 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-98.ec2.internal\" not found" Apr 20 19:20:37.640435 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:37.640411 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-98.ec2.internal\" not found" Apr 20 19:20:37.674988 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:37.674971 2564 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 20 19:20:37.675550 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:37.675109 2564 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 20 19:20:37.741054 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:37.741032 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-98.ec2.internal\" not found" Apr 20 19:20:37.748857 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:37.748837 2564 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 20 19:20:37.766322 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:37.766304 2564 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 19:20:37.787541 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:37.787521 2564 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-trrfr" Apr 20 19:20:37.790294 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:37.790244 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87dc53c55f73620bf5df44e2826c141e.slice/crio-faa4f794d14bc0aacee6d54ca6d2147b819cc9d35bd82da31035c066d6755f04 WatchSource:0}: Error finding container faa4f794d14bc0aacee6d54ca6d2147b819cc9d35bd82da31035c066d6755f04: Status 404 returned error can't find the container with id faa4f794d14bc0aacee6d54ca6d2147b819cc9d35bd82da31035c066d6755f04 Apr 20 19:20:37.790589 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:37.790568 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32cc11a6fe1288d8e923d33bdeaf02c1.slice/crio-6c443cb6a407717af72099eaac86ff408fc5923772c6aa4ff07521e3f3f26101 WatchSource:0}: Error finding container 6c443cb6a407717af72099eaac86ff408fc5923772c6aa4ff07521e3f3f26101: Status 404 returned error can't find the container with id 6c443cb6a407717af72099eaac86ff408fc5923772c6aa4ff07521e3f3f26101 Apr 20 19:20:37.794189 ip-10-0-129-98 kubenswrapper[2564]: I0420 
19:20:37.794176 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 19:20:37.794736 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:37.794719 2564 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-trrfr" Apr 20 19:20:37.841143 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:37.841121 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-98.ec2.internal\" not found" Apr 20 19:20:37.854569 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:37.854545 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 19:15:36 +0000 UTC" deadline="2027-12-10 04:49:33.732810366 +0000 UTC" Apr 20 19:20:37.854569 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:37.854564 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14361h28m55.878248817s" Apr 20 19:20:37.883512 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:37.883470 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-98.ec2.internal" event={"ID":"87dc53c55f73620bf5df44e2826c141e","Type":"ContainerStarted","Data":"faa4f794d14bc0aacee6d54ca6d2147b819cc9d35bd82da31035c066d6755f04"} Apr 20 19:20:37.884397 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:37.884373 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal" event={"ID":"32cc11a6fe1288d8e923d33bdeaf02c1","Type":"ContainerStarted","Data":"6c443cb6a407717af72099eaac86ff408fc5923772c6aa4ff07521e3f3f26101"} Apr 20 19:20:37.897642 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:37.897622 2564 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:20:37.941585 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:37.941562 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-98.ec2.internal\" not found" Apr 20 19:20:38.031526 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.031505 2564 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:20:38.042612 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:38.042572 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-98.ec2.internal\" not found" Apr 20 19:20:38.143118 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:38.143098 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-98.ec2.internal\" not found" Apr 20 19:20:38.243930 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:38.243904 2564 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-129-98.ec2.internal\" not found" Apr 20 19:20:38.267235 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.267209 2564 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 20 19:20:38.349547 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.349486 2564 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal" Apr 20 19:20:38.366974 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.366949 2564 
warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 19:20:38.367116 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.367102 2564 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-98.ec2.internal" Apr 20 19:20:38.386711 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.386617 2564 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 20 19:20:38.731119 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.731056 2564 apiserver.go:52] "Watching apiserver" Apr 20 19:20:38.739905 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.739878 2564 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 19:20:38.740292 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.740267 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4","openshift-image-registry/node-ca-5c7r9","openshift-multus/multus-62ns8","openshift-multus/multus-additional-cni-plugins-7stt2","openshift-multus/network-metrics-daemon-zdlvd","openshift-network-diagnostics/network-check-target-7nwp8","openshift-network-operator/iptables-alerter-sljxr","kube-system/kube-apiserver-proxy-ip-10-0-129-98.ec2.internal","openshift-cluster-node-tuning-operator/tuned-zf7kf","openshift-dns/node-resolver-6rnf5","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal","openshift-ovn-kubernetes/ovnkube-node-f5z9f","kube-system/konnectivity-agent-nqj8h"] Apr 20 19:20:38.741569 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.741551 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6rnf5" Apr 20 19:20:38.743647 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.743621 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zdlvd" Apr 20 19:20:38.743753 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:38.743717 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zdlvd" podUID="9b4f6ab3-fddd-446f-8cbf-e372e1b901fe" Apr 20 19:20:38.743822 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.743809 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 19:20:38.744234 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.744014 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-sqxxc\"" Apr 20 19:20:38.744234 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.744062 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 19:20:38.744663 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.744565 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7nwp8" Apr 20 19:20:38.744663 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.744633 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-sljxr" Apr 20 19:20:38.744816 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:38.744630 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7nwp8" podUID="4f25f47c-abed-4e2e-83fe-3edbfd02b4ff" Apr 20 19:20:38.747664 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.747301 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 19:20:38.747664 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.747413 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:20:38.747805 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.747701 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 19:20:38.748347 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.748324 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-w4wnq\"" Apr 20 19:20:38.749248 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.749231 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4" Apr 20 19:20:38.750535 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.750314 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5c7r9" Apr 20 19:20:38.751431 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.751412 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-62ns8" Apr 20 19:20:38.751524 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.751445 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7stt2" Apr 20 19:20:38.751524 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.751508 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 19:20:38.751627 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.751529 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 19:20:38.751940 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.751921 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-7lzxz\"" Apr 20 19:20:38.752066 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.752044 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 19:20:38.752756 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.752738 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 19:20:38.752832 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.752782 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 19:20:38.752962 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.752933 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 19:20:38.752962 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.752942 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-hblxf\"" Apr 20 19:20:38.753799 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.753782 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 19:20:38.753799 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.753791 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 19:20:38.754568 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.754137 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zf7kf" Apr 20 19:20:38.754568 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.754193 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 19:20:38.754568 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.754220 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-kmwq7\"" Apr 20 19:20:38.754568 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.754263 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 19:20:38.754568 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.754296 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-wgwwp\"" Apr 20 19:20:38.754835 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.754591 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 19:20:38.754835 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.754817 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 19:20:38.754922 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.754847 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 19:20:38.755564 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.755547 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nqj8h" Apr 20 19:20:38.756712 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.756682 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:20:38.756712 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.756703 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 19:20:38.756866 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.756746 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 19:20:38.756866 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.756746 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 19:20:38.757082 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.757056 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-6gmb5\"" Apr 20 19:20:38.757335 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.757316 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-s26cl\"" Apr 20 19:20:38.757776 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.757758 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 19:20:38.758034 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.758013 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 19:20:38.758130 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.758078 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 19:20:38.758261 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.758245 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-d95rz\"" Apr 20 19:20:38.758261 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.758257 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 
19:20:38.758502 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.758439 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 19:20:38.758502 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.758474 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 19:20:38.759884 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.759865 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-host-run-k8s-cni-cncf-io\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8" Apr 20 19:20:38.759964 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.759899 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-host-var-lib-cni-bin\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8" Apr 20 19:20:38.759964 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.759923 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwbpg\" (UniqueName: \"kubernetes.io/projected/4f25f47c-abed-4e2e-83fe-3edbfd02b4ff-kube-api-access-xwbpg\") pod \"network-check-target-7nwp8\" (UID: \"4f25f47c-abed-4e2e-83fe-3edbfd02b4ff\") " pod="openshift-network-diagnostics/network-check-target-7nwp8" Apr 20 19:20:38.759964 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.759949 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f331ebff-9be1-4254-b31d-7bcbbc5bbf98-cni-binary-copy\") pod \"multus-additional-cni-plugins-7stt2\" (UID: \"f331ebff-9be1-4254-b31d-7bcbbc5bbf98\") " pod="openshift-multus/multus-additional-cni-plugins-7stt2" Apr 20 19:20:38.760122 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760003 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0476b44e-b1ce-4787-8794-c21ac774e74d-registration-dir\") pod \"aws-ebs-csi-driver-node-hklj4\" (UID: \"0476b44e-b1ce-4787-8794-c21ac774e74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4" Apr 20 19:20:38.760122 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760041 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0476b44e-b1ce-4787-8794-c21ac774e74d-device-dir\") pod \"aws-ebs-csi-driver-node-hklj4\" (UID: \"0476b44e-b1ce-4787-8794-c21ac774e74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4" Apr 20 19:20:38.760122 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760080 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bde3fba3-4508-40ec-89b5-45e19b836b5e-iptables-alerter-script\") pod \"iptables-alerter-sljxr\" (UID: \"bde3fba3-4508-40ec-89b5-45e19b836b5e\") " pod="openshift-network-operator/iptables-alerter-sljxr" Apr 20 19:20:38.760122 ip-10-0-129-98 kubenswrapper[2564]: I0420 
19:20:38.760107 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q54jf\" (UniqueName: \"kubernetes.io/projected/f331ebff-9be1-4254-b31d-7bcbbc5bbf98-kube-api-access-q54jf\") pod \"multus-additional-cni-plugins-7stt2\" (UID: \"f331ebff-9be1-4254-b31d-7bcbbc5bbf98\") " pod="openshift-multus/multus-additional-cni-plugins-7stt2" Apr 20 19:20:38.760304 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760147 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0476b44e-b1ce-4787-8794-c21ac774e74d-etc-selinux\") pod \"aws-ebs-csi-driver-node-hklj4\" (UID: \"0476b44e-b1ce-4787-8794-c21ac774e74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4" Apr 20 19:20:38.760304 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760182 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-os-release\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8" Apr 20 19:20:38.760304 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760208 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-multus-socket-dir-parent\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8" Apr 20 19:20:38.760304 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760234 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-etc-kubernetes\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8" Apr 20 19:20:38.760304 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760276 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbjph\" (UniqueName: \"kubernetes.io/projected/bde3fba3-4508-40ec-89b5-45e19b836b5e-kube-api-access-dbjph\") pod \"iptables-alerter-sljxr\" (UID: \"bde3fba3-4508-40ec-89b5-45e19b836b5e\") " pod="openshift-network-operator/iptables-alerter-sljxr" Apr 20 19:20:38.760502 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760305 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f57b12eb-90dc-43f3-b677-c16555487307-serviceca\") pod \"node-ca-5c7r9\" (UID: \"f57b12eb-90dc-43f3-b677-c16555487307\") " pod="openshift-image-registry/node-ca-5c7r9" Apr 20 19:20:38.760502 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760329 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65h7p\" (UniqueName: \"kubernetes.io/projected/f57b12eb-90dc-43f3-b677-c16555487307-kube-api-access-65h7p\") pod \"node-ca-5c7r9\" (UID: \"f57b12eb-90dc-43f3-b677-c16555487307\") " pod="openshift-image-registry/node-ca-5c7r9" Apr 20 19:20:38.760502 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760355 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/f331ebff-9be1-4254-b31d-7bcbbc5bbf98-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7stt2\" (UID: \"f331ebff-9be1-4254-b31d-7bcbbc5bbf98\") " pod="openshift-multus/multus-additional-cni-plugins-7stt2" Apr 20 19:20:38.760502 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760400 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-host-var-lib-kubelet\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8" Apr 20 19:20:38.760502 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760432 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-host-run-multus-certs\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8" Apr 20 19:20:38.760502 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760459 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2vb8w\" (UniqueName: \"kubernetes.io/projected/69310308-59c8-4043-9117-c0e3a4104e6e-kube-api-access-2vb8w\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8" Apr 20 19:20:38.760502 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760482 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjrpm\" (UniqueName: \"kubernetes.io/projected/a09cd02b-dfa8-4f51-abdc-9e5a0b219e23-kube-api-access-vjrpm\") pod \"node-resolver-6rnf5\" (UID: \"a09cd02b-dfa8-4f51-abdc-9e5a0b219e23\") " pod="openshift-dns/node-resolver-6rnf5" Apr 20 19:20:38.760759 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760542 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs\") pod \"network-metrics-daemon-zdlvd\" (UID: \"9b4f6ab3-fddd-446f-8cbf-e372e1b901fe\") " pod="openshift-multus/network-metrics-daemon-zdlvd" Apr 20 19:20:38.760759 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760568 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f331ebff-9be1-4254-b31d-7bcbbc5bbf98-system-cni-dir\") pod \"multus-additional-cni-plugins-7stt2\" (UID: \"f331ebff-9be1-4254-b31d-7bcbbc5bbf98\") " pod="openshift-multus/multus-additional-cni-plugins-7stt2" Apr 20 19:20:38.760759 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760593 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f331ebff-9be1-4254-b31d-7bcbbc5bbf98-os-release\") pod \"multus-additional-cni-plugins-7stt2\" (UID: \"f331ebff-9be1-4254-b31d-7bcbbc5bbf98\") " pod="openshift-multus/multus-additional-cni-plugins-7stt2" Apr 20 19:20:38.760759 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760618 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f331ebff-9be1-4254-b31d-7bcbbc5bbf98-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-7stt2\" (UID: \"f331ebff-9be1-4254-b31d-7bcbbc5bbf98\") " pod="openshift-multus/multus-additional-cni-plugins-7stt2" Apr 20 19:20:38.760759 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760676 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-host-var-lib-cni-multus\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8" Apr 20 19:20:38.760759 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760698 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f57b12eb-90dc-43f3-b677-c16555487307-host\") pod \"node-ca-5c7r9\" (UID: \"f57b12eb-90dc-43f3-b677-c16555487307\") " pod="openshift-image-registry/node-ca-5c7r9" Apr 20 19:20:38.760759 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760723 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0476b44e-b1ce-4787-8794-c21ac774e74d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hklj4\" (UID: \"0476b44e-b1ce-4787-8794-c21ac774e74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4" Apr 20 19:20:38.760956 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760797 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0476b44e-b1ce-4787-8794-c21ac774e74d-socket-dir\") pod \"aws-ebs-csi-driver-node-hklj4\" (UID: \"0476b44e-b1ce-4787-8794-c21ac774e74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4" Apr 20 19:20:38.760956 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760824 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zprf9\" (UniqueName: \"kubernetes.io/projected/0476b44e-b1ce-4787-8794-c21ac774e74d-kube-api-access-zprf9\") pod \"aws-ebs-csi-driver-node-hklj4\" (UID: \"0476b44e-b1ce-4787-8794-c21ac774e74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4" Apr 20 19:20:38.760956 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760851 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-system-cni-dir\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8" Apr 20 19:20:38.760956 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760873 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-multus-cni-dir\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8" Apr 20 19:20:38.760956 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760895 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/69310308-59c8-4043-9117-c0e3a4104e6e-cni-binary-copy\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8" Apr 20 19:20:38.760956 ip-10-0-129-98 
kubenswrapper[2564]: I0420 19:20:38.760919 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-host-run-netns\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8" Apr 20 19:20:38.760956 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760940 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-hostroot\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8" Apr 20 19:20:38.761207 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.760969 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-multus-conf-dir\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8" Apr 20 19:20:38.761207 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.761030 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a09cd02b-dfa8-4f51-abdc-9e5a0b219e23-hosts-file\") pod \"node-resolver-6rnf5\" (UID: \"a09cd02b-dfa8-4f51-abdc-9e5a0b219e23\") " pod="openshift-dns/node-resolver-6rnf5" Apr 20 19:20:38.761207 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.761054 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a09cd02b-dfa8-4f51-abdc-9e5a0b219e23-tmp-dir\") pod \"node-resolver-6rnf5\" (UID: \"a09cd02b-dfa8-4f51-abdc-9e5a0b219e23\") " pod="openshift-dns/node-resolver-6rnf5" Apr 20 19:20:38.761207 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.761077 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5qmr\" (UniqueName: \"kubernetes.io/projected/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-kube-api-access-l5qmr\") pod \"network-metrics-daemon-zdlvd\" (UID: \"9b4f6ab3-fddd-446f-8cbf-e372e1b901fe\") " pod="openshift-multus/network-metrics-daemon-zdlvd" Apr 20 19:20:38.761207 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.761121 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bde3fba3-4508-40ec-89b5-45e19b836b5e-host-slash\") pod \"iptables-alerter-sljxr\" (UID: \"bde3fba3-4508-40ec-89b5-45e19b836b5e\") " pod="openshift-network-operator/iptables-alerter-sljxr" Apr 20 19:20:38.761207 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.761144 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0476b44e-b1ce-4787-8794-c21ac774e74d-sys-fs\") pod \"aws-ebs-csi-driver-node-hklj4\" (UID: \"0476b44e-b1ce-4787-8794-c21ac774e74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4" Apr 20 19:20:38.761207 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.761167 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-cnibin\") pod 
\"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8" Apr 20 19:20:38.761207 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.761188 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/69310308-59c8-4043-9117-c0e3a4104e6e-multus-daemon-config\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8" Apr 20 19:20:38.761448 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.761211 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f331ebff-9be1-4254-b31d-7bcbbc5bbf98-cnibin\") pod \"multus-additional-cni-plugins-7stt2\" (UID: \"f331ebff-9be1-4254-b31d-7bcbbc5bbf98\") " pod="openshift-multus/multus-additional-cni-plugins-7stt2" Apr 20 19:20:38.761448 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.761240 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f331ebff-9be1-4254-b31d-7bcbbc5bbf98-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7stt2\" (UID: \"f331ebff-9be1-4254-b31d-7bcbbc5bbf98\") " pod="openshift-multus/multus-additional-cni-plugins-7stt2" Apr 20 19:20:38.795959 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.795936 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 19:15:37 +0000 UTC" deadline="2027-09-26 13:42:26.05315389 +0000 UTC" Apr 20 19:20:38.795959 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.795959 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12570h21m47.257197721s" Apr 20 19:20:38.852231 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.852212 2564 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 19:20:38.861601 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.861578 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65h7p\" (UniqueName: \"kubernetes.io/projected/f57b12eb-90dc-43f3-b677-c16555487307-kube-api-access-65h7p\") pod \"node-ca-5c7r9\" (UID: \"f57b12eb-90dc-43f3-b677-c16555487307\") " pod="openshift-image-registry/node-ca-5c7r9" Apr 20 19:20:38.861705 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.861610 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f331ebff-9be1-4254-b31d-7bcbbc5bbf98-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7stt2\" (UID: \"f331ebff-9be1-4254-b31d-7bcbbc5bbf98\") " pod="openshift-multus/multus-additional-cni-plugins-7stt2" Apr 20 19:20:38.861705 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.861635 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4cf05dbd-a831-4450-baf1-a340e0113d84-tmp\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf" Apr 20 19:20:38.861705 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.861659 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" 
(UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-log-socket\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 19:20:38.861705 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.861701 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs\") pod \"network-metrics-daemon-zdlvd\" (UID: \"9b4f6ab3-fddd-446f-8cbf-e372e1b901fe\") " pod="openshift-multus/network-metrics-daemon-zdlvd" Apr 20 19:20:38.861913 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.861725 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/f331ebff-9be1-4254-b31d-7bcbbc5bbf98-os-release\") pod \"multus-additional-cni-plugins-7stt2\" (UID: \"f331ebff-9be1-4254-b31d-7bcbbc5bbf98\") " pod="openshift-multus/multus-additional-cni-plugins-7stt2" Apr 20 19:20:38.861913 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.861749 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-etc-modprobe-d\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf" Apr 20 19:20:38.861913 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.861771 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-etc-sysctl-conf\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf" Apr 20 19:20:38.861913 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.861791 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f331ebff-9be1-4254-b31d-7bcbbc5bbf98-tuning-conf-dir\") pod \"multus-additional-cni-plugins-7stt2\" (UID: \"f331ebff-9be1-4254-b31d-7bcbbc5bbf98\") " pod="openshift-multus/multus-additional-cni-plugins-7stt2" Apr 20 19:20:38.861913 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.861811 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-host-slash\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 19:20:38.861913 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:38.861839 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:20:38.861913 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.861848 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-run-openvswitch\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 19:20:38.861913 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.861851 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" 
(UniqueName: \"kubernetes.io/host-path/f331ebff-9be1-4254-b31d-7bcbbc5bbf98-os-release\") pod \"multus-additional-cni-plugins-7stt2\" (UID: \"f331ebff-9be1-4254-b31d-7bcbbc5bbf98\") " pod="openshift-multus/multus-additional-cni-plugins-7stt2" Apr 20 19:20:38.861913 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.861889 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-run-ovn\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 19:20:38.862336 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:38.861921 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs podName:9b4f6ab3-fddd-446f-8cbf-e372e1b901fe nodeName:}" failed. No retries permitted until 2026-04-20 19:20:39.361891156 +0000 UTC m=+2.981573424 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs") pod "network-metrics-daemon-zdlvd" (UID: "9b4f6ab3-fddd-446f-8cbf-e372e1b901fe") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:20:38.862336 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.861961 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 19:20:38.862336 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862022 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-host-cni-netd\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 19:20:38.862336 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862044 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-ovn-node-metrics-cert\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 19:20:38.862336 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862061 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-ovnkube-script-lib\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 19:20:38.862336 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862079 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-system-cni-dir\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8" Apr 20 19:20:38.862336 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862127 
Apr 20 19:20:38.862336 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862163 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-host-run-netns\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.862336 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862135 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-system-cni-dir\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.862336 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862187 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-hostroot\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.862336 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862212 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a09cd02b-dfa8-4f51-abdc-9e5a0b219e23-hosts-file\") pod \"node-resolver-6rnf5\" (UID: \"a09cd02b-dfa8-4f51-abdc-9e5a0b219e23\") " pod="openshift-dns/node-resolver-6rnf5"
Apr 20 19:20:38.862336 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862236 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a09cd02b-dfa8-4f51-abdc-9e5a0b219e23-tmp-dir\") pod \"node-resolver-6rnf5\" (UID: \"a09cd02b-dfa8-4f51-abdc-9e5a0b219e23\") " pod="openshift-dns/node-resolver-6rnf5"
Apr 20 19:20:38.862336 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862238 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-host-run-netns\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.862336 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862263 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-etc-kubernetes\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:38.862336 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862312 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/a09cd02b-dfa8-4f51-abdc-9e5a0b219e23-hosts-file\") pod \"node-resolver-6rnf5\" (UID: \"a09cd02b-dfa8-4f51-abdc-9e5a0b219e23\") " pod="openshift-dns/node-resolver-6rnf5"
Apr 20 19:20:38.862336 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862304 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-run-systemd\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.863087 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862364 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f331ebff-9be1-4254-b31d-7bcbbc5bbf98-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7stt2\" (UID: \"f331ebff-9be1-4254-b31d-7bcbbc5bbf98\") " pod="openshift-multus/multus-additional-cni-plugins-7stt2"
Apr 20 19:20:38.863087 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862395 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e6858c48-14e7-4cc5-a1cd-0554a465f2db-agent-certs\") pod \"konnectivity-agent-nqj8h\" (UID: \"e6858c48-14e7-4cc5-a1cd-0554a465f2db\") " pod="kube-system/konnectivity-agent-nqj8h"
Apr 20 19:20:38.863087 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862324 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-hostroot\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.863087 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862422 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-etc-sysconfig\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:38.863087 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862446 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-host-run-ovn-kubernetes\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.863087 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862471 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-host-cni-bin\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.863087 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862496 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxqpc\" (UniqueName: \"kubernetes.io/projected/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-kube-api-access-cxqpc\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.863087 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862520 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-etc-sysctl-d\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:38.863087 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862532 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a09cd02b-dfa8-4f51-abdc-9e5a0b219e23-tmp-dir\") pod \"node-resolver-6rnf5\" (UID: \"a09cd02b-dfa8-4f51-abdc-9e5a0b219e23\") " pod="openshift-dns/node-resolver-6rnf5"
Apr 20 19:20:38.863087 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862548 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-host-run-k8s-cni-cncf-io\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.863087 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862584 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwbpg\" (UniqueName: \"kubernetes.io/projected/4f25f47c-abed-4e2e-83fe-3edbfd02b4ff-kube-api-access-xwbpg\") pod \"network-check-target-7nwp8\" (UID: \"4f25f47c-abed-4e2e-83fe-3edbfd02b4ff\") " pod="openshift-network-diagnostics/network-check-target-7nwp8"
Apr 20 19:20:38.863087 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862588 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-host-run-k8s-cni-cncf-io\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.863087 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862610 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f331ebff-9be1-4254-b31d-7bcbbc5bbf98-cni-binary-copy\") pod \"multus-additional-cni-plugins-7stt2\" (UID: \"f331ebff-9be1-4254-b31d-7bcbbc5bbf98\") " pod="openshift-multus/multus-additional-cni-plugins-7stt2"
Apr 20 19:20:38.863087 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862631 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/69310308-59c8-4043-9117-c0e3a4104e6e-cni-binary-copy\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.863087 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862637 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0476b44e-b1ce-4787-8794-c21ac774e74d-registration-dir\") pod \"aws-ebs-csi-driver-node-hklj4\" (UID: \"0476b44e-b1ce-4787-8794-c21ac774e74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4"
Apr 20 19:20:38.863087 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862694 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/0476b44e-b1ce-4787-8794-c21ac774e74d-registration-dir\") pod \"aws-ebs-csi-driver-node-hklj4\" (UID: \"0476b44e-b1ce-4787-8794-c21ac774e74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4"
Apr 20 19:20:38.863087 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862725 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e6858c48-14e7-4cc5-a1cd-0554a465f2db-konnectivity-ca\") pod \"konnectivity-agent-nqj8h\" (UID: \"e6858c48-14e7-4cc5-a1cd-0554a465f2db\") " pod="kube-system/konnectivity-agent-nqj8h"
Apr 20 19:20:38.863909 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862751 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-ovnkube-config\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.863909 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862776 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bde3fba3-4508-40ec-89b5-45e19b836b5e-iptables-alerter-script\") pod \"iptables-alerter-sljxr\" (UID: \"bde3fba3-4508-40ec-89b5-45e19b836b5e\") " pod="openshift-network-operator/iptables-alerter-sljxr"
Apr 20 19:20:38.863909 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862802 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-node-log\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.863909 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862828 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-env-overrides\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.863909 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862871 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-os-release\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.863909 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862898 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-multus-socket-dir-parent\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.863909 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862922 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-etc-kubernetes\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.863909 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862924 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/f331ebff-9be1-4254-b31d-7bcbbc5bbf98-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-7stt2\" (UID: \"f331ebff-9be1-4254-b31d-7bcbbc5bbf98\") " pod="openshift-multus/multus-additional-cni-plugins-7stt2"
Apr 20 19:20:38.863909 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.862947 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dbjph\" (UniqueName: \"kubernetes.io/projected/bde3fba3-4508-40ec-89b5-45e19b836b5e-kube-api-access-dbjph\") pod \"iptables-alerter-sljxr\" (UID: \"bde3fba3-4508-40ec-89b5-45e19b836b5e\") " pod="openshift-network-operator/iptables-alerter-sljxr"
Apr 20 19:20:38.863909 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863025 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-os-release\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.863909 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863058 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f57b12eb-90dc-43f3-b677-c16555487307-serviceca\") pod \"node-ca-5c7r9\" (UID: \"f57b12eb-90dc-43f3-b677-c16555487307\") " pod="openshift-image-registry/node-ca-5c7r9"
Apr 20 19:20:38.863909 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863078 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/f331ebff-9be1-4254-b31d-7bcbbc5bbf98-cni-binary-copy\") pod \"multus-additional-cni-plugins-7stt2\" (UID: \"f331ebff-9be1-4254-b31d-7bcbbc5bbf98\") " pod="openshift-multus/multus-additional-cni-plugins-7stt2"
Apr 20 19:20:38.863909 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863089 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4cf05dbd-a831-4450-baf1-a340e0113d84-etc-tuned\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:38.863909 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863124 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh87n\" (UniqueName: \"kubernetes.io/projected/4cf05dbd-a831-4450-baf1-a340e0113d84-kube-api-access-hh87n\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:38.863909 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863148 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-host-kubelet\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.863909 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863175 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-host-var-lib-kubelet\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.863909 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863202 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/bde3fba3-4508-40ec-89b5-45e19b836b5e-iptables-alerter-script\") pod \"iptables-alerter-sljxr\" (UID: \"bde3fba3-4508-40ec-89b5-45e19b836b5e\") " pod="openshift-network-operator/iptables-alerter-sljxr"
Apr 20 19:20:38.864520 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863214 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-host-run-multus-certs\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.864520 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863219 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-etc-kubernetes\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.864520 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863241 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2vb8w\" (UniqueName: \"kubernetes.io/projected/69310308-59c8-4043-9117-c0e3a4104e6e-kube-api-access-2vb8w\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.864520 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863258 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-host-var-lib-kubelet\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.864520 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863267 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjrpm\" (UniqueName: \"kubernetes.io/projected/a09cd02b-dfa8-4f51-abdc-9e5a0b219e23-kube-api-access-vjrpm\") pod \"node-resolver-6rnf5\" (UID: \"a09cd02b-dfa8-4f51-abdc-9e5a0b219e23\") " pod="openshift-dns/node-resolver-6rnf5"
Apr 20 19:20:38.864520 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863280 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-multus-socket-dir-parent\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.864520 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863310 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-host-run-multus-certs\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.864520 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863332 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f331ebff-9be1-4254-b31d-7bcbbc5bbf98-system-cni-dir\") pod \"multus-additional-cni-plugins-7stt2\" (UID: \"f331ebff-9be1-4254-b31d-7bcbbc5bbf98\") " pod="openshift-multus/multus-additional-cni-plugins-7stt2"
Apr 20 19:20:38.864520 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863364 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f331ebff-9be1-4254-b31d-7bcbbc5bbf98-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7stt2\" (UID: \"f331ebff-9be1-4254-b31d-7bcbbc5bbf98\") " pod="openshift-multus/multus-additional-cni-plugins-7stt2"
Apr 20 19:20:38.864520 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863403 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0476b44e-b1ce-4787-8794-c21ac774e74d-socket-dir\") pod \"aws-ebs-csi-driver-node-hklj4\" (UID: \"0476b44e-b1ce-4787-8794-c21ac774e74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4"
Apr 20 19:20:38.864520 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863434 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/f331ebff-9be1-4254-b31d-7bcbbc5bbf98-system-cni-dir\") pod \"multus-additional-cni-plugins-7stt2\" (UID: \"f331ebff-9be1-4254-b31d-7bcbbc5bbf98\") " pod="openshift-multus/multus-additional-cni-plugins-7stt2"
Apr 20 19:20:38.864520 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863450 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/f57b12eb-90dc-43f3-b677-c16555487307-serviceca\") pod \"node-ca-5c7r9\" (UID: \"f57b12eb-90dc-43f3-b677-c16555487307\") " pod="openshift-image-registry/node-ca-5c7r9"
Apr 20 19:20:38.864520 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863491 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-run\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:38.864520 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863522 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/0476b44e-b1ce-4787-8794-c21ac774e74d-socket-dir\") pod \"aws-ebs-csi-driver-node-hklj4\" (UID: \"0476b44e-b1ce-4787-8794-c21ac774e74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4"
Apr 20 19:20:38.864520 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863520 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-host-var-lib-cni-multus\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.864520 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863544 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f57b12eb-90dc-43f3-b677-c16555487307-host\") pod \"node-ca-5c7r9\" (UID: \"f57b12eb-90dc-43f3-b677-c16555487307\") " pod="openshift-image-registry/node-ca-5c7r9"
Apr 20 19:20:38.864520 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863556 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-host-var-lib-cni-multus\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.865121 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863562 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0476b44e-b1ce-4787-8794-c21ac774e74d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hklj4\" (UID: \"0476b44e-b1ce-4787-8794-c21ac774e74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4"
Apr 20 19:20:38.865121 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863600 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/0476b44e-b1ce-4787-8794-c21ac774e74d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-hklj4\" (UID: \"0476b44e-b1ce-4787-8794-c21ac774e74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4"
Apr 20 19:20:38.865121 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863602 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zprf9\" (UniqueName: \"kubernetes.io/projected/0476b44e-b1ce-4787-8794-c21ac774e74d-kube-api-access-zprf9\") pod \"aws-ebs-csi-driver-node-hklj4\" (UID: \"0476b44e-b1ce-4787-8794-c21ac774e74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4"
Apr 20 19:20:38.865121 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863640 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/f57b12eb-90dc-43f3-b677-c16555487307-host\") pod \"node-ca-5c7r9\" (UID: \"f57b12eb-90dc-43f3-b677-c16555487307\") " pod="openshift-image-registry/node-ca-5c7r9"
Apr 20 19:20:38.865121 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863650 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-etc-systemd\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:38.865121 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863758 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-sys\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:38.865121 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863780 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f331ebff-9be1-4254-b31d-7bcbbc5bbf98-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-7stt2\" (UID: \"f331ebff-9be1-4254-b31d-7bcbbc5bbf98\") " pod="openshift-multus/multus-additional-cni-plugins-7stt2"
Apr 20 19:20:38.865121 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863802 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-var-lib-kubelet\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:38.865121 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863836 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-systemd-units\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.865121 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863859 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-multus-cni-dir\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.865121 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863876 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-multus-conf-dir\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.865121 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863901 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l5qmr\" (UniqueName: \"kubernetes.io/projected/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-kube-api-access-l5qmr\") pod \"network-metrics-daemon-zdlvd\" (UID: \"9b4f6ab3-fddd-446f-8cbf-e372e1b901fe\") " pod="openshift-multus/network-metrics-daemon-zdlvd"
Apr 20 19:20:38.865121 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863929 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0476b44e-b1ce-4787-8794-c21ac774e74d-sys-fs\") pod \"aws-ebs-csi-driver-node-hklj4\" (UID: \"0476b44e-b1ce-4787-8794-c21ac774e74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4"
Apr 20 19:20:38.865121 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863951 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-multus-cni-dir\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.865121 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863955 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-host\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:38.865121 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863955 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-multus-conf-dir\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.865121 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.863987 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-host-run-netns\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.865755 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.864068 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/0476b44e-b1ce-4787-8794-c21ac774e74d-sys-fs\") pod \"aws-ebs-csi-driver-node-hklj4\" (UID: \"0476b44e-b1ce-4787-8794-c21ac774e74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4"
Apr 20 19:20:38.865755 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.864107 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bde3fba3-4508-40ec-89b5-45e19b836b5e-host-slash\") pod \"iptables-alerter-sljxr\" (UID: \"bde3fba3-4508-40ec-89b5-45e19b836b5e\") " pod="openshift-network-operator/iptables-alerter-sljxr"
Apr 20 19:20:38.865755 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.864135 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-var-lib-openvswitch\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.865755 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.864155 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bde3fba3-4508-40ec-89b5-45e19b836b5e-host-slash\") pod \"iptables-alerter-sljxr\" (UID: \"bde3fba3-4508-40ec-89b5-45e19b836b5e\") " pod="openshift-network-operator/iptables-alerter-sljxr"
Apr 20 19:20:38.865755 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.864172 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-etc-openvswitch\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.865755 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.864196 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-cnibin\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.865755 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.864211 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/69310308-59c8-4043-9117-c0e3a4104e6e-multus-daemon-config\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.865755 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.864230 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f331ebff-9be1-4254-b31d-7bcbbc5bbf98-cnibin\") pod \"multus-additional-cni-plugins-7stt2\" (UID: \"f331ebff-9be1-4254-b31d-7bcbbc5bbf98\") " pod="openshift-multus/multus-additional-cni-plugins-7stt2"
Apr 20 19:20:38.865755 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.864257 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-lib-modules\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:38.865755 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.864283 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-cnibin\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.865755 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.864285 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-host-var-lib-cni-bin\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.865755 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.864318 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0476b44e-b1ce-4787-8794-c21ac774e74d-device-dir\") pod \"aws-ebs-csi-driver-node-hklj4\" (UID: \"0476b44e-b1ce-4787-8794-c21ac774e74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4"
Apr 20 19:20:38.865755 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.864321 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/69310308-59c8-4043-9117-c0e3a4104e6e-host-var-lib-cni-bin\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.865755 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.864357 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/f331ebff-9be1-4254-b31d-7bcbbc5bbf98-cnibin\") pod \"multus-additional-cni-plugins-7stt2\" (UID: \"f331ebff-9be1-4254-b31d-7bcbbc5bbf98\") " pod="openshift-multus/multus-additional-cni-plugins-7stt2"
Apr 20 19:20:38.865755 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.864364 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q54jf\" (UniqueName: \"kubernetes.io/projected/f331ebff-9be1-4254-b31d-7bcbbc5bbf98-kube-api-access-q54jf\") pod \"multus-additional-cni-plugins-7stt2\" (UID: \"f331ebff-9be1-4254-b31d-7bcbbc5bbf98\") " pod="openshift-multus/multus-additional-cni-plugins-7stt2"
Apr 20 19:20:38.865755 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.864394 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0476b44e-b1ce-4787-8794-c21ac774e74d-etc-selinux\") pod \"aws-ebs-csi-driver-node-hklj4\" (UID: \"0476b44e-b1ce-4787-8794-c21ac774e74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4"
Apr 20 19:20:38.865755 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.864404 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/0476b44e-b1ce-4787-8794-c21ac774e74d-device-dir\") pod \"aws-ebs-csi-driver-node-hklj4\" (UID: \"0476b44e-b1ce-4787-8794-c21ac774e74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4"
Apr 20 19:20:38.866502 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.864526 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/0476b44e-b1ce-4787-8794-c21ac774e74d-etc-selinux\") pod \"aws-ebs-csi-driver-node-hklj4\" (UID: \"0476b44e-b1ce-4787-8794-c21ac774e74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4"
Apr 20 19:20:38.866502 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.864609 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/69310308-59c8-4043-9117-c0e3a4104e6e-multus-daemon-config\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.871031 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.871008 2564 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 19:20:38.872700 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.872676 2564 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 20 19:20:38.873554 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:38.873311 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:20:38.873554 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:38.873335 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:20:38.873554 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:38.873350 2564 projected.go:194] Error preparing data for projected volume kube-api-access-xwbpg for pod openshift-network-diagnostics/network-check-target-7nwp8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:20:38.873554 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:38.873431 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f25f47c-abed-4e2e-83fe-3edbfd02b4ff-kube-api-access-xwbpg podName:4f25f47c-abed-4e2e-83fe-3edbfd02b4ff nodeName:}" failed. No retries permitted until 2026-04-20 19:20:39.373402662 +0000 UTC m=+2.993084945 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-xwbpg" (UniqueName: "kubernetes.io/projected/4f25f47c-abed-4e2e-83fe-3edbfd02b4ff-kube-api-access-xwbpg") pod "network-check-target-7nwp8" (UID: "4f25f47c-abed-4e2e-83fe-3edbfd02b4ff") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
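Note: the same "not registered" failure mode hits the projected kube-api-access-xwbpg volume. A kube-api-access-* volume is assembled from the pod's ServiceAccount token plus the kube-root-ca.crt ConfigMap (and, on OpenShift, openshift-service-ca.crt), and neither ConfigMap has reached the kubelet's object cache yet, so the whole projected volume fails and is parked with durationBeforeRetry 500ms (next attempt at m=+2.99). The kubelet's per-operation backoff (the exponentialbackoff helper upstream) starts at 500 ms and doubles per consecutive failure up to a cap of roughly two minutes; a hedged sketch of that schedule, with the constants treated as assumptions rather than exact upstream values:

```go
package main

import (
	"fmt"
	"time"
)

func main() {
	// Assumed constants mirroring the upstream exponential backoff:
	// 500ms initial delay, doubling per failure, capped at ~2m2s.
	delay := 500 * time.Millisecond
	maxDelay := 2*time.Minute + 2*time.Second
	for attempt := 1; attempt <= 10; attempt++ {
		fmt.Printf("failure %d: next retry in %v\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
```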
Apr 20 19:20:38.876387 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.876365 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zprf9\" (UniqueName: \"kubernetes.io/projected/0476b44e-b1ce-4787-8794-c21ac774e74d-kube-api-access-zprf9\") pod \"aws-ebs-csi-driver-node-hklj4\" (UID: \"0476b44e-b1ce-4787-8794-c21ac774e74d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4"
Apr 20 19:20:38.876387 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.876379 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65h7p\" (UniqueName: \"kubernetes.io/projected/f57b12eb-90dc-43f3-b677-c16555487307-kube-api-access-65h7p\") pod \"node-ca-5c7r9\" (UID: \"f57b12eb-90dc-43f3-b677-c16555487307\") " pod="openshift-image-registry/node-ca-5c7r9"
Apr 20 19:20:38.876562 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.876388 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q54jf\" (UniqueName: \"kubernetes.io/projected/f331ebff-9be1-4254-b31d-7bcbbc5bbf98-kube-api-access-q54jf\") pod \"multus-additional-cni-plugins-7stt2\" (UID: \"f331ebff-9be1-4254-b31d-7bcbbc5bbf98\") " pod="openshift-multus/multus-additional-cni-plugins-7stt2"
Apr 20 19:20:38.876562 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.876366 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjrpm\" (UniqueName: \"kubernetes.io/projected/a09cd02b-dfa8-4f51-abdc-9e5a0b219e23-kube-api-access-vjrpm\") pod \"node-resolver-6rnf5\" (UID: \"a09cd02b-dfa8-4f51-abdc-9e5a0b219e23\") " pod="openshift-dns/node-resolver-6rnf5"
Apr 20 19:20:38.876754 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.876731 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5qmr\" (UniqueName: \"kubernetes.io/projected/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-kube-api-access-l5qmr\") pod \"network-metrics-daemon-zdlvd\" (UID: \"9b4f6ab3-fddd-446f-8cbf-e372e1b901fe\") " pod="openshift-multus/network-metrics-daemon-zdlvd"
Apr 20 19:20:38.876829 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.876793 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbjph\" (UniqueName: \"kubernetes.io/projected/bde3fba3-4508-40ec-89b5-45e19b836b5e-kube-api-access-dbjph\") pod \"iptables-alerter-sljxr\" (UID: \"bde3fba3-4508-40ec-89b5-45e19b836b5e\") " pod="openshift-network-operator/iptables-alerter-sljxr"
Apr 20 19:20:38.877336 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.877311 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2vb8w\" (UniqueName: \"kubernetes.io/projected/69310308-59c8-4043-9117-c0e3a4104e6e-kube-api-access-2vb8w\") pod \"multus-62ns8\" (UID: \"69310308-59c8-4043-9117-c0e3a4104e6e\") " pod="openshift-multus/multus-62ns8"
Apr 20 19:20:38.964718 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.964689 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4cf05dbd-a831-4450-baf1-a340e0113d84-tmp\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:38.964888 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.964727 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-log-socket\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.964888 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.964762 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-etc-modprobe-d\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:38.964888 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.964782 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-etc-sysctl-conf\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:38.964888 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.964804 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-host-slash\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.964888 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.964822 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-log-socket\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.964888 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.964875 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-host-slash\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.964888 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.964881 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-run-openvswitch\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.965222 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.964829 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-run-openvswitch\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.965222 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.964930 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-run-ovn\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.965222 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.964958 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.965222 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.964971 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-etc-modprobe-d\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:38.965222 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.964984 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-host-cni-netd\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.965222 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965021 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-etc-sysctl-conf\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:38.965222 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965034 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-host-cni-netd\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.965222 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965040 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-ovn-node-metrics-cert\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.965222 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965066 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-run-ovn\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.965222 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965069 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-ovnkube-script-lib\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.965222 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965072 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.965222 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965116 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-etc-kubernetes\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:38.965222 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965141 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-run-systemd\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.965222 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965167 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e6858c48-14e7-4cc5-a1cd-0554a465f2db-agent-certs\") pod \"konnectivity-agent-nqj8h\" (UID: \"e6858c48-14e7-4cc5-a1cd-0554a465f2db\") " pod="kube-system/konnectivity-agent-nqj8h"
Apr 20 19:20:38.965222 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965192 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-etc-sysconfig\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:38.965222 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965216 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-host-run-ovn-kubernetes\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.965222 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965230 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-etc-kubernetes\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:38.965974 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965240 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-host-cni-bin\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.965974 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965267 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxqpc\" (UniqueName: \"kubernetes.io/projected/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-kube-api-access-cxqpc\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.965974 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965292 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-etc-sysctl-d\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:38.965974 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965333 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e6858c48-14e7-4cc5-a1cd-0554a465f2db-konnectivity-ca\") pod \"konnectivity-agent-nqj8h\" (UID: \"e6858c48-14e7-4cc5-a1cd-0554a465f2db\") " pod="kube-system/konnectivity-agent-nqj8h"
Apr 20 19:20:38.965974 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965359 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-ovnkube-config\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.965974 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965385 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-node-log\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.965974 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965408 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-env-overrides\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.965974 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965446 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4cf05dbd-a831-4450-baf1-a340e0113d84-etc-tuned\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:38.965974 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965468 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hh87n\" (UniqueName: \"kubernetes.io/projected/4cf05dbd-a831-4450-baf1-a340e0113d84-kube-api-access-hh87n\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:38.965974 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965493 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-host-kubelet\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.965974 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965527 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-run\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:38.965974 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965558 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-etc-systemd\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:38.965974 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965584 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-sys\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:38.965974 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965607 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-var-lib-kubelet\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:38.965974 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965638 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-systemd-units\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.965974 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965639 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-ovnkube-script-lib\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.965974 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965670 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-host\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:38.966834 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965698 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-host-run-netns\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.966834 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965702 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-run-systemd\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:38.966834 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965721 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-var-lib-openvswitch\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") "
pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 19:20:38.966834 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965748 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-etc-openvswitch\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 19:20:38.966834 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965776 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-lib-modules\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf" Apr 20 19:20:38.966834 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965944 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-lib-modules\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf" Apr 20 19:20:38.966834 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.965978 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-host-run-ovn-kubernetes\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 19:20:38.966834 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.966009 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-host-cni-bin\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 19:20:38.966834 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.966038 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-systemd-units\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 19:20:38.966834 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.966048 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-etc-sysconfig\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf" Apr 20 19:20:38.966834 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.966088 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-host\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf" Apr 20 19:20:38.966834 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.966114 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-host-kubelet\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 19:20:38.966834 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.966139 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-host-run-netns\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 19:20:38.966834 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.966169 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-run\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf" Apr 20 19:20:38.966834 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.966181 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-var-lib-openvswitch\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 19:20:38.966834 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.966241 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-var-lib-kubelet\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf" Apr 20 19:20:38.966834 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.966244 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-node-log\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 19:20:38.966834 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.966294 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-etc-openvswitch\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 19:20:38.966834 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.966307 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-etc-sysctl-d\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf" Apr 20 19:20:38.967728 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.966338 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-sys\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf" Apr 20 19:20:38.967728 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.966352 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4cf05dbd-a831-4450-baf1-a340e0113d84-etc-systemd\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf" Apr 20 
19:20:38.967728 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.966569 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-ovnkube-config\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 19:20:38.967728 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.966614 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-env-overrides\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 19:20:38.967728 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.966809 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/e6858c48-14e7-4cc5-a1cd-0554a465f2db-konnectivity-ca\") pod \"konnectivity-agent-nqj8h\" (UID: \"e6858c48-14e7-4cc5-a1cd-0554a465f2db\") " pod="kube-system/konnectivity-agent-nqj8h" Apr 20 19:20:38.967728 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.967716 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4cf05dbd-a831-4450-baf1-a340e0113d84-tmp\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf" Apr 20 19:20:38.968018 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.967804 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-ovn-node-metrics-cert\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 19:20:38.968520 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.968500 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4cf05dbd-a831-4450-baf1-a340e0113d84-etc-tuned\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf" Apr 20 19:20:38.969364 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.969342 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/e6858c48-14e7-4cc5-a1cd-0554a465f2db-agent-certs\") pod \"konnectivity-agent-nqj8h\" (UID: \"e6858c48-14e7-4cc5-a1cd-0554a465f2db\") " pod="kube-system/konnectivity-agent-nqj8h" Apr 20 19:20:38.973946 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.973925 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh87n\" (UniqueName: \"kubernetes.io/projected/4cf05dbd-a831-4450-baf1-a340e0113d84-kube-api-access-hh87n\") pod \"tuned-zf7kf\" (UID: \"4cf05dbd-a831-4450-baf1-a340e0113d84\") " pod="openshift-cluster-node-tuning-operator/tuned-zf7kf" Apr 20 19:20:38.974197 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:38.974180 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxqpc\" (UniqueName: \"kubernetes.io/projected/e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7-kube-api-access-cxqpc\") pod \"ovnkube-node-f5z9f\" (UID: \"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7\") " pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 
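Each kubelet record above carries a klog header such as "I0420 19:20:38.964958 2564 reconciler_common.go:224]": severity (I/W/E/F), MMDD, wall-clock time, PID, and source file:line. When filtering bursts like the mount reconciliation above, it helps to split that header off mechanically. A minimal Go sketch of such a parser follows; the regexp and the klogHeader type are illustrative helpers, not anything the kubelet exports.

package main

import (
	"fmt"
	"regexp"
)

// klogHeader holds the fields of a klog line header like
// "I0420 19:20:38.964958 2564 reconciler_common.go:224] ...".
// Illustrative only; not a kubelet type.
type klogHeader struct {
	Severity byte   // 'I', 'W', 'E', or 'F'
	Date     string // MMDD
	Time     string // HH:MM:SS.micro
	PID      string
	Source   string // file.go:line
	Message  string
}

var klogRe = regexp.MustCompile(`^([IWEF])(\d{4}) (\d{2}:\d{2}:\d{2}\.\d+)\s+(\d+)\s+([\w.-]+:\d+)\] (.*)$`)

// parseKlog splits a klog-formatted line into header fields;
// it reports false for lines that do not match the header shape.
func parseKlog(line string) (klogHeader, bool) {
	m := klogRe.FindStringSubmatch(line)
	if m == nil {
		return klogHeader{}, false
	}
	return klogHeader{m[1][0], m[2], m[3], m[4], m[5], m[6]}, true
}

func main() {
	h, ok := parseKlog(`I0420 19:20:38.964958 2564 reconciler_common.go:224] "operationExecutor.MountVolume started ..."`)
	fmt.Println(ok, string(h.Severity), h.Source) // true I reconciler_common.go:224
}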
Apr 20 19:20:39.054215 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:39.054147 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-6rnf5"
Apr 20 19:20:39.063097 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:39.063067 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-sljxr"
Apr 20 19:20:39.071818 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:39.071798 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4"
Apr 20 19:20:39.076331 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:39.076311 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-5c7r9"
Apr 20 19:20:39.082883 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:39.082863 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-62ns8"
Apr 20 19:20:39.091441 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:39.091423 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-7stt2"
Apr 20 19:20:39.097967 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:39.097949 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-zf7kf"
Apr 20 19:20:39.103535 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:39.103516 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:20:39.109051 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:39.109034 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-nqj8h"
Apr 20 19:20:39.346041 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:39.346009 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69310308_59c8_4043_9117_c0e3a4104e6e.slice/crio-0f995fe56b9eb560310b9d95a1f233f8a10943064c880dcb7836feabeb45e6aa WatchSource:0}: Error finding container 0f995fe56b9eb560310b9d95a1f233f8a10943064c880dcb7836feabeb45e6aa: Status 404 returned error can't find the container with id 0f995fe56b9eb560310b9d95a1f233f8a10943064c880dcb7836feabeb45e6aa
Apr 20 19:20:39.349511 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:39.349484 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0fd7043_be2f_4ea6_8e8e_0c1ad9b57cf7.slice/crio-4d53e74f37259acd0219ae3bd47fd424bd274b13df7a324a28b74b669e5e958c WatchSource:0}: Error finding container 4d53e74f37259acd0219ae3bd47fd424bd274b13df7a324a28b74b669e5e958c: Status 404 returned error can't find the container with id 4d53e74f37259acd0219ae3bd47fd424bd274b13df7a324a28b74b669e5e958c
Apr 20 19:20:39.350148 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:39.350109 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbde3fba3_4508_40ec_89b5_45e19b836b5e.slice/crio-16ce02134e278a0631757dc3f712d55d07fb8f874214d289e816a03775e7c7c9 WatchSource:0}: Error finding container 16ce02134e278a0631757dc3f712d55d07fb8f874214d289e816a03775e7c7c9: Status 404 returned error can't find the container with id 16ce02134e278a0631757dc3f712d55d07fb8f874214d289e816a03775e7c7c9
Apr 20 19:20:39.351096 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:39.351073 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf331ebff_9be1_4254_b31d_7bcbbc5bbf98.slice/crio-dc05c9424326b2d98a210e679a3708a8d3d8ab905d6dbe9863e4d8526827f706 WatchSource:0}: Error finding container dc05c9424326b2d98a210e679a3708a8d3d8ab905d6dbe9863e4d8526827f706: Status 404 returned error can't find the container with id dc05c9424326b2d98a210e679a3708a8d3d8ab905d6dbe9863e4d8526827f706
Apr 20 19:20:39.352294 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:39.352023 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode6858c48_14e7_4cc5_a1cd_0554a465f2db.slice/crio-e329a866934e3c7c0d4266454c3e2149559d97a60033a8d86eefd612a717dd68 WatchSource:0}: Error finding container e329a866934e3c7c0d4266454c3e2149559d97a60033a8d86eefd612a717dd68: Status 404 returned error can't find the container with id e329a866934e3c7c0d4266454c3e2149559d97a60033a8d86eefd612a717dd68
Apr 20 19:20:39.352846 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:39.352637 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda09cd02b_dfa8_4f51_abdc_9e5a0b219e23.slice/crio-b8486b46c108fc21f8339b99cf016f2c42c6c8fb1d5bf1a2e3ae146aa01db2ad WatchSource:0}: Error finding container b8486b46c108fc21f8339b99cf016f2c42c6c8fb1d5bf1a2e3ae146aa01db2ad: Status 404 returned error can't find the container with id b8486b46c108fc21f8339b99cf016f2c42c6c8fb1d5bf1a2e3ae146aa01db2ad
Apr 20 19:20:39.354099 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:20:39.353750 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf57b12eb_90dc_43f3_b677_c16555487307.slice/crio-4ec576b1b6ee0ecfb372ee8b35e488531410f8fbc46282cf91487c8bf44e96b9 WatchSource:0}: Error finding container 4ec576b1b6ee0ecfb372ee8b35e488531410f8fbc46282cf91487c8bf44e96b9: Status 404 returned error can't find the container with id 4ec576b1b6ee0ecfb372ee8b35e488531410f8fbc46282cf91487c8bf44e96b9
Apr 20 19:20:39.368754 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:39.368707 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs\") pod \"network-metrics-daemon-zdlvd\" (UID: \"9b4f6ab3-fddd-446f-8cbf-e372e1b901fe\") " pod="openshift-multus/network-metrics-daemon-zdlvd"
Apr 20 19:20:39.368884 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:39.368866 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:20:39.368946 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:39.368930 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs podName:9b4f6ab3-fddd-446f-8cbf-e372e1b901fe nodeName:}" failed. No retries permitted until 2026-04-20 19:20:40.368910379 +0000 UTC m=+3.988592650 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs") pod "network-metrics-daemon-zdlvd" (UID: "9b4f6ab3-fddd-446f-8cbf-e372e1b901fe") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:20:39.469228 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:39.469089 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwbpg\" (UniqueName: \"kubernetes.io/projected/4f25f47c-abed-4e2e-83fe-3edbfd02b4ff-kube-api-access-xwbpg\") pod \"network-check-target-7nwp8\" (UID: \"4f25f47c-abed-4e2e-83fe-3edbfd02b4ff\") " pod="openshift-network-diagnostics/network-check-target-7nwp8"
Apr 20 19:20:39.469228 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:39.469225 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:20:39.469371 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:39.469243 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:20:39.469371 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:39.469254 2564 projected.go:194] Error preparing data for projected volume kube-api-access-xwbpg for pod openshift-network-diagnostics/network-check-target-7nwp8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:20:39.469371 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:39.469294 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f25f47c-abed-4e2e-83fe-3edbfd02b4ff-kube-api-access-xwbpg podName:4f25f47c-abed-4e2e-83fe-3edbfd02b4ff nodeName:}" failed. No retries permitted until 2026-04-20 19:20:40.469281502 +0000 UTC m=+4.088963771 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-xwbpg" (UniqueName: "kubernetes.io/projected/4f25f47c-abed-4e2e-83fe-3edbfd02b4ff-kube-api-access-xwbpg") pod "network-check-target-7nwp8" (UID: "4f25f47c-abed-4e2e-83fe-3edbfd02b4ff") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:20:39.796912 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:39.796863 2564 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 19:15:37 +0000 UTC" deadline="2027-10-25 18:36:19.385665747 +0000 UTC"
Apr 20 19:20:39.796912 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:39.796905 2564 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13271h15m39.588764637s"
Apr 20 19:20:39.897924 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:39.897889 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-98.ec2.internal" event={"ID":"87dc53c55f73620bf5df44e2826c141e","Type":"ContainerStarted","Data":"3b7a5472e05d405cc73a17b15cda3b49d3b1879020fcc718354f5f1441449e76"}
Apr 20 19:20:39.906747 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:39.906716 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5c7r9" event={"ID":"f57b12eb-90dc-43f3-b677-c16555487307","Type":"ContainerStarted","Data":"4ec576b1b6ee0ecfb372ee8b35e488531410f8fbc46282cf91487c8bf44e96b9"}
Apr 20 19:20:39.912470 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:39.912436 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nqj8h" event={"ID":"e6858c48-14e7-4cc5-a1cd-0554a465f2db","Type":"ContainerStarted","Data":"e329a866934e3c7c0d4266454c3e2149559d97a60033a8d86eefd612a717dd68"}
Apr 20 19:20:39.927423 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:39.927378 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-sljxr" event={"ID":"bde3fba3-4508-40ec-89b5-45e19b836b5e","Type":"ContainerStarted","Data":"16ce02134e278a0631757dc3f712d55d07fb8f874214d289e816a03775e7c7c9"}
Apr 20 19:20:39.928781 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:39.928726 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-62ns8" event={"ID":"69310308-59c8-4043-9117-c0e3a4104e6e","Type":"ContainerStarted","Data":"0f995fe56b9eb560310b9d95a1f233f8a10943064c880dcb7836feabeb45e6aa"}
Apr 20 19:20:39.931878 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:39.931834 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zf7kf" event={"ID":"4cf05dbd-a831-4450-baf1-a340e0113d84","Type":"ContainerStarted","Data":"fbbfa948943e089e7a3af981ff1fa53925f54dc011d212f3aa94e3365a4f0172"}
Apr 20 19:20:39.939136 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:39.939111 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4" event={"ID":"0476b44e-b1ce-4787-8794-c21ac774e74d","Type":"ContainerStarted","Data":"a21378fd2c38501dc8a0830a07cd756416cde61c382e036e8c80918f86bdb793"}
Apr 20 19:20:39.945462 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:39.945438 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6rnf5" event={"ID":"a09cd02b-dfa8-4f51-abdc-9e5a0b219e23","Type":"ContainerStarted","Data":"b8486b46c108fc21f8339b99cf016f2c42c6c8fb1d5bf1a2e3ae146aa01db2ad"}
Apr 20 19:20:39.946901 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:39.946876 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7stt2" event={"ID":"f331ebff-9be1-4254-b31d-7bcbbc5bbf98","Type":"ContainerStarted","Data":"dc05c9424326b2d98a210e679a3708a8d3d8ab905d6dbe9863e4d8526827f706"}
Apr 20 19:20:39.948796 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:39.948760 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" event={"ID":"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7","Type":"ContainerStarted","Data":"4d53e74f37259acd0219ae3bd47fd424bd274b13df7a324a28b74b669e5e958c"}
Apr 20 19:20:40.375452 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:40.375421 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs\") pod \"network-metrics-daemon-zdlvd\" (UID: \"9b4f6ab3-fddd-446f-8cbf-e372e1b901fe\") " pod="openshift-multus/network-metrics-daemon-zdlvd"
Apr 20 19:20:40.375605 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:40.375587 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:20:40.375678 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:40.375657 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs podName:9b4f6ab3-fddd-446f-8cbf-e372e1b901fe nodeName:}" failed. No retries permitted until 2026-04-20 19:20:42.375637519 +0000 UTC m=+5.995319789 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs") pod "network-metrics-daemon-zdlvd" (UID: "9b4f6ab3-fddd-446f-8cbf-e372e1b901fe") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:20:40.476316 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:40.476238 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwbpg\" (UniqueName: \"kubernetes.io/projected/4f25f47c-abed-4e2e-83fe-3edbfd02b4ff-kube-api-access-xwbpg\") pod \"network-check-target-7nwp8\" (UID: \"4f25f47c-abed-4e2e-83fe-3edbfd02b4ff\") " pod="openshift-network-diagnostics/network-check-target-7nwp8"
Apr 20 19:20:40.476457 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:40.476389 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:20:40.476457 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:40.476408 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:20:40.476457 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:40.476420 2564 projected.go:194] Error preparing data for projected volume kube-api-access-xwbpg for pod openshift-network-diagnostics/network-check-target-7nwp8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:20:40.476611 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:40.476473 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f25f47c-abed-4e2e-83fe-3edbfd02b4ff-kube-api-access-xwbpg podName:4f25f47c-abed-4e2e-83fe-3edbfd02b4ff nodeName:}" failed. No retries permitted until 2026-04-20 19:20:42.476453751 +0000 UTC m=+6.096136030 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-xwbpg" (UniqueName: "kubernetes.io/projected/4f25f47c-abed-4e2e-83fe-3edbfd02b4ff-kube-api-access-xwbpg") pod "network-check-target-7nwp8" (UID: "4f25f47c-abed-4e2e-83fe-3edbfd02b4ff") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:20:40.881824 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:40.881151 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7nwp8"
Apr 20 19:20:40.881824 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:40.881207 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zdlvd"
Apr 20 19:20:40.881824 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:40.881319 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zdlvd" podUID="9b4f6ab3-fddd-446f-8cbf-e372e1b901fe"
Apr 20 19:20:40.881824 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:40.881736 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7nwp8" podUID="4f25f47c-abed-4e2e-83fe-3edbfd02b4ff"
Apr 20 19:20:40.970523 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:40.969221 2564 generic.go:358] "Generic (PLEG): container finished" podID="32cc11a6fe1288d8e923d33bdeaf02c1" containerID="6b414b0f9409ad56277a47df968c93454734eac0d09fcced7390e31a0c516ae3" exitCode=0
Apr 20 19:20:40.970523 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:40.970314 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal" event={"ID":"32cc11a6fe1288d8e923d33bdeaf02c1","Type":"ContainerDied","Data":"6b414b0f9409ad56277a47df968c93454734eac0d09fcced7390e31a0c516ae3"}
Apr 20 19:20:40.988428 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:40.988099 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-129-98.ec2.internal" podStartSLOduration=2.988081563 podStartE2EDuration="2.988081563s" podCreationTimestamp="2026-04-20 19:20:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:20:39.917745279 +0000 UTC m=+3.537427567" watchObservedRunningTime="2026-04-20 19:20:40.988081563 +0000 UTC m=+4.607763851"
Apr 20 19:20:41.977062 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:41.977026 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal" event={"ID":"32cc11a6fe1288d8e923d33bdeaf02c1","Type":"ContainerStarted","Data":"d8f380009139879adbce9be6c60b17bbd7500edbd2a629080db35e0624ffcc93"}
Apr 20 19:20:42.389704 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:42.389570 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs\") pod \"network-metrics-daemon-zdlvd\" (UID: \"9b4f6ab3-fddd-446f-8cbf-e372e1b901fe\") " pod="openshift-multus/network-metrics-daemon-zdlvd"
Apr 20 19:20:42.389867 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:42.389702 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:20:42.389867 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:42.389784 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs podName:9b4f6ab3-fddd-446f-8cbf-e372e1b901fe nodeName:}" failed. No retries permitted until 2026-04-20 19:20:46.389764406 +0000 UTC m=+10.009446690 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs") pod "network-metrics-daemon-zdlvd" (UID: "9b4f6ab3-fddd-446f-8cbf-e372e1b901fe") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:20:42.490906 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:42.490850 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwbpg\" (UniqueName: \"kubernetes.io/projected/4f25f47c-abed-4e2e-83fe-3edbfd02b4ff-kube-api-access-xwbpg\") pod \"network-check-target-7nwp8\" (UID: \"4f25f47c-abed-4e2e-83fe-3edbfd02b4ff\") " pod="openshift-network-diagnostics/network-check-target-7nwp8"
Apr 20 19:20:42.491102 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:42.491069 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:20:42.491102 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:42.491089 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:20:42.491102 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:42.491100 2564 projected.go:194] Error preparing data for projected volume kube-api-access-xwbpg for pod openshift-network-diagnostics/network-check-target-7nwp8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:20:42.491256 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:42.491158 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f25f47c-abed-4e2e-83fe-3edbfd02b4ff-kube-api-access-xwbpg podName:4f25f47c-abed-4e2e-83fe-3edbfd02b4ff nodeName:}" failed. No retries permitted until 2026-04-20 19:20:46.491138934 +0000 UTC m=+10.110821220 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-xwbpg" (UniqueName: "kubernetes.io/projected/4f25f47c-abed-4e2e-83fe-3edbfd02b4ff-kube-api-access-xwbpg") pod "network-check-target-7nwp8" (UID: "4f25f47c-abed-4e2e-83fe-3edbfd02b4ff") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:20:42.881275 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:42.881072 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7nwp8"
Apr 20 19:20:42.881275 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:42.881202 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7nwp8" podUID="4f25f47c-abed-4e2e-83fe-3edbfd02b4ff"
Apr 20 19:20:42.881275 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:42.881073 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zdlvd"
Apr 20 19:20:42.881623 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:42.881359 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zdlvd" podUID="9b4f6ab3-fddd-446f-8cbf-e372e1b901fe"
Apr 20 19:20:44.881870 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:44.881365 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7nwp8"
Apr 20 19:20:44.881870 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:44.881365 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zdlvd"
Apr 20 19:20:44.881870 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:44.881500 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7nwp8" podUID="4f25f47c-abed-4e2e-83fe-3edbfd02b4ff"
Apr 20 19:20:44.881870 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:44.881555 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zdlvd" podUID="9b4f6ab3-fddd-446f-8cbf-e372e1b901fe"
Apr 20 19:20:46.422510 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:46.422369 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs\") pod \"network-metrics-daemon-zdlvd\" (UID: \"9b4f6ab3-fddd-446f-8cbf-e372e1b901fe\") " pod="openshift-multus/network-metrics-daemon-zdlvd"
Apr 20 19:20:46.423033 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:46.422545 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:20:46.423033 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:46.422604 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs podName:9b4f6ab3-fddd-446f-8cbf-e372e1b901fe nodeName:}" failed. No retries permitted until 2026-04-20 19:20:54.422586649 +0000 UTC m=+18.042268931 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs") pod "network-metrics-daemon-zdlvd" (UID: "9b4f6ab3-fddd-446f-8cbf-e372e1b901fe") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:20:46.522897 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:46.522855 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwbpg\" (UniqueName: \"kubernetes.io/projected/4f25f47c-abed-4e2e-83fe-3edbfd02b4ff-kube-api-access-xwbpg\") pod \"network-check-target-7nwp8\" (UID: \"4f25f47c-abed-4e2e-83fe-3edbfd02b4ff\") " pod="openshift-network-diagnostics/network-check-target-7nwp8"
Apr 20 19:20:46.523070 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:46.523033 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:20:46.523070 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:46.523051 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:20:46.523070 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:46.523062 2564 projected.go:194] Error preparing data for projected volume kube-api-access-xwbpg for pod openshift-network-diagnostics/network-check-target-7nwp8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:20:46.523234 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:46.523116 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f25f47c-abed-4e2e-83fe-3edbfd02b4ff-kube-api-access-xwbpg podName:4f25f47c-abed-4e2e-83fe-3edbfd02b4ff nodeName:}" failed. No retries permitted until 2026-04-20 19:20:54.523098722 +0000 UTC m=+18.142780991 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-xwbpg" (UniqueName: "kubernetes.io/projected/4f25f47c-abed-4e2e-83fe-3edbfd02b4ff-kube-api-access-xwbpg") pod "network-check-target-7nwp8" (UID: "4f25f47c-abed-4e2e-83fe-3edbfd02b4ff") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:20:46.881586 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:46.881497 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zdlvd"
Apr 20 19:20:46.881783 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:46.881616 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7nwp8"
Apr 20 19:20:46.881783 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:46.881641 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zdlvd" podUID="9b4f6ab3-fddd-446f-8cbf-e372e1b901fe"
Apr 20 19:20:46.881783 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:46.881687 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7nwp8" podUID="4f25f47c-abed-4e2e-83fe-3edbfd02b4ff"
Apr 20 19:20:48.880617 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:48.880579 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zdlvd"
Apr 20 19:20:48.881043 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:48.880579 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7nwp8"
Apr 20 19:20:48.881043 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:48.880715 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zdlvd" podUID="9b4f6ab3-fddd-446f-8cbf-e372e1b901fe"
Apr 20 19:20:48.881043 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:48.880801 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7nwp8" podUID="4f25f47c-abed-4e2e-83fe-3edbfd02b4ff"
Apr 20 19:20:50.880858 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:50.880824 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zdlvd"
Apr 20 19:20:50.881327 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:50.880964 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zdlvd" podUID="9b4f6ab3-fddd-446f-8cbf-e372e1b901fe"
Apr 20 19:20:50.881327 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:50.881031 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7nwp8"
Apr 20 19:20:50.881327 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:50.881146 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7nwp8" podUID="4f25f47c-abed-4e2e-83fe-3edbfd02b4ff"
Apr 20 19:20:52.880827 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:52.880793 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zdlvd"
Apr 20 19:20:52.881307 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:52.880793 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7nwp8"
Apr 20 19:20:52.881307 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:52.880926 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zdlvd" podUID="9b4f6ab3-fddd-446f-8cbf-e372e1b901fe"
Apr 20 19:20:52.881307 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:52.881005 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7nwp8" podUID="4f25f47c-abed-4e2e-83fe-3edbfd02b4ff"
Apr 20 19:20:54.480851 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:54.480804 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs\") pod \"network-metrics-daemon-zdlvd\" (UID: \"9b4f6ab3-fddd-446f-8cbf-e372e1b901fe\") " pod="openshift-multus/network-metrics-daemon-zdlvd"
Apr 20 19:20:54.481351 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:54.480984 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:20:54.481351 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:54.481076 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs podName:9b4f6ab3-fddd-446f-8cbf-e372e1b901fe nodeName:}" failed. No retries permitted until 2026-04-20 19:21:10.481055657 +0000 UTC m=+34.100737923 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs") pod "network-metrics-daemon-zdlvd" (UID: "9b4f6ab3-fddd-446f-8cbf-e372e1b901fe") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:20:54.581575 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:54.581550 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwbpg\" (UniqueName: \"kubernetes.io/projected/4f25f47c-abed-4e2e-83fe-3edbfd02b4ff-kube-api-access-xwbpg\") pod \"network-check-target-7nwp8\" (UID: \"4f25f47c-abed-4e2e-83fe-3edbfd02b4ff\") " pod="openshift-network-diagnostics/network-check-target-7nwp8"
Apr 20 19:20:54.581733 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:54.581712 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:20:54.581733 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:54.581730 2564 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:20:54.581811 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:54.581739 2564 projected.go:194] Error preparing data for projected volume kube-api-access-xwbpg for pod openshift-network-diagnostics/network-check-target-7nwp8: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:20:54.581811 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:54.581783 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4f25f47c-abed-4e2e-83fe-3edbfd02b4ff-kube-api-access-xwbpg podName:4f25f47c-abed-4e2e-83fe-3edbfd02b4ff nodeName:}" failed. No retries permitted until 2026-04-20 19:21:10.581770619 +0000 UTC m=+34.201452885 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-xwbpg" (UniqueName: "kubernetes.io/projected/4f25f47c-abed-4e2e-83fe-3edbfd02b4ff-kube-api-access-xwbpg") pod "network-check-target-7nwp8" (UID: "4f25f47c-abed-4e2e-83fe-3edbfd02b4ff") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:20:54.881087 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:54.881026 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zdlvd"
Apr 20 19:20:54.881257 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:54.881026 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7nwp8"
Apr 20 19:20:54.881257 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:54.881172 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zdlvd" podUID="9b4f6ab3-fddd-446f-8cbf-e372e1b901fe"
pod="openshift-multus/network-metrics-daemon-zdlvd" podUID="9b4f6ab3-fddd-446f-8cbf-e372e1b901fe" Apr 20 19:20:54.881257 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:54.881239 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7nwp8" podUID="4f25f47c-abed-4e2e-83fe-3edbfd02b4ff" Apr 20 19:20:56.883205 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:56.882975 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zdlvd" Apr 20 19:20:56.883695 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:56.883426 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zdlvd" podUID="9b4f6ab3-fddd-446f-8cbf-e372e1b901fe" Apr 20 19:20:56.884616 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:56.883134 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7nwp8" Apr 20 19:20:56.884707 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:56.884613 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7nwp8" podUID="4f25f47c-abed-4e2e-83fe-3edbfd02b4ff" Apr 20 19:20:57.000318 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:57.000184 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-5c7r9" event={"ID":"f57b12eb-90dc-43f3-b677-c16555487307","Type":"ContainerStarted","Data":"7975e691d742236235eea7bebae73b8a0a07098345da3fa89bc75d2c39b02af5"} Apr 20 19:20:57.001395 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:57.001371 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-nqj8h" event={"ID":"e6858c48-14e7-4cc5-a1cd-0554a465f2db","Type":"ContainerStarted","Data":"e2445c7834133f7cf12829c028493f0614b66addb824df1f077e3383bd7690dc"} Apr 20 19:20:57.002528 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:57.002507 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-zf7kf" event={"ID":"4cf05dbd-a831-4450-baf1-a340e0113d84","Type":"ContainerStarted","Data":"b918f615ac6de5e44db4bb3eb4decc1084f3bce764547bc8232d5cc6b6153825"} Apr 20 19:20:57.003742 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:57.003721 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4" event={"ID":"0476b44e-b1ce-4787-8794-c21ac774e74d","Type":"ContainerStarted","Data":"72347744253ca87de957d1c4d7ba965f05d8cf2425e0a7ec32dac063fcfa5675"} Apr 20 19:20:57.005139 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:57.005116 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-6rnf5" event={"ID":"a09cd02b-dfa8-4f51-abdc-9e5a0b219e23","Type":"ContainerStarted","Data":"f694104829a9d373f560da44886c58505e8ace66a3d72878b1f2269219ce7314"} Apr 20 19:20:57.006581 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:57.006563 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7stt2" event={"ID":"f331ebff-9be1-4254-b31d-7bcbbc5bbf98","Type":"ContainerStarted","Data":"1cd54665fe31b20d5d714c42a6563af8376db8a871597b78cf1c31d1cc781bcd"} Apr 20 19:20:57.007988 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:57.007963 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5z9f_e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7/ovn-acl-logging/0.log" Apr 20 19:20:57.008254 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:57.008234 2564 generic.go:358] "Generic (PLEG): container finished" podID="e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7" containerID="aace82a5841bdc8cf581dac7fca5445ee295e0b1a7b1e3df5b55ba9a3141f24b" exitCode=1 Apr 20 19:20:57.008317 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:57.008268 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" event={"ID":"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7","Type":"ContainerStarted","Data":"fdea4ad383dadcae61ece0f3aa29e00790e8ecaf68040e351f45cafe263c13bb"} Apr 20 19:20:57.008317 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:57.008289 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" event={"ID":"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7","Type":"ContainerDied","Data":"aace82a5841bdc8cf581dac7fca5445ee295e0b1a7b1e3df5b55ba9a3141f24b"} Apr 20 19:20:57.008317 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:57.008304 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" 
event={"ID":"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7","Type":"ContainerStarted","Data":"c9cdb18f7ba0328b596df8a5edd33415411b3a5473d71c35cd45876a0bb501ef"} Apr 20 19:20:57.025237 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:57.025137 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-129-98.ec2.internal" podStartSLOduration=19.025098668 podStartE2EDuration="19.025098668s" podCreationTimestamp="2026-04-20 19:20:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:20:41.992513748 +0000 UTC m=+5.612196036" watchObservedRunningTime="2026-04-20 19:20:57.025098668 +0000 UTC m=+20.644780956" Apr 20 19:20:57.025628 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:57.025592 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-5c7r9" podStartSLOduration=2.83849523 podStartE2EDuration="20.025583737s" podCreationTimestamp="2026-04-20 19:20:37 +0000 UTC" firstStartedPulling="2026-04-20 19:20:39.356178357 +0000 UTC m=+2.975860630" lastFinishedPulling="2026-04-20 19:20:56.543266867 +0000 UTC m=+20.162949137" observedRunningTime="2026-04-20 19:20:57.025262466 +0000 UTC m=+20.644944741" watchObservedRunningTime="2026-04-20 19:20:57.025583737 +0000 UTC m=+20.645266024" Apr 20 19:20:57.068846 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:57.068795 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-zf7kf" podStartSLOduration=2.8834875760000003 podStartE2EDuration="20.068777517s" podCreationTimestamp="2026-04-20 19:20:37 +0000 UTC" firstStartedPulling="2026-04-20 19:20:39.358304159 +0000 UTC m=+2.977986435" lastFinishedPulling="2026-04-20 19:20:56.543594095 +0000 UTC m=+20.163276376" observedRunningTime="2026-04-20 19:20:57.068095485 +0000 UTC m=+20.687777772" watchObservedRunningTime="2026-04-20 19:20:57.068777517 +0000 UTC m=+20.688459805" Apr 20 19:20:57.131097 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:57.131053 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-6rnf5" podStartSLOduration=3.942621925 podStartE2EDuration="21.131038388s" podCreationTimestamp="2026-04-20 19:20:36 +0000 UTC" firstStartedPulling="2026-04-20 19:20:39.354702681 +0000 UTC m=+2.974384948" lastFinishedPulling="2026-04-20 19:20:56.543119143 +0000 UTC m=+20.162801411" observedRunningTime="2026-04-20 19:20:57.09961588 +0000 UTC m=+20.719298166" watchObservedRunningTime="2026-04-20 19:20:57.131038388 +0000 UTC m=+20.750720674" Apr 20 19:20:57.277871 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:57.277826 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-nqj8h" Apr 20 19:20:57.278443 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:57.278426 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-nqj8h" Apr 20 19:20:57.295161 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:57.295122 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-nqj8h" podStartSLOduration=11.456674206 podStartE2EDuration="20.295107067s" podCreationTimestamp="2026-04-20 19:20:37 +0000 UTC" firstStartedPulling="2026-04-20 19:20:39.353974808 +0000 UTC m=+2.973657080" lastFinishedPulling="2026-04-20 19:20:48.192407661 +0000 
UTC m=+11.812089941" observedRunningTime="2026-04-20 19:20:57.158740439 +0000 UTC m=+20.778422727" watchObservedRunningTime="2026-04-20 19:20:57.295107067 +0000 UTC m=+20.914789352" Apr 20 19:20:57.746026 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:57.745983 2564 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 19:20:57.817115 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:57.817040 2564 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T19:20:57.746020902Z","UUID":"9208e8df-af39-499f-9bd2-5349aeb5df02","Handler":null,"Name":"","Endpoint":""} Apr 20 19:20:57.819189 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:57.819172 2564 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 19:20:57.819267 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:57.819195 2564 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 19:20:58.010642 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:58.010615 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-sljxr" event={"ID":"bde3fba3-4508-40ec-89b5-45e19b836b5e","Type":"ContainerStarted","Data":"b3827bbd07629e8f22913458c93165d41ef1342bf48c1ff99428d59975c59902"} Apr 20 19:20:58.011884 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:58.011856 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-62ns8" event={"ID":"69310308-59c8-4043-9117-c0e3a4104e6e","Type":"ContainerStarted","Data":"0cf18ea96bc150df21036ec8b5526520989e0646bf2d760cffc1c68e9d6302d4"} Apr 20 19:20:58.013290 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:58.013265 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4" event={"ID":"0476b44e-b1ce-4787-8794-c21ac774e74d","Type":"ContainerStarted","Data":"8539e8b5b025489b9b4597d545e2d918feca5855de812e669297a825a4a3dd59"} Apr 20 19:20:58.014578 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:58.014560 2564 generic.go:358] "Generic (PLEG): container finished" podID="f331ebff-9be1-4254-b31d-7bcbbc5bbf98" containerID="1cd54665fe31b20d5d714c42a6563af8376db8a871597b78cf1c31d1cc781bcd" exitCode=0 Apr 20 19:20:58.014641 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:58.014625 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7stt2" event={"ID":"f331ebff-9be1-4254-b31d-7bcbbc5bbf98","Type":"ContainerDied","Data":"1cd54665fe31b20d5d714c42a6563af8376db8a871597b78cf1c31d1cc781bcd"} Apr 20 19:20:58.017313 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:58.017293 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5z9f_e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7/ovn-acl-logging/0.log" Apr 20 19:20:58.017697 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:58.017675 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" event={"ID":"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7","Type":"ContainerStarted","Data":"8a869191e58f6988bb377d997931075ab8cca7c77e6ed66a671d7c144d4279ce"} Apr 20 19:20:58.017781 ip-10-0-129-98 
kubenswrapper[2564]: I0420 19:20:58.017702 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" event={"ID":"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7","Type":"ContainerStarted","Data":"1f9e37ffc9409bf7234bcc05ce9a6a5f606e865e508c809c12a06dae07f62c74"} Apr 20 19:20:58.017781 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:58.017716 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" event={"ID":"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7","Type":"ContainerStarted","Data":"d27e3016a83fec9cb9d86a77664bd26c18ff502a8a71e57b3f8faf4e84fbe50b"} Apr 20 19:20:58.017872 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:58.017801 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-nqj8h" Apr 20 19:20:58.018851 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:58.018835 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-nqj8h" Apr 20 19:20:58.052837 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:58.052803 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-sljxr" podStartSLOduration=4.861890397 podStartE2EDuration="22.052793242s" podCreationTimestamp="2026-04-20 19:20:36 +0000 UTC" firstStartedPulling="2026-04-20 19:20:39.351808303 +0000 UTC m=+2.971490573" lastFinishedPulling="2026-04-20 19:20:56.54271115 +0000 UTC m=+20.162393418" observedRunningTime="2026-04-20 19:20:58.052755654 +0000 UTC m=+21.672437941" watchObservedRunningTime="2026-04-20 19:20:58.052793242 +0000 UTC m=+21.672475528" Apr 20 19:20:58.116631 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:58.116572 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-62ns8" podStartSLOduration=3.496551082 podStartE2EDuration="21.116561248s" podCreationTimestamp="2026-04-20 19:20:37 +0000 UTC" firstStartedPulling="2026-04-20 19:20:39.348305357 +0000 UTC m=+2.967987622" lastFinishedPulling="2026-04-20 19:20:56.96831552 +0000 UTC m=+20.587997788" observedRunningTime="2026-04-20 19:20:58.116336727 +0000 UTC m=+21.736019015" watchObservedRunningTime="2026-04-20 19:20:58.116561248 +0000 UTC m=+21.736243513" Apr 20 19:20:58.881203 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:58.881172 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zdlvd" Apr 20 19:20:58.881450 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:58.881295 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zdlvd" podUID="9b4f6ab3-fddd-446f-8cbf-e372e1b901fe" Apr 20 19:20:58.881450 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:58.881340 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7nwp8" Apr 20 19:20:58.881612 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:20:58.881459 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7nwp8" podUID="4f25f47c-abed-4e2e-83fe-3edbfd02b4ff" Apr 20 19:20:59.021328 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:59.021278 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4" event={"ID":"0476b44e-b1ce-4787-8794-c21ac774e74d","Type":"ContainerStarted","Data":"f51ddd05bb4f079d6ed9adb9cb17e69264c3ef0e1b905dc478a4eb3a6c2142de"} Apr 20 19:20:59.057018 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:20:59.056965 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-hklj4" podStartSLOduration=3.919726433 podStartE2EDuration="23.056953598s" podCreationTimestamp="2026-04-20 19:20:36 +0000 UTC" firstStartedPulling="2026-04-20 19:20:39.357020269 +0000 UTC m=+2.976702533" lastFinishedPulling="2026-04-20 19:20:58.494247431 +0000 UTC m=+22.113929698" observedRunningTime="2026-04-20 19:20:59.056750499 +0000 UTC m=+22.676432800" watchObservedRunningTime="2026-04-20 19:20:59.056953598 +0000 UTC m=+22.676635885" Apr 20 19:21:00.026751 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:00.026553 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5z9f_e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7/ovn-acl-logging/0.log" Apr 20 19:21:00.027275 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:00.027109 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" event={"ID":"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7","Type":"ContainerStarted","Data":"36956b1c63b24833068c0270f708d5ec61431ca7c4a44d240ad606bb1e123281"} Apr 20 19:21:00.881041 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:00.881010 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zdlvd" Apr 20 19:21:00.881041 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:00.881045 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7nwp8" Apr 20 19:21:00.881278 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:00.881121 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zdlvd" podUID="9b4f6ab3-fddd-446f-8cbf-e372e1b901fe" Apr 20 19:21:00.881339 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:00.881265 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-7nwp8" podUID="4f25f47c-abed-4e2e-83fe-3edbfd02b4ff" Apr 20 19:21:02.881488 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:02.881307 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7nwp8" Apr 20 19:21:02.882103 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:02.881307 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zdlvd" Apr 20 19:21:02.882103 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:02.881551 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7nwp8" podUID="4f25f47c-abed-4e2e-83fe-3edbfd02b4ff" Apr 20 19:21:02.882103 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:02.881654 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zdlvd" podUID="9b4f6ab3-fddd-446f-8cbf-e372e1b901fe" Apr 20 19:21:03.034082 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:03.034054 2564 generic.go:358] "Generic (PLEG): container finished" podID="f331ebff-9be1-4254-b31d-7bcbbc5bbf98" containerID="935b3d78e8b1298af00a88ab2f7faf243496e96850e05849c5e7c063f93f7dc3" exitCode=0 Apr 20 19:21:03.034292 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:03.034119 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7stt2" event={"ID":"f331ebff-9be1-4254-b31d-7bcbbc5bbf98","Type":"ContainerDied","Data":"935b3d78e8b1298af00a88ab2f7faf243496e96850e05849c5e7c063f93f7dc3"} Apr 20 19:21:03.037128 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:03.037105 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5z9f_e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7/ovn-acl-logging/0.log" Apr 20 19:21:03.037538 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:03.037516 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" event={"ID":"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7","Type":"ContainerStarted","Data":"a5aa6fd0d6f233a960d43bc7f376cc63ea1a9d1b403ea74b17e7451eae52fc89"} Apr 20 19:21:03.037786 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:03.037767 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 19:21:03.037786 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:03.037794 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 19:21:03.037950 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:03.037931 2564 scope.go:117] "RemoveContainer" containerID="aace82a5841bdc8cf581dac7fca5445ee295e0b1a7b1e3df5b55ba9a3141f24b" Apr 20 19:21:03.053165 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:03.053148 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 19:21:03.949286 
ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:03.949085 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zdlvd"] Apr 20 19:21:03.949706 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:03.949331 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zdlvd" Apr 20 19:21:03.949706 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:03.949422 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zdlvd" podUID="9b4f6ab3-fddd-446f-8cbf-e372e1b901fe" Apr 20 19:21:03.960401 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:03.960369 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7nwp8"] Apr 20 19:21:03.960487 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:03.960453 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7nwp8" Apr 20 19:21:03.960540 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:03.960521 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7nwp8" podUID="4f25f47c-abed-4e2e-83fe-3edbfd02b4ff" Apr 20 19:21:04.041507 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:04.041479 2564 generic.go:358] "Generic (PLEG): container finished" podID="f331ebff-9be1-4254-b31d-7bcbbc5bbf98" containerID="75e751db471f13a4d758d263a2495f4d5dd2fc3d83de90c6913e812892e43a29" exitCode=0 Apr 20 19:21:04.041654 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:04.041548 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7stt2" event={"ID":"f331ebff-9be1-4254-b31d-7bcbbc5bbf98","Type":"ContainerDied","Data":"75e751db471f13a4d758d263a2495f4d5dd2fc3d83de90c6913e812892e43a29"} Apr 20 19:21:04.045041 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:04.045023 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5z9f_e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7/ovn-acl-logging/0.log" Apr 20 19:21:04.045360 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:04.045340 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" event={"ID":"e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7","Type":"ContainerStarted","Data":"52fe9707777caf41ba7327eef4d5bfef9f865d6c4fca651da642dc65572a530b"} Apr 20 19:21:04.045571 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:04.045550 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 19:21:04.059509 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:04.059490 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" Apr 20 19:21:04.086633 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:04.086597 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f" podStartSLOduration=9.857642944 podStartE2EDuration="27.086584772s" podCreationTimestamp="2026-04-20 19:20:37 +0000 UTC" firstStartedPulling="2026-04-20 19:20:39.351348551 +0000 UTC m=+2.971030830" lastFinishedPulling="2026-04-20 19:20:56.580290382 +0000 UTC m=+20.199972658" observedRunningTime="2026-04-20 19:21:04.085614038 +0000 UTC m=+27.705296326" watchObservedRunningTime="2026-04-20 19:21:04.086584772 +0000 UTC m=+27.706267058" Apr 20 19:21:05.049650 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:05.049620 2564 generic.go:358] "Generic (PLEG): container finished" podID="f331ebff-9be1-4254-b31d-7bcbbc5bbf98" containerID="eaa948fb6c32e3551e7eb46b2d5b1e706bfa42205486995e42dd2c38cfd2bb23" exitCode=0 Apr 20 19:21:05.050064 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:05.049705 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7stt2" event={"ID":"f331ebff-9be1-4254-b31d-7bcbbc5bbf98","Type":"ContainerDied","Data":"eaa948fb6c32e3551e7eb46b2d5b1e706bfa42205486995e42dd2c38cfd2bb23"} Apr 20 19:21:05.880741 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:05.880712 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7nwp8" Apr 20 19:21:05.880922 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:05.880724 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zdlvd" Apr 20 19:21:05.880922 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:05.880826 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7nwp8" podUID="4f25f47c-abed-4e2e-83fe-3edbfd02b4ff" Apr 20 19:21:05.881056 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:05.880941 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zdlvd" podUID="9b4f6ab3-fddd-446f-8cbf-e372e1b901fe" Apr 20 19:21:07.880609 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:07.880574 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zdlvd" Apr 20 19:21:07.880609 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:07.880612 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7nwp8" Apr 20 19:21:07.881234 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:07.880703 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zdlvd" podUID="9b4f6ab3-fddd-446f-8cbf-e372e1b901fe" Apr 20 19:21:07.881234 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:07.880810 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7nwp8" podUID="4f25f47c-abed-4e2e-83fe-3edbfd02b4ff" Apr 20 19:21:09.705548 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.705461 2564 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-129-98.ec2.internal" event="NodeReady" Apr 20 19:21:09.706090 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.705634 2564 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 19:21:09.756454 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.756426 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-8m7qk"] Apr 20 19:21:09.782888 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.782853 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-ps5gm"] Apr 20 19:21:09.783048 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.782978 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8m7qk" Apr 20 19:21:09.785624 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.785601 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 19:21:09.785745 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.785601 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 19:21:09.785889 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.785872 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 19:21:09.785960 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.785933 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ltv6t\"" Apr 20 19:21:09.798581 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.798558 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8m7qk"] Apr 20 19:21:09.799047 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.798670 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ps5gm"] Apr 20 19:21:09.799047 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.798773 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-ps5gm" Apr 20 19:21:09.801063 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.801041 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 19:21:09.801215 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.801194 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 19:21:09.801312 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.801247 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qtv7w\"" Apr 20 19:21:09.881121 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.881096 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7nwp8" Apr 20 19:21:09.881121 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.881111 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zdlvd" Apr 20 19:21:09.884305 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.884200 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gxr6p\"" Apr 20 19:21:09.884433 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.884356 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-67k9h\"" Apr 20 19:21:09.884433 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.884384 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 19:21:09.884433 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.884424 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 19:21:09.885236 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.885215 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 19:21:09.909166 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.909148 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5759257a-ffe8-4341-a52c-735c321d9f4a-config-volume\") pod \"dns-default-ps5gm\" (UID: \"5759257a-ffe8-4341-a52c-735c321d9f4a\") " pod="openshift-dns/dns-default-ps5gm" Apr 20 19:21:09.909278 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.909180 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5759257a-ffe8-4341-a52c-735c321d9f4a-metrics-tls\") pod \"dns-default-ps5gm\" (UID: \"5759257a-ffe8-4341-a52c-735c321d9f4a\") " pod="openshift-dns/dns-default-ps5gm" Apr 20 19:21:09.909278 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.909203 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5759257a-ffe8-4341-a52c-735c321d9f4a-tmp-dir\") pod \"dns-default-ps5gm\" (UID: \"5759257a-ffe8-4341-a52c-735c321d9f4a\") " pod="openshift-dns/dns-default-ps5gm" Apr 20 19:21:09.909278 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.909250 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-wcmjz\" (UniqueName: \"kubernetes.io/projected/5759257a-ffe8-4341-a52c-735c321d9f4a-kube-api-access-wcmjz\") pod \"dns-default-ps5gm\" (UID: \"5759257a-ffe8-4341-a52c-735c321d9f4a\") " pod="openshift-dns/dns-default-ps5gm" Apr 20 19:21:09.909425 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.909327 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-cert\") pod \"ingress-canary-8m7qk\" (UID: \"5ca54d9a-af5f-4f4f-b135-bfc6c5824e75\") " pod="openshift-ingress-canary/ingress-canary-8m7qk" Apr 20 19:21:09.909425 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:09.909389 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rhs8p\" (UniqueName: \"kubernetes.io/projected/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-kube-api-access-rhs8p\") pod \"ingress-canary-8m7qk\" (UID: \"5ca54d9a-af5f-4f4f-b135-bfc6c5824e75\") " pod="openshift-ingress-canary/ingress-canary-8m7qk" Apr 20 19:21:10.010452 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:10.010392 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wcmjz\" (UniqueName: \"kubernetes.io/projected/5759257a-ffe8-4341-a52c-735c321d9f4a-kube-api-access-wcmjz\") pod \"dns-default-ps5gm\" (UID: \"5759257a-ffe8-4341-a52c-735c321d9f4a\") " pod="openshift-dns/dns-default-ps5gm" Apr 20 19:21:10.010452 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:10.010427 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-cert\") pod \"ingress-canary-8m7qk\" (UID: \"5ca54d9a-af5f-4f4f-b135-bfc6c5824e75\") " pod="openshift-ingress-canary/ingress-canary-8m7qk" Apr 20 19:21:10.010644 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:10.010546 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:21:10.010644 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:10.010569 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rhs8p\" (UniqueName: \"kubernetes.io/projected/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-kube-api-access-rhs8p\") pod \"ingress-canary-8m7qk\" (UID: \"5ca54d9a-af5f-4f4f-b135-bfc6c5824e75\") " pod="openshift-ingress-canary/ingress-canary-8m7qk" Apr 20 19:21:10.010644 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:10.010606 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-cert podName:5ca54d9a-af5f-4f4f-b135-bfc6c5824e75 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:10.510587154 +0000 UTC m=+34.130269418 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-cert") pod "ingress-canary-8m7qk" (UID: "5ca54d9a-af5f-4f4f-b135-bfc6c5824e75") : secret "canary-serving-cert" not found Apr 20 19:21:10.010805 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:10.010668 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5759257a-ffe8-4341-a52c-735c321d9f4a-config-volume\") pod \"dns-default-ps5gm\" (UID: \"5759257a-ffe8-4341-a52c-735c321d9f4a\") " pod="openshift-dns/dns-default-ps5gm" Apr 20 19:21:10.010805 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:10.010697 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5759257a-ffe8-4341-a52c-735c321d9f4a-metrics-tls\") pod \"dns-default-ps5gm\" (UID: \"5759257a-ffe8-4341-a52c-735c321d9f4a\") " pod="openshift-dns/dns-default-ps5gm" Apr 20 19:21:10.010805 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:10.010726 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5759257a-ffe8-4341-a52c-735c321d9f4a-tmp-dir\") pod \"dns-default-ps5gm\" (UID: \"5759257a-ffe8-4341-a52c-735c321d9f4a\") " pod="openshift-dns/dns-default-ps5gm" Apr 20 19:21:10.010955 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:10.010879 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:21:10.010955 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:10.010941 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5759257a-ffe8-4341-a52c-735c321d9f4a-metrics-tls podName:5759257a-ffe8-4341-a52c-735c321d9f4a nodeName:}" failed. No retries permitted until 2026-04-20 19:21:10.510925939 +0000 UTC m=+34.130608214 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5759257a-ffe8-4341-a52c-735c321d9f4a-metrics-tls") pod "dns-default-ps5gm" (UID: "5759257a-ffe8-4341-a52c-735c321d9f4a") : secret "dns-default-metrics-tls" not found Apr 20 19:21:10.011370 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:10.011304 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5759257a-ffe8-4341-a52c-735c321d9f4a-config-volume\") pod \"dns-default-ps5gm\" (UID: \"5759257a-ffe8-4341-a52c-735c321d9f4a\") " pod="openshift-dns/dns-default-ps5gm" Apr 20 19:21:10.011370 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:10.011336 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/5759257a-ffe8-4341-a52c-735c321d9f4a-tmp-dir\") pod \"dns-default-ps5gm\" (UID: \"5759257a-ffe8-4341-a52c-735c321d9f4a\") " pod="openshift-dns/dns-default-ps5gm" Apr 20 19:21:10.021883 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:10.021862 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcmjz\" (UniqueName: \"kubernetes.io/projected/5759257a-ffe8-4341-a52c-735c321d9f4a-kube-api-access-wcmjz\") pod \"dns-default-ps5gm\" (UID: \"5759257a-ffe8-4341-a52c-735c321d9f4a\") " pod="openshift-dns/dns-default-ps5gm" Apr 20 19:21:10.021983 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:10.021901 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rhs8p\" (UniqueName: \"kubernetes.io/projected/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-kube-api-access-rhs8p\") pod \"ingress-canary-8m7qk\" (UID: \"5ca54d9a-af5f-4f4f-b135-bfc6c5824e75\") " pod="openshift-ingress-canary/ingress-canary-8m7qk" Apr 20 19:21:10.514390 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:10.514357 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-cert\") pod \"ingress-canary-8m7qk\" (UID: \"5ca54d9a-af5f-4f4f-b135-bfc6c5824e75\") " pod="openshift-ingress-canary/ingress-canary-8m7qk" Apr 20 19:21:10.514664 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:10.514439 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5759257a-ffe8-4341-a52c-735c321d9f4a-metrics-tls\") pod \"dns-default-ps5gm\" (UID: \"5759257a-ffe8-4341-a52c-735c321d9f4a\") " pod="openshift-dns/dns-default-ps5gm" Apr 20 19:21:10.514664 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:10.514475 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs\") pod \"network-metrics-daemon-zdlvd\" (UID: \"9b4f6ab3-fddd-446f-8cbf-e372e1b901fe\") " pod="openshift-multus/network-metrics-daemon-zdlvd" Apr 20 19:21:10.514664 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:10.514515 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:21:10.514664 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:10.514589 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 19:21:10.514664 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:10.514593 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: 
secret "dns-default-metrics-tls" not found Apr 20 19:21:10.514664 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:10.514592 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-cert podName:5ca54d9a-af5f-4f4f-b135-bfc6c5824e75 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:11.514571924 +0000 UTC m=+35.134254189 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-cert") pod "ingress-canary-8m7qk" (UID: "5ca54d9a-af5f-4f4f-b135-bfc6c5824e75") : secret "canary-serving-cert" not found Apr 20 19:21:10.514664 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:10.514648 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs podName:9b4f6ab3-fddd-446f-8cbf-e372e1b901fe nodeName:}" failed. No retries permitted until 2026-04-20 19:21:42.514632925 +0000 UTC m=+66.134315194 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs") pod "network-metrics-daemon-zdlvd" (UID: "9b4f6ab3-fddd-446f-8cbf-e372e1b901fe") : secret "metrics-daemon-secret" not found Apr 20 19:21:10.514664 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:10.514665 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5759257a-ffe8-4341-a52c-735c321d9f4a-metrics-tls podName:5759257a-ffe8-4341-a52c-735c321d9f4a nodeName:}" failed. No retries permitted until 2026-04-20 19:21:11.514654197 +0000 UTC m=+35.134336469 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5759257a-ffe8-4341-a52c-735c321d9f4a-metrics-tls") pod "dns-default-ps5gm" (UID: "5759257a-ffe8-4341-a52c-735c321d9f4a") : secret "dns-default-metrics-tls" not found Apr 20 19:21:10.615311 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:10.615273 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwbpg\" (UniqueName: \"kubernetes.io/projected/4f25f47c-abed-4e2e-83fe-3edbfd02b4ff-kube-api-access-xwbpg\") pod \"network-check-target-7nwp8\" (UID: \"4f25f47c-abed-4e2e-83fe-3edbfd02b4ff\") " pod="openshift-network-diagnostics/network-check-target-7nwp8" Apr 20 19:21:10.618341 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:10.618289 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwbpg\" (UniqueName: \"kubernetes.io/projected/4f25f47c-abed-4e2e-83fe-3edbfd02b4ff-kube-api-access-xwbpg\") pod \"network-check-target-7nwp8\" (UID: \"4f25f47c-abed-4e2e-83fe-3edbfd02b4ff\") " pod="openshift-network-diagnostics/network-check-target-7nwp8" Apr 20 19:21:10.794922 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:10.794886 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7nwp8" Apr 20 19:21:10.943779 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:10.943598 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7nwp8"] Apr 20 19:21:10.988501 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:21:10.988462 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f25f47c_abed_4e2e_83fe_3edbfd02b4ff.slice/crio-0f6267b00a355e0214d10aa525bd1f697b58cbad1a5000e2759d38ee0f88e5c5 WatchSource:0}: Error finding container 0f6267b00a355e0214d10aa525bd1f697b58cbad1a5000e2759d38ee0f88e5c5: Status 404 returned error can't find the container with id 0f6267b00a355e0214d10aa525bd1f697b58cbad1a5000e2759d38ee0f88e5c5 Apr 20 19:21:11.063829 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:11.063799 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7nwp8" event={"ID":"4f25f47c-abed-4e2e-83fe-3edbfd02b4ff","Type":"ContainerStarted","Data":"0f6267b00a355e0214d10aa525bd1f697b58cbad1a5000e2759d38ee0f88e5c5"} Apr 20 19:21:11.520277 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:11.520226 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-cert\") pod \"ingress-canary-8m7qk\" (UID: \"5ca54d9a-af5f-4f4f-b135-bfc6c5824e75\") " pod="openshift-ingress-canary/ingress-canary-8m7qk" Apr 20 19:21:11.520473 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:11.520331 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5759257a-ffe8-4341-a52c-735c321d9f4a-metrics-tls\") pod \"dns-default-ps5gm\" (UID: \"5759257a-ffe8-4341-a52c-735c321d9f4a\") " pod="openshift-dns/dns-default-ps5gm" Apr 20 19:21:11.520473 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:11.520385 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:21:11.520473 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:11.520439 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:21:11.520473 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:11.520470 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-cert podName:5ca54d9a-af5f-4f4f-b135-bfc6c5824e75 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:13.52044976 +0000 UTC m=+37.140132027 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-cert") pod "ingress-canary-8m7qk" (UID: "5ca54d9a-af5f-4f4f-b135-bfc6c5824e75") : secret "canary-serving-cert" not found Apr 20 19:21:11.520690 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:11.520486 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5759257a-ffe8-4341-a52c-735c321d9f4a-metrics-tls podName:5759257a-ffe8-4341-a52c-735c321d9f4a nodeName:}" failed. No retries permitted until 2026-04-20 19:21:13.520480395 +0000 UTC m=+37.140162660 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5759257a-ffe8-4341-a52c-735c321d9f4a-metrics-tls") pod "dns-default-ps5gm" (UID: "5759257a-ffe8-4341-a52c-735c321d9f4a") : secret "dns-default-metrics-tls" not found
Apr 20 19:21:12.068529 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:12.068490 2564 generic.go:358] "Generic (PLEG): container finished" podID="f331ebff-9be1-4254-b31d-7bcbbc5bbf98" containerID="1cd11fcbe975b7e550f809aee99b24f3525d49c391fff1733790d82d689fd530" exitCode=0
Apr 20 19:21:12.069030 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:12.068559 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7stt2" event={"ID":"f331ebff-9be1-4254-b31d-7bcbbc5bbf98","Type":"ContainerDied","Data":"1cd11fcbe975b7e550f809aee99b24f3525d49c391fff1733790d82d689fd530"}
Apr 20 19:21:13.073599 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:13.073549 2564 generic.go:358] "Generic (PLEG): container finished" podID="f331ebff-9be1-4254-b31d-7bcbbc5bbf98" containerID="8bba8d64b29542852d8f1f255f7efe64ce7c3c4a9b5b372b2b078c89fe6734e7" exitCode=0
Apr 20 19:21:13.073599 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:13.073598 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7stt2" event={"ID":"f331ebff-9be1-4254-b31d-7bcbbc5bbf98","Type":"ContainerDied","Data":"8bba8d64b29542852d8f1f255f7efe64ce7c3c4a9b5b372b2b078c89fe6734e7"}
Apr 20 19:21:13.535655 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:13.535604 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-cert\") pod \"ingress-canary-8m7qk\" (UID: \"5ca54d9a-af5f-4f4f-b135-bfc6c5824e75\") " pod="openshift-ingress-canary/ingress-canary-8m7qk"
Apr 20 19:21:13.535805 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:13.535675 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5759257a-ffe8-4341-a52c-735c321d9f4a-metrics-tls\") pod \"dns-default-ps5gm\" (UID: \"5759257a-ffe8-4341-a52c-735c321d9f4a\") " pod="openshift-dns/dns-default-ps5gm"
Apr 20 19:21:13.535805 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:13.535766 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:21:13.535896 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:13.535849 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-cert podName:5ca54d9a-af5f-4f4f-b135-bfc6c5824e75 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:17.535827791 +0000 UTC m=+41.155510079 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-cert") pod "ingress-canary-8m7qk" (UID: "5ca54d9a-af5f-4f4f-b135-bfc6c5824e75") : secret "canary-serving-cert" not found
Apr 20 19:21:13.535896 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:13.535774 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:21:13.536021 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:13.535925 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5759257a-ffe8-4341-a52c-735c321d9f4a-metrics-tls podName:5759257a-ffe8-4341-a52c-735c321d9f4a nodeName:}" failed. No retries permitted until 2026-04-20 19:21:17.535912732 +0000 UTC m=+41.155595001 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5759257a-ffe8-4341-a52c-735c321d9f4a-metrics-tls") pod "dns-default-ps5gm" (UID: "5759257a-ffe8-4341-a52c-735c321d9f4a") : secret "dns-default-metrics-tls" not found
Apr 20 19:21:14.078680 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:14.078500 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-7stt2" event={"ID":"f331ebff-9be1-4254-b31d-7bcbbc5bbf98","Type":"ContainerStarted","Data":"b6c1ad0b22a52ca50fb10a6b4b80a0214be8b17785584d39cf9ea2829c8ba80b"}
Apr 20 19:21:14.100484 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:14.100451 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-7stt2" podStartSLOduration=5.429732517 podStartE2EDuration="37.100439115s" podCreationTimestamp="2026-04-20 19:20:37 +0000 UTC" firstStartedPulling="2026-04-20 19:20:39.353356403 +0000 UTC m=+2.973038675" lastFinishedPulling="2026-04-20 19:21:11.024062993 +0000 UTC m=+34.643745273" observedRunningTime="2026-04-20 19:21:14.099040301 +0000 UTC m=+37.718722590" watchObservedRunningTime="2026-04-20 19:21:14.100439115 +0000 UTC m=+37.720121402"
Apr 20 19:21:15.082350 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:15.082306 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7nwp8" event={"ID":"4f25f47c-abed-4e2e-83fe-3edbfd02b4ff","Type":"ContainerStarted","Data":"0c7396a858d6dff284adb4710a0119295b314a4d4fe02ac18664f59f9d76ac8a"}
Apr 20 19:21:15.082941 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:15.082764 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-7nwp8"
Apr 20 19:21:15.108697 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:15.108654 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-7nwp8" podStartSLOduration=36.078994141 podStartE2EDuration="39.108642177s" podCreationTimestamp="2026-04-20 19:20:36 +0000 UTC" firstStartedPulling="2026-04-20 19:21:11.003081115 +0000 UTC m=+34.622763388" lastFinishedPulling="2026-04-20 19:21:14.032729153 +0000 UTC m=+37.652411424" observedRunningTime="2026-04-20 19:21:15.107949715 +0000 UTC m=+38.727632002" watchObservedRunningTime="2026-04-20 19:21:15.108642177 +0000 UTC m=+38.728324460"
Apr 20 19:21:17.561302 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:17.561269 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-cert\") pod \"ingress-canary-8m7qk\" (UID: \"5ca54d9a-af5f-4f4f-b135-bfc6c5824e75\") " pod="openshift-ingress-canary/ingress-canary-8m7qk"
Apr 20 19:21:17.561677 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:17.561313 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5759257a-ffe8-4341-a52c-735c321d9f4a-metrics-tls\") pod \"dns-default-ps5gm\" (UID: \"5759257a-ffe8-4341-a52c-735c321d9f4a\") " pod="openshift-dns/dns-default-ps5gm"
Apr 20 19:21:17.561677 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:17.561399 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:21:17.561677 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:17.561416 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:21:17.561677 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:17.561460 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5759257a-ffe8-4341-a52c-735c321d9f4a-metrics-tls podName:5759257a-ffe8-4341-a52c-735c321d9f4a nodeName:}" failed. No retries permitted until 2026-04-20 19:21:25.561446248 +0000 UTC m=+49.181128513 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5759257a-ffe8-4341-a52c-735c321d9f4a-metrics-tls") pod "dns-default-ps5gm" (UID: "5759257a-ffe8-4341-a52c-735c321d9f4a") : secret "dns-default-metrics-tls" not found
Apr 20 19:21:17.561677 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:17.561472 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-cert podName:5ca54d9a-af5f-4f4f-b135-bfc6c5824e75 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:25.561466769 +0000 UTC m=+49.181149033 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-cert") pod "ingress-canary-8m7qk" (UID: "5ca54d9a-af5f-4f4f-b135-bfc6c5824e75") : secret "canary-serving-cert" not found
Apr 20 19:21:19.930345 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:19.930309 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c4757895-9f8v8"]
Apr 20 19:21:19.972346 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:19.972321 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c4757895-9f8v8"]
Apr 20 19:21:19.972346 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:19.972351 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr"]
Apr 20 19:21:19.972551 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:19.972482 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c4757895-9f8v8"
Apr 20 19:21:19.975470 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:19.975447 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 20 19:21:19.975591 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:19.975451 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 20 19:21:19.975591 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:19.975453 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 20 19:21:19.975769 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:19.975755 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-c8lhb\""
Apr 20 19:21:19.976535 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:19.976520 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 20 19:21:19.990365 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:19.990346 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr"]
Apr 20 19:21:19.990463 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:19.990433 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr"
Apr 20 19:21:19.993302 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:19.993285 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 20 19:21:19.993398 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:19.993307 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 20 19:21:19.993398 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:19.993285 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 20 19:21:19.993491 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:19.993391 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 20 19:21:20.076823 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.076798 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfz2b\" (UniqueName: \"kubernetes.io/projected/6c0da9e1-70c8-43c7-8d59-6d77918b2994-kube-api-access-hfz2b\") pod \"managed-serviceaccount-addon-agent-75c4757895-9f8v8\" (UID: \"6c0da9e1-70c8-43c7-8d59-6d77918b2994\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c4757895-9f8v8"
Apr 20 19:21:20.076927 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.076828 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6c0da9e1-70c8-43c7-8d59-6d77918b2994-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-75c4757895-9f8v8\" (UID: \"6c0da9e1-70c8-43c7-8d59-6d77918b2994\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c4757895-9f8v8"
Apr 20 19:21:20.076927 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.076857 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b5f6c98a-2f0a-404a-a009-b3898d447543-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7b996cb9-xcqdr\" (UID: \"b5f6c98a-2f0a-404a-a009-b3898d447543\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr"
Apr 20 19:21:20.076927 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.076908 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/b5f6c98a-2f0a-404a-a009-b3898d447543-ca\") pod \"cluster-proxy-proxy-agent-7b996cb9-xcqdr\" (UID: \"b5f6c98a-2f0a-404a-a009-b3898d447543\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr"
Apr 20 19:21:20.077115 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.076959 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/b5f6c98a-2f0a-404a-a009-b3898d447543-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7b996cb9-xcqdr\" (UID: \"b5f6c98a-2f0a-404a-a009-b3898d447543\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr"
Apr 20 19:21:20.077115 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.077040 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/b5f6c98a-2f0a-404a-a009-b3898d447543-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7b996cb9-xcqdr\" (UID: \"b5f6c98a-2f0a-404a-a009-b3898d447543\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr"
Apr 20 19:21:20.077115 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.077084 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/b5f6c98a-2f0a-404a-a009-b3898d447543-hub\") pod \"cluster-proxy-proxy-agent-7b996cb9-xcqdr\" (UID: \"b5f6c98a-2f0a-404a-a009-b3898d447543\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr"
Apr 20 19:21:20.077239 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.077113 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c6ls\" (UniqueName: \"kubernetes.io/projected/b5f6c98a-2f0a-404a-a009-b3898d447543-kube-api-access-4c6ls\") pod \"cluster-proxy-proxy-agent-7b996cb9-xcqdr\" (UID: \"b5f6c98a-2f0a-404a-a009-b3898d447543\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr"
Apr 20 19:21:20.177379 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.177359 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hfz2b\" (UniqueName: \"kubernetes.io/projected/6c0da9e1-70c8-43c7-8d59-6d77918b2994-kube-api-access-hfz2b\") pod \"managed-serviceaccount-addon-agent-75c4757895-9f8v8\" (UID: \"6c0da9e1-70c8-43c7-8d59-6d77918b2994\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c4757895-9f8v8"
Apr 20 19:21:20.177379 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.177388 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6c0da9e1-70c8-43c7-8d59-6d77918b2994-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-75c4757895-9f8v8\" (UID: \"6c0da9e1-70c8-43c7-8d59-6d77918b2994\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c4757895-9f8v8"
Apr 20 19:21:20.177543 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.177422 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b5f6c98a-2f0a-404a-a009-b3898d447543-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7b996cb9-xcqdr\" (UID: \"b5f6c98a-2f0a-404a-a009-b3898d447543\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr"
Apr 20 19:21:20.177582 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.177536 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/b5f6c98a-2f0a-404a-a009-b3898d447543-ca\") pod \"cluster-proxy-proxy-agent-7b996cb9-xcqdr\" (UID: \"b5f6c98a-2f0a-404a-a009-b3898d447543\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr"
Apr 20 19:21:20.177628 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.177607 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/b5f6c98a-2f0a-404a-a009-b3898d447543-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7b996cb9-xcqdr\" (UID: \"b5f6c98a-2f0a-404a-a009-b3898d447543\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr"
Apr 20 19:21:20.177675 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.177639 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/b5f6c98a-2f0a-404a-a009-b3898d447543-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7b996cb9-xcqdr\" (UID: \"b5f6c98a-2f0a-404a-a009-b3898d447543\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr"
Apr 20 19:21:20.177675 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.177667 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/b5f6c98a-2f0a-404a-a009-b3898d447543-hub\") pod \"cluster-proxy-proxy-agent-7b996cb9-xcqdr\" (UID: \"b5f6c98a-2f0a-404a-a009-b3898d447543\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr"
Apr 20 19:21:20.177776 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.177691 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4c6ls\" (UniqueName: \"kubernetes.io/projected/b5f6c98a-2f0a-404a-a009-b3898d447543-kube-api-access-4c6ls\") pod \"cluster-proxy-proxy-agent-7b996cb9-xcqdr\" (UID: \"b5f6c98a-2f0a-404a-a009-b3898d447543\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr"
Apr 20 19:21:20.181781 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.181730 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/b5f6c98a-2f0a-404a-a009-b3898d447543-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-7b996cb9-xcqdr\" (UID: \"b5f6c98a-2f0a-404a-a009-b3898d447543\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr"
Apr 20 19:21:20.181781 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.181738 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/b5f6c98a-2f0a-404a-a009-b3898d447543-ca\") pod \"cluster-proxy-proxy-agent-7b996cb9-xcqdr\" (UID: \"b5f6c98a-2f0a-404a-a009-b3898d447543\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr"
Apr 20 19:21:20.181781 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.181781 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/b5f6c98a-2f0a-404a-a009-b3898d447543-hub\") pod \"cluster-proxy-proxy-agent-7b996cb9-xcqdr\" (UID: \"b5f6c98a-2f0a-404a-a009-b3898d447543\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr"
Apr 20 19:21:20.181945 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.181789 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6c0da9e1-70c8-43c7-8d59-6d77918b2994-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-75c4757895-9f8v8\" (UID: \"6c0da9e1-70c8-43c7-8d59-6d77918b2994\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c4757895-9f8v8"
Apr 20 19:21:20.182057 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.182036 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/b5f6c98a-2f0a-404a-a009-b3898d447543-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-7b996cb9-xcqdr\" (UID: \"b5f6c98a-2f0a-404a-a009-b3898d447543\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr"
Apr 20 19:21:20.185483 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.185461 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c6ls\" (UniqueName: \"kubernetes.io/projected/b5f6c98a-2f0a-404a-a009-b3898d447543-kube-api-access-4c6ls\") pod \"cluster-proxy-proxy-agent-7b996cb9-xcqdr\" (UID: \"b5f6c98a-2f0a-404a-a009-b3898d447543\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr"
Apr 20 19:21:20.185632 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.185613 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfz2b\" (UniqueName: \"kubernetes.io/projected/6c0da9e1-70c8-43c7-8d59-6d77918b2994-kube-api-access-hfz2b\") pod \"managed-serviceaccount-addon-agent-75c4757895-9f8v8\" (UID: \"6c0da9e1-70c8-43c7-8d59-6d77918b2994\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c4757895-9f8v8"
Apr 20 19:21:20.188408 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.188386 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/b5f6c98a-2f0a-404a-a009-b3898d447543-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-7b996cb9-xcqdr\" (UID: \"b5f6c98a-2f0a-404a-a009-b3898d447543\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr"
Apr 20 19:21:20.291825 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.291801 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c4757895-9f8v8"
Apr 20 19:21:20.299641 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.299619 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr"
Apr 20 19:21:20.437908 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.437839 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c4757895-9f8v8"]
Apr 20 19:21:20.440647 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:21:20.440614 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6c0da9e1_70c8_43c7_8d59_6d77918b2994.slice/crio-3858aaac865f245202d56093a57c8a94659180fb100926e69656e729228d39bd WatchSource:0}: Error finding container 3858aaac865f245202d56093a57c8a94659180fb100926e69656e729228d39bd: Status 404 returned error can't find the container with id 3858aaac865f245202d56093a57c8a94659180fb100926e69656e729228d39bd
Apr 20 19:21:20.451486 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:20.451464 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr"]
Apr 20 19:21:20.455279 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:21:20.455256 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb5f6c98a_2f0a_404a_a009_b3898d447543.slice/crio-e58ad583f2e9107809713a1cf078de5ba0eccd073e9299db1d40749f216aca58 WatchSource:0}: Error finding container e58ad583f2e9107809713a1cf078de5ba0eccd073e9299db1d40749f216aca58: Status 404 returned error can't find the container with id e58ad583f2e9107809713a1cf078de5ba0eccd073e9299db1d40749f216aca58
Apr 20 19:21:21.093893 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:21.093856 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr" event={"ID":"b5f6c98a-2f0a-404a-a009-b3898d447543","Type":"ContainerStarted","Data":"e58ad583f2e9107809713a1cf078de5ba0eccd073e9299db1d40749f216aca58"}
Apr 20 19:21:21.094768 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:21.094744 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c4757895-9f8v8" event={"ID":"6c0da9e1-70c8-43c7-8d59-6d77918b2994","Type":"ContainerStarted","Data":"3858aaac865f245202d56093a57c8a94659180fb100926e69656e729228d39bd"}
Apr 20 19:21:25.104951 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:25.104917 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr" event={"ID":"b5f6c98a-2f0a-404a-a009-b3898d447543","Type":"ContainerStarted","Data":"c87ff4ee61ea393585f6e813834b6a1dfa571ba41f09ff0949bdee8fb44bdf69"}
Apr 20 19:21:25.106082 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:25.106053 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c4757895-9f8v8" event={"ID":"6c0da9e1-70c8-43c7-8d59-6d77918b2994","Type":"ContainerStarted","Data":"667530dc3c90f9b55cd9bbd3a1919572a87efe3262ea9ca6445c8879008c88d6"}
Apr 20 19:21:25.119764 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:25.119725 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c4757895-9f8v8" podStartSLOduration=1.9731166930000001 podStartE2EDuration="6.1197136s" podCreationTimestamp="2026-04-20 19:21:19 +0000 UTC" firstStartedPulling="2026-04-20 19:21:20.444392224 +0000 UTC m=+44.064074492" lastFinishedPulling="2026-04-20 19:21:24.590989128 +0000 UTC m=+48.210671399" observedRunningTime="2026-04-20 19:21:25.119070769 +0000 UTC m=+48.738753059" watchObservedRunningTime="2026-04-20 19:21:25.1197136 +0000 UTC m=+48.739395866"
Apr 20 19:21:25.614445 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:25.614413 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-cert\") pod \"ingress-canary-8m7qk\" (UID: \"5ca54d9a-af5f-4f4f-b135-bfc6c5824e75\") " pod="openshift-ingress-canary/ingress-canary-8m7qk"
Apr 20 19:21:25.614587 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:25.614459 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5759257a-ffe8-4341-a52c-735c321d9f4a-metrics-tls\") pod \"dns-default-ps5gm\" (UID: \"5759257a-ffe8-4341-a52c-735c321d9f4a\") " pod="openshift-dns/dns-default-ps5gm"
Apr 20 19:21:25.614587 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:25.614555 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:21:25.614656 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:25.614614 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-cert podName:5ca54d9a-af5f-4f4f-b135-bfc6c5824e75 nodeName:}" failed. No retries permitted until 2026-04-20 19:21:41.614597591 +0000 UTC m=+65.234279855 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-cert") pod "ingress-canary-8m7qk" (UID: "5ca54d9a-af5f-4f4f-b135-bfc6c5824e75") : secret "canary-serving-cert" not found
Apr 20 19:21:25.614656 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:25.614562 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:21:25.614742 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:25.614676 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5759257a-ffe8-4341-a52c-735c321d9f4a-metrics-tls podName:5759257a-ffe8-4341-a52c-735c321d9f4a nodeName:}" failed. No retries permitted until 2026-04-20 19:21:41.614664245 +0000 UTC m=+65.234346510 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5759257a-ffe8-4341-a52c-735c321d9f4a-metrics-tls") pod "dns-default-ps5gm" (UID: "5759257a-ffe8-4341-a52c-735c321d9f4a") : secret "dns-default-metrics-tls" not found
Apr 20 19:21:28.113914 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:28.113879 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr" event={"ID":"b5f6c98a-2f0a-404a-a009-b3898d447543","Type":"ContainerStarted","Data":"d394cdfa71589310a7a527e27aa1244a5177c4739266c8ae94ebde290e2dc4ff"}
Apr 20 19:21:28.113914 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:28.113918 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr" event={"ID":"b5f6c98a-2f0a-404a-a009-b3898d447543","Type":"ContainerStarted","Data":"0c07ce9fbdc633467eda35e2a77ef4a2594d83a4dafad321200bff4b7869558c"}
Apr 20 19:21:28.135878 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:28.135829 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr" podStartSLOduration=2.276498391 podStartE2EDuration="9.135815309s" podCreationTimestamp="2026-04-20 19:21:19 +0000 UTC" firstStartedPulling="2026-04-20 19:21:20.456839874 +0000 UTC m=+44.076522146" lastFinishedPulling="2026-04-20 19:21:27.316156784 +0000 UTC m=+50.935839064" observedRunningTime="2026-04-20 19:21:28.134547956 +0000 UTC m=+51.754230233" watchObservedRunningTime="2026-04-20 19:21:28.135815309 +0000 UTC m=+51.755497625"
Apr 20 19:21:30.294293 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:30.294227 2564 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c4757895-9f8v8" podUID="6c0da9e1-70c8-43c7-8d59-6d77918b2994" containerName="addon-agent" probeResult="failure" output="Get \"http://10.134.0.7:8000/healthz\": dial tcp 10.134.0.7:8000: connect: connection refused"
Apr 20 19:21:35.133397 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:35.133367 2564 generic.go:358] "Generic (PLEG): container finished" podID="6c0da9e1-70c8-43c7-8d59-6d77918b2994" containerID="667530dc3c90f9b55cd9bbd3a1919572a87efe3262ea9ca6445c8879008c88d6" exitCode=255
Apr 20 19:21:35.133830 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:35.133411 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c4757895-9f8v8" event={"ID":"6c0da9e1-70c8-43c7-8d59-6d77918b2994","Type":"ContainerDied","Data":"667530dc3c90f9b55cd9bbd3a1919572a87efe3262ea9ca6445c8879008c88d6"}
Apr 20 19:21:35.133830 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:35.133760 2564 scope.go:117] "RemoveContainer" containerID="667530dc3c90f9b55cd9bbd3a1919572a87efe3262ea9ca6445c8879008c88d6"
Apr 20 19:21:36.062402 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:36.062363 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-f5z9f"
Apr 20 19:21:36.138673 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:36.138644 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c4757895-9f8v8" event={"ID":"6c0da9e1-70c8-43c7-8d59-6d77918b2994","Type":"ContainerStarted","Data":"7768258c54259be5caa8d8ab82de3b5645fe175b802cb894eafd1255157112c3"}
Apr 20 19:21:41.618954 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:41.618913 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-cert\") pod \"ingress-canary-8m7qk\" (UID: \"5ca54d9a-af5f-4f4f-b135-bfc6c5824e75\") " pod="openshift-ingress-canary/ingress-canary-8m7qk"
Apr 20 19:21:41.619355 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:41.618969 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5759257a-ffe8-4341-a52c-735c321d9f4a-metrics-tls\") pod \"dns-default-ps5gm\" (UID: \"5759257a-ffe8-4341-a52c-735c321d9f4a\") " pod="openshift-dns/dns-default-ps5gm"
Apr 20 19:21:41.619355 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:41.619093 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:21:41.619355 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:41.619156 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-cert podName:5ca54d9a-af5f-4f4f-b135-bfc6c5824e75 nodeName:}" failed. No retries permitted until 2026-04-20 19:22:13.619141602 +0000 UTC m=+97.238823872 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-cert") pod "ingress-canary-8m7qk" (UID: "5ca54d9a-af5f-4f4f-b135-bfc6c5824e75") : secret "canary-serving-cert" not found
Apr 20 19:21:41.619355 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:41.619093 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:21:41.619355 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:41.619218 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5759257a-ffe8-4341-a52c-735c321d9f4a-metrics-tls podName:5759257a-ffe8-4341-a52c-735c321d9f4a nodeName:}" failed. No retries permitted until 2026-04-20 19:22:13.619207117 +0000 UTC m=+97.238889383 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5759257a-ffe8-4341-a52c-735c321d9f4a-metrics-tls") pod "dns-default-ps5gm" (UID: "5759257a-ffe8-4341-a52c-735c321d9f4a") : secret "dns-default-metrics-tls" not found
Apr 20 19:21:42.523483 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:42.523448 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs\") pod \"network-metrics-daemon-zdlvd\" (UID: \"9b4f6ab3-fddd-446f-8cbf-e372e1b901fe\") " pod="openshift-multus/network-metrics-daemon-zdlvd"
Apr 20 19:21:42.523638 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:42.523560 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 19:21:42.523638 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:21:42.523612 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs podName:9b4f6ab3-fddd-446f-8cbf-e372e1b901fe nodeName:}" failed. No retries permitted until 2026-04-20 19:22:46.523597883 +0000 UTC m=+130.143280147 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs") pod "network-metrics-daemon-zdlvd" (UID: "9b4f6ab3-fddd-446f-8cbf-e372e1b901fe") : secret "metrics-daemon-secret" not found
Apr 20 19:21:47.087509 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:21:47.087478 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-7nwp8"
Apr 20 19:22:13.629394 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:22:13.629344 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-cert\") pod \"ingress-canary-8m7qk\" (UID: \"5ca54d9a-af5f-4f4f-b135-bfc6c5824e75\") " pod="openshift-ingress-canary/ingress-canary-8m7qk"
Apr 20 19:22:13.629808 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:22:13.629410 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5759257a-ffe8-4341-a52c-735c321d9f4a-metrics-tls\") pod \"dns-default-ps5gm\" (UID: \"5759257a-ffe8-4341-a52c-735c321d9f4a\") " pod="openshift-dns/dns-default-ps5gm"
Apr 20 19:22:13.629808 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:22:13.629521 2564 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 19:22:13.629808 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:22:13.629535 2564 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 19:22:13.629808 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:22:13.629595 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-cert podName:5ca54d9a-af5f-4f4f-b135-bfc6c5824e75 nodeName:}" failed. No retries permitted until 2026-04-20 19:23:17.629578489 +0000 UTC m=+161.249260756 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-cert") pod "ingress-canary-8m7qk" (UID: "5ca54d9a-af5f-4f4f-b135-bfc6c5824e75") : secret "canary-serving-cert" not found
Apr 20 19:22:13.629808 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:22:13.629611 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5759257a-ffe8-4341-a52c-735c321d9f4a-metrics-tls podName:5759257a-ffe8-4341-a52c-735c321d9f4a nodeName:}" failed. No retries permitted until 2026-04-20 19:23:17.629604595 +0000 UTC m=+161.249286860 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/5759257a-ffe8-4341-a52c-735c321d9f4a-metrics-tls") pod "dns-default-ps5gm" (UID: "5759257a-ffe8-4341-a52c-735c321d9f4a") : secret "dns-default-metrics-tls" not found
Apr 20 19:22:46.541851 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:22:46.541808 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs\") pod \"network-metrics-daemon-zdlvd\" (UID: \"9b4f6ab3-fddd-446f-8cbf-e372e1b901fe\") " pod="openshift-multus/network-metrics-daemon-zdlvd"
Apr 20 19:22:46.542380 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:22:46.541939 2564 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 20 19:22:46.542380 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:22:46.542025 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs podName:9b4f6ab3-fddd-446f-8cbf-e372e1b901fe nodeName:}" failed. No retries permitted until 2026-04-20 19:24:48.541988862 +0000 UTC m=+252.161671127 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs") pod "network-metrics-daemon-zdlvd" (UID: "9b4f6ab3-fddd-446f-8cbf-e372e1b901fe") : secret "metrics-daemon-secret" not found
Apr 20 19:22:48.586259 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:22:48.586232 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-6rnf5_a09cd02b-dfa8-4f51-abdc-9e5a0b219e23/dns-node-resolver/0.log"
Apr 20 19:22:49.386303 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:22:49.386277 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5c7r9_f57b12eb-90dc-43f3-b677-c16555487307/node-ca/0.log"
Apr 20 19:23:12.793667 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:23:12.793624 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-8m7qk" podUID="5ca54d9a-af5f-4f4f-b135-bfc6c5824e75"
Apr 20 19:23:12.808794 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:23:12.808763 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-ps5gm" podUID="5759257a-ffe8-4341-a52c-735c321d9f4a"
Apr 20 19:23:12.901030 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:23:12.900969 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-zdlvd" podUID="9b4f6ab3-fddd-446f-8cbf-e372e1b901fe"
Apr 20 19:23:13.356258 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:13.356230 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8m7qk"
Apr 20 19:23:17.664920 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:17.664871 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5759257a-ffe8-4341-a52c-735c321d9f4a-metrics-tls\") pod \"dns-default-ps5gm\" (UID: \"5759257a-ffe8-4341-a52c-735c321d9f4a\") " pod="openshift-dns/dns-default-ps5gm"
Apr 20 19:23:17.665342 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:17.664963 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-cert\") pod \"ingress-canary-8m7qk\" (UID: \"5ca54d9a-af5f-4f4f-b135-bfc6c5824e75\") " pod="openshift-ingress-canary/ingress-canary-8m7qk"
Apr 20 19:23:17.667254 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:17.667232 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/5759257a-ffe8-4341-a52c-735c321d9f4a-metrics-tls\") pod \"dns-default-ps5gm\" (UID: \"5759257a-ffe8-4341-a52c-735c321d9f4a\") " pod="openshift-dns/dns-default-ps5gm"
Apr 20 19:23:17.667361 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:17.667310 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5ca54d9a-af5f-4f4f-b135-bfc6c5824e75-cert\") pod \"ingress-canary-8m7qk\" (UID: \"5ca54d9a-af5f-4f4f-b135-bfc6c5824e75\") " pod="openshift-ingress-canary/ingress-canary-8m7qk"
Apr 20 19:23:17.859239 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:17.859207 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ltv6t\""
Apr 20 19:23:17.867468 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:17.867440 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-8m7qk"
Apr 20 19:23:17.981317 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:17.981285 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-8m7qk"]
Apr 20 19:23:17.984874 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:23:17.984847 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ca54d9a_af5f_4f4f_b135_bfc6c5824e75.slice/crio-6683fcbded690b5c890908d999c5cc707a1dec13402b2e9dcbe2e8c447e0e09a WatchSource:0}: Error finding container 6683fcbded690b5c890908d999c5cc707a1dec13402b2e9dcbe2e8c447e0e09a: Status 404 returned error can't find the container with id 6683fcbded690b5c890908d999c5cc707a1dec13402b2e9dcbe2e8c447e0e09a
Apr 20 19:23:18.369012 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:18.368960 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8m7qk" event={"ID":"5ca54d9a-af5f-4f4f-b135-bfc6c5824e75","Type":"ContainerStarted","Data":"6683fcbded690b5c890908d999c5cc707a1dec13402b2e9dcbe2e8c447e0e09a"}
Apr 20 19:23:18.712230 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:18.712157 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-52pdc"]
Apr 20 19:23:18.715340 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:18.715312 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-52pdc"
Apr 20 19:23:18.718970 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:18.718785 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 20 19:23:18.719277 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:18.719256 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 20 19:23:18.720511 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:18.720456 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 20 19:23:18.720807 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:18.720786 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-d2rzs\""
Apr 20 19:23:18.720909 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:18.720836 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 20 19:23:18.725732 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:18.725686 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-52pdc"]
Apr 20 19:23:18.874056 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:18.874007 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5e519934-7cb4-45a9-9e43-0ada33d7fc5c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-52pdc\" (UID: \"5e519934-7cb4-45a9-9e43-0ada33d7fc5c\") " pod="openshift-insights/insights-runtime-extractor-52pdc"
Apr 20 19:23:18.874056 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:18.874063 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5e519934-7cb4-45a9-9e43-0ada33d7fc5c-data-volume\") pod \"insights-runtime-extractor-52pdc\" (UID: \"5e519934-7cb4-45a9-9e43-0ada33d7fc5c\") " pod="openshift-insights/insights-runtime-extractor-52pdc"
Apr 20 19:23:18.874273 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:18.874089 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ns9sm\" (UniqueName: \"kubernetes.io/projected/5e519934-7cb4-45a9-9e43-0ada33d7fc5c-kube-api-access-ns9sm\") pod \"insights-runtime-extractor-52pdc\" (UID: \"5e519934-7cb4-45a9-9e43-0ada33d7fc5c\") " pod="openshift-insights/insights-runtime-extractor-52pdc"
Apr 20 19:23:18.874273 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:18.874112 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5e519934-7cb4-45a9-9e43-0ada33d7fc5c-crio-socket\") pod \"insights-runtime-extractor-52pdc\" (UID: \"5e519934-7cb4-45a9-9e43-0ada33d7fc5c\") " pod="openshift-insights/insights-runtime-extractor-52pdc"
Apr 20 19:23:18.874273 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:18.874215 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5e519934-7cb4-45a9-9e43-0ada33d7fc5c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-52pdc\" (UID: \"5e519934-7cb4-45a9-9e43-0ada33d7fc5c\") " pod="openshift-insights/insights-runtime-extractor-52pdc"
Apr 20 19:23:18.975109 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:18.975029 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5e519934-7cb4-45a9-9e43-0ada33d7fc5c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-52pdc\" (UID: \"5e519934-7cb4-45a9-9e43-0ada33d7fc5c\") " pod="openshift-insights/insights-runtime-extractor-52pdc"
Apr 20 19:23:18.975258 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:18.975117 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5e519934-7cb4-45a9-9e43-0ada33d7fc5c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-52pdc\" (UID: \"5e519934-7cb4-45a9-9e43-0ada33d7fc5c\") " pod="openshift-insights/insights-runtime-extractor-52pdc"
Apr 20 19:23:18.975258 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:18.975149 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5e519934-7cb4-45a9-9e43-0ada33d7fc5c-data-volume\") pod \"insights-runtime-extractor-52pdc\" (UID: \"5e519934-7cb4-45a9-9e43-0ada33d7fc5c\") " pod="openshift-insights/insights-runtime-extractor-52pdc"
Apr 20 19:23:18.975258 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:18.975176 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ns9sm\" (UniqueName: \"kubernetes.io/projected/5e519934-7cb4-45a9-9e43-0ada33d7fc5c-kube-api-access-ns9sm\") pod \"insights-runtime-extractor-52pdc\" (UID: \"5e519934-7cb4-45a9-9e43-0ada33d7fc5c\") " pod="openshift-insights/insights-runtime-extractor-52pdc"
Apr 20 19:23:18.975258 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:18.975207 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5e519934-7cb4-45a9-9e43-0ada33d7fc5c-crio-socket\") pod \"insights-runtime-extractor-52pdc\" (UID: \"5e519934-7cb4-45a9-9e43-0ada33d7fc5c\") " pod="openshift-insights/insights-runtime-extractor-52pdc"
Apr 20 19:23:18.975461 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:18.975298 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/5e519934-7cb4-45a9-9e43-0ada33d7fc5c-crio-socket\") pod \"insights-runtime-extractor-52pdc\" (UID: \"5e519934-7cb4-45a9-9e43-0ada33d7fc5c\") " pod="openshift-insights/insights-runtime-extractor-52pdc"
Apr 20 19:23:18.975784 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:18.975750 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/5e519934-7cb4-45a9-9e43-0ada33d7fc5c-data-volume\") pod \"insights-runtime-extractor-52pdc\" (UID: \"5e519934-7cb4-45a9-9e43-0ada33d7fc5c\") " pod="openshift-insights/insights-runtime-extractor-52pdc"
Apr 20 19:23:18.976042 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:18.976018 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/5e519934-7cb4-45a9-9e43-0ada33d7fc5c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-52pdc\" (UID: \"5e519934-7cb4-45a9-9e43-0ada33d7fc5c\") " pod="openshift-insights/insights-runtime-extractor-52pdc"
Apr 20 19:23:18.977666 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:18.977644 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/5e519934-7cb4-45a9-9e43-0ada33d7fc5c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-52pdc\" (UID: \"5e519934-7cb4-45a9-9e43-0ada33d7fc5c\") " pod="openshift-insights/insights-runtime-extractor-52pdc"
Apr 20 19:23:18.986747 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:18.986722 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ns9sm\" (UniqueName: \"kubernetes.io/projected/5e519934-7cb4-45a9-9e43-0ada33d7fc5c-kube-api-access-ns9sm\") pod \"insights-runtime-extractor-52pdc\" (UID: \"5e519934-7cb4-45a9-9e43-0ada33d7fc5c\") " pod="openshift-insights/insights-runtime-extractor-52pdc"
Apr 20 19:23:19.027940 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:19.027912 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-52pdc"
Apr 20 19:23:19.465723 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:19.465698 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-52pdc"]
Apr 20 19:23:19.468215 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:23:19.468192 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5e519934_7cb4_45a9_9e43_0ada33d7fc5c.slice/crio-0e479bc7142a69fc413e5e7c5dcf885a090d9db1ae00e204f80abe5790f29ee5 WatchSource:0}: Error finding container 0e479bc7142a69fc413e5e7c5dcf885a090d9db1ae00e204f80abe5790f29ee5: Status 404 returned error can't find the container with id 0e479bc7142a69fc413e5e7c5dcf885a090d9db1ae00e204f80abe5790f29ee5
Apr 20 19:23:20.375376 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:20.375298 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-52pdc" event={"ID":"5e519934-7cb4-45a9-9e43-0ada33d7fc5c","Type":"ContainerStarted","Data":"564d70efa55dbf95cceeef75d5542bdc68587c416c5bbc57a70dc1b3f3f3f5ae"}
Apr 20 19:23:20.375376 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:20.375337 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-52pdc" event={"ID":"5e519934-7cb4-45a9-9e43-0ada33d7fc5c","Type":"ContainerStarted","Data":"6504950ffcba8cff0791ee95eea8240eeac130b4ba20d9c28d2db25b61a91f68"}
Apr 20 19:23:20.375376 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:20.375352 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-52pdc" event={"ID":"5e519934-7cb4-45a9-9e43-0ada33d7fc5c","Type":"ContainerStarted","Data":"0e479bc7142a69fc413e5e7c5dcf885a090d9db1ae00e204f80abe5790f29ee5"}
Apr 20 19:23:20.376478 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:20.376455 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-8m7qk" event={"ID":"5ca54d9a-af5f-4f4f-b135-bfc6c5824e75","Type":"ContainerStarted","Data":"f20682528a4cba7c5624985a04954ade5ef0decc6fb20e80db0f8a2b4f4ebbdc"}
Apr 20 19:23:20.394197 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:20.394152 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-8m7qk" podStartSLOduration=129.977417713 podStartE2EDuration="2m11.394141518s" podCreationTimestamp="2026-04-20 19:21:09 +0000 UTC" firstStartedPulling="2026-04-20 19:23:17.986494336 +0000 UTC m=+161.606176600" lastFinishedPulling="2026-04-20 19:23:19.40321814 +0000 UTC m=+163.022900405" observedRunningTime="2026-04-20 19:23:20.39250874 +0000 UTC m=+164.012191027" watchObservedRunningTime="2026-04-20 19:23:20.394141518 +0000 UTC m=+164.013823804"
Apr 20 19:23:22.382879 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:22.382844 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-52pdc" event={"ID":"5e519934-7cb4-45a9-9e43-0ada33d7fc5c","Type":"ContainerStarted","Data":"4b72fa09d1b1a4f02c85e756ceaa15476a7617c8667a2c69a3eef6292fb8d0ab"}
Apr 20 19:23:22.825859 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:22.825813 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-52pdc" podStartSLOduration=2.824400854 podStartE2EDuration="4.825798076s" podCreationTimestamp="2026-04-20 19:23:18 +0000 UTC" firstStartedPulling="2026-04-20 19:23:19.531224542 +0000 UTC m=+163.150906806" lastFinishedPulling="2026-04-20 19:23:21.532621759 +0000 UTC m=+165.152304028" observedRunningTime="2026-04-20 19:23:22.399597032 +0000 UTC m=+166.019279319" watchObservedRunningTime="2026-04-20 19:23:22.825798076 +0000 UTC m=+166.445480362"
Apr 20 19:23:22.826127 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:22.826110 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-m9nwj"]
Apr 20 19:23:22.829173 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:22.829154 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-m9nwj"
Apr 20 19:23:22.831542 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:22.831520 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 20 19:23:22.831650 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:22.831537 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 20 19:23:22.832651 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:22.832622 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-vtvpx\""
Apr 20 19:23:22.832753 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:22.832709 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 20 19:23:22.832753 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:22.832711 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 20 19:23:22.832753 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:22.832740 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 20 19:23:22.837102 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:22.837079 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-m9nwj"]
Apr 20 19:23:23.002694 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:23.002662 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/15d7f65a-133b-4803-9e61-e6081374a2ef-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-m9nwj\" (UID: \"15d7f65a-133b-4803-9e61-e6081374a2ef\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m9nwj"
Apr 20 19:23:23.002694 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:23.002693 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bwv74\" (UniqueName: \"kubernetes.io/projected/15d7f65a-133b-4803-9e61-e6081374a2ef-kube-api-access-bwv74\") pod \"prometheus-operator-5676c8c784-m9nwj\" (UID: \"15d7f65a-133b-4803-9e61-e6081374a2ef\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m9nwj"
Apr 20 19:23:23.002868 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:23.002718 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/15d7f65a-133b-4803-9e61-e6081374a2ef-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-m9nwj\" (UID: \"15d7f65a-133b-4803-9e61-e6081374a2ef\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m9nwj"
Apr 20 19:23:23.002868 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:23.002751 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/15d7f65a-133b-4803-9e61-e6081374a2ef-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-m9nwj\" (UID: \"15d7f65a-133b-4803-9e61-e6081374a2ef\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m9nwj"
Apr 20 19:23:23.103262 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:23.103191 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/15d7f65a-133b-4803-9e61-e6081374a2ef-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-m9nwj\" (UID: \"15d7f65a-133b-4803-9e61-e6081374a2ef\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m9nwj"
Apr 20 19:23:23.103262 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:23.103222 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bwv74\" (UniqueName: \"kubernetes.io/projected/15d7f65a-133b-4803-9e61-e6081374a2ef-kube-api-access-bwv74\") pod \"prometheus-operator-5676c8c784-m9nwj\" (UID: \"15d7f65a-133b-4803-9e61-e6081374a2ef\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m9nwj"
Apr 20 19:23:23.103262 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:23.103241 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/15d7f65a-133b-4803-9e61-e6081374a2ef-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-m9nwj\" (UID: \"15d7f65a-133b-4803-9e61-e6081374a2ef\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m9nwj"
Apr 20 19:23:23.103481 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:23.103364 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/15d7f65a-133b-4803-9e61-e6081374a2ef-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-m9nwj\" (UID: \"15d7f65a-133b-4803-9e61-e6081374a2ef\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m9nwj"
Apr 20 19:23:23.103978 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:23.103948 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/15d7f65a-133b-4803-9e61-e6081374a2ef-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-m9nwj\" (UID: \"15d7f65a-133b-4803-9e61-e6081374a2ef\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m9nwj"
Apr 20 19:23:23.105526 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:23.105507 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/15d7f65a-133b-4803-9e61-e6081374a2ef-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-m9nwj\" (UID: \"15d7f65a-133b-4803-9e61-e6081374a2ef\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m9nwj"
Apr 20 19:23:23.105643 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:23.105622 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/15d7f65a-133b-4803-9e61-e6081374a2ef-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-m9nwj\" (UID: \"15d7f65a-133b-4803-9e61-e6081374a2ef\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m9nwj"
Apr 20 19:23:23.111410 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:23.111386 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bwv74\" (UniqueName: \"kubernetes.io/projected/15d7f65a-133b-4803-9e61-e6081374a2ef-kube-api-access-bwv74\") pod \"prometheus-operator-5676c8c784-m9nwj\" (UID: \"15d7f65a-133b-4803-9e61-e6081374a2ef\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-m9nwj"
Apr 20 19:23:23.138745 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:23.138712 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-m9nwj"
Apr 20 19:23:23.250427 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:23.250373 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-m9nwj"]
Apr 20 19:23:23.254850 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:23:23.254816 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15d7f65a_133b_4803_9e61_e6081374a2ef.slice/crio-f192e6f22203b8ee97e098754c7f3bbe08fb2507e74ca1508529e61d2a447746 WatchSource:0}: Error finding container f192e6f22203b8ee97e098754c7f3bbe08fb2507e74ca1508529e61d2a447746: Status 404 returned error can't find the container with id f192e6f22203b8ee97e098754c7f3bbe08fb2507e74ca1508529e61d2a447746
Apr 20 19:23:23.385555 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:23.385473 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-m9nwj" event={"ID":"15d7f65a-133b-4803-9e61-e6081374a2ef","Type":"ContainerStarted","Data":"f192e6f22203b8ee97e098754c7f3bbe08fb2507e74ca1508529e61d2a447746"}
Apr 20 19:23:23.881570 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:23.881534 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ps5gm"
Apr 20 19:23:23.884363 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:23.884315 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-qtv7w\""
Apr 20 19:23:23.892507 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:23.892487 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-ps5gm"
Apr 20 19:23:24.017533 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:24.017474 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-ps5gm"]
Apr 20 19:23:24.021244 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:23:24.021214 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5759257a_ffe8_4341_a52c_735c321d9f4a.slice/crio-2fdc2e963da8b16eac23b6c882dfa4c588004cce44ae5c94857e2c64e571bde7 WatchSource:0}: Error finding container 2fdc2e963da8b16eac23b6c882dfa4c588004cce44ae5c94857e2c64e571bde7: Status 404 returned error can't find the container with id 2fdc2e963da8b16eac23b6c882dfa4c588004cce44ae5c94857e2c64e571bde7
Apr 20 19:23:24.388876 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:24.388841 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ps5gm" event={"ID":"5759257a-ffe8-4341-a52c-735c321d9f4a","Type":"ContainerStarted","Data":"2fdc2e963da8b16eac23b6c882dfa4c588004cce44ae5c94857e2c64e571bde7"}
Apr 20 19:23:25.392392 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:25.392364 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-m9nwj" event={"ID":"15d7f65a-133b-4803-9e61-e6081374a2ef","Type":"ContainerStarted","Data":"4bff33179fc14009231cbe6723613b8f18df5576318d70bff5114121268979c9"}
Apr 20 19:23:25.392732 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:25.392397 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-m9nwj" event={"ID":"15d7f65a-133b-4803-9e61-e6081374a2ef","Type":"ContainerStarted","Data":"ff321fd3273be1e8bbcc7f5fa55c8638b9a279e533eed315b20e16a6f7fd8d86"}
Apr 20 19:23:25.409215 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:25.409176 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-m9nwj" podStartSLOduration=2.245618888 podStartE2EDuration="3.409162123s" podCreationTimestamp="2026-04-20 19:23:22 +0000 UTC" firstStartedPulling="2026-04-20 19:23:23.25664414 +0000 UTC m=+166.876326405" lastFinishedPulling="2026-04-20 19:23:24.420187373 +0000 UTC m=+168.039869640" observedRunningTime="2026-04-20 19:23:25.407667964 +0000 UTC m=+169.027350263" watchObservedRunningTime="2026-04-20 19:23:25.409162123 +0000 UTC m=+169.028844410"
Apr 20 19:23:26.396026 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:26.395977 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ps5gm" event={"ID":"5759257a-ffe8-4341-a52c-735c321d9f4a","Type":"ContainerStarted","Data":"0c317b692615c75a67e72858aa4e1a1acfb69b4357e514851b8370d576c3689e"}
Apr 20 19:23:26.396026 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:26.396027 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-ps5gm" event={"ID":"5759257a-ffe8-4341-a52c-735c321d9f4a","Type":"ContainerStarted","Data":"979cd4140e3789077bdd0a8d78ae870b53ee0756776aa166d37ea1e0fdbdb227"}
Apr 20 19:23:26.413169 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:26.413121 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-ps5gm" podStartSLOduration=136.135235423 podStartE2EDuration="2m17.413105891s" podCreationTimestamp="2026-04-20 19:21:09 +0000 UTC" firstStartedPulling="2026-04-20 19:23:24.025394145 +0000 UTC m=+167.645076410"
lastFinishedPulling="2026-04-20 19:23:25.303264613 +0000 UTC m=+168.922946878" observedRunningTime="2026-04-20 19:23:26.411348873 +0000 UTC m=+170.031031160" watchObservedRunningTime="2026-04-20 19:23:26.413105891 +0000 UTC m=+170.032788178" Apr 20 19:23:26.884786 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:26.884760 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zdlvd" Apr 20 19:23:27.189742 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.189660 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-hmzfb"] Apr 20 19:23:27.192863 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.192845 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.195136 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.195110 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 19:23:27.195456 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.195440 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-kfq2q\"" Apr 20 19:23:27.195758 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.195742 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 19:23:27.195918 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.195901 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 19:23:27.332150 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.332106 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/724ddbfe-066b-4314-9b9b-de1647c2747b-node-exporter-wtmp\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.332150 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.332149 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/724ddbfe-066b-4314-9b9b-de1647c2747b-metrics-client-ca\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.332383 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.332188 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/724ddbfe-066b-4314-9b9b-de1647c2747b-node-exporter-tls\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.332383 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.332234 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/724ddbfe-066b-4314-9b9b-de1647c2747b-root\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.332383 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.332255 2564 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/724ddbfe-066b-4314-9b9b-de1647c2747b-sys\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.332383 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.332282 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/724ddbfe-066b-4314-9b9b-de1647c2747b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.332383 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.332302 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69hlt\" (UniqueName: \"kubernetes.io/projected/724ddbfe-066b-4314-9b9b-de1647c2747b-kube-api-access-69hlt\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.332383 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.332328 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/724ddbfe-066b-4314-9b9b-de1647c2747b-node-exporter-accelerators-collector-config\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.332591 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.332394 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/724ddbfe-066b-4314-9b9b-de1647c2747b-node-exporter-textfile\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.401185 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.399581 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-ps5gm" Apr 20 19:23:27.432767 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.432732 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/724ddbfe-066b-4314-9b9b-de1647c2747b-sys\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.432767 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.432770 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/724ddbfe-066b-4314-9b9b-de1647c2747b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.432959 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.432791 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69hlt\" (UniqueName: \"kubernetes.io/projected/724ddbfe-066b-4314-9b9b-de1647c2747b-kube-api-access-69hlt\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.432959 
ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.432811 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/724ddbfe-066b-4314-9b9b-de1647c2747b-node-exporter-accelerators-collector-config\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.432959 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.432838 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/724ddbfe-066b-4314-9b9b-de1647c2747b-node-exporter-textfile\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.432959 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.432875 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/724ddbfe-066b-4314-9b9b-de1647c2747b-node-exporter-wtmp\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.432959 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.432834 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/724ddbfe-066b-4314-9b9b-de1647c2747b-sys\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.432959 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.432900 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/724ddbfe-066b-4314-9b9b-de1647c2747b-metrics-client-ca\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.433267 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.432964 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/724ddbfe-066b-4314-9b9b-de1647c2747b-node-exporter-tls\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.433267 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.433063 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/724ddbfe-066b-4314-9b9b-de1647c2747b-node-exporter-wtmp\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.433267 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.433072 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/724ddbfe-066b-4314-9b9b-de1647c2747b-root\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.433267 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.433142 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/724ddbfe-066b-4314-9b9b-de1647c2747b-root\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " 
pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.433267 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:23:27.433166 2564 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 20 19:23:27.433267 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:23:27.433235 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/724ddbfe-066b-4314-9b9b-de1647c2747b-node-exporter-tls podName:724ddbfe-066b-4314-9b9b-de1647c2747b nodeName:}" failed. No retries permitted until 2026-04-20 19:23:27.933214332 +0000 UTC m=+171.552896611 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/724ddbfe-066b-4314-9b9b-de1647c2747b-node-exporter-tls") pod "node-exporter-hmzfb" (UID: "724ddbfe-066b-4314-9b9b-de1647c2747b") : secret "node-exporter-tls" not found Apr 20 19:23:27.433267 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.433238 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/724ddbfe-066b-4314-9b9b-de1647c2747b-node-exporter-textfile\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.433556 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.433436 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/724ddbfe-066b-4314-9b9b-de1647c2747b-node-exporter-accelerators-collector-config\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.433556 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.433498 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/724ddbfe-066b-4314-9b9b-de1647c2747b-metrics-client-ca\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.435170 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.435147 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/724ddbfe-066b-4314-9b9b-de1647c2747b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.441608 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.441548 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69hlt\" (UniqueName: \"kubernetes.io/projected/724ddbfe-066b-4314-9b9b-de1647c2747b-kube-api-access-69hlt\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.938124 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.938093 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/724ddbfe-066b-4314-9b9b-de1647c2747b-node-exporter-tls\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:27.940319 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:27.940301 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/724ddbfe-066b-4314-9b9b-de1647c2747b-node-exporter-tls\") pod \"node-exporter-hmzfb\" (UID: \"724ddbfe-066b-4314-9b9b-de1647c2747b\") " pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:28.101609 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:28.101581 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hmzfb" Apr 20 19:23:28.109389 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:23:28.109364 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod724ddbfe_066b_4314_9b9b_de1647c2747b.slice/crio-311b9507bf8b318321ea3c64c241a69f1e905009725ab1f0ceae5b936d6a3eb1 WatchSource:0}: Error finding container 311b9507bf8b318321ea3c64c241a69f1e905009725ab1f0ceae5b936d6a3eb1: Status 404 returned error can't find the container with id 311b9507bf8b318321ea3c64c241a69f1e905009725ab1f0ceae5b936d6a3eb1 Apr 20 19:23:28.402343 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:28.402300 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hmzfb" event={"ID":"724ddbfe-066b-4314-9b9b-de1647c2747b","Type":"ContainerStarted","Data":"311b9507bf8b318321ea3c64c241a69f1e905009725ab1f0ceae5b936d6a3eb1"} Apr 20 19:23:29.406852 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:29.406812 2564 generic.go:358] "Generic (PLEG): container finished" podID="724ddbfe-066b-4314-9b9b-de1647c2747b" containerID="a6b0bf2e0722118c619deafaecaf5f88c36180ec8d51afe9f66744ed53e03bbd" exitCode=0 Apr 20 19:23:29.407221 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:29.406878 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hmzfb" event={"ID":"724ddbfe-066b-4314-9b9b-de1647c2747b","Type":"ContainerDied","Data":"a6b0bf2e0722118c619deafaecaf5f88c36180ec8d51afe9f66744ed53e03bbd"} Apr 20 19:23:30.412072 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:30.412035 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hmzfb" event={"ID":"724ddbfe-066b-4314-9b9b-de1647c2747b","Type":"ContainerStarted","Data":"f47de89df0a185ed28175428bb4e5f26ba0047e5f909bf9a91f4c52a49cef053"} Apr 20 19:23:30.412072 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:30.412069 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hmzfb" event={"ID":"724ddbfe-066b-4314-9b9b-de1647c2747b","Type":"ContainerStarted","Data":"00b1a95610ed5a2e19ed7273361bde21d8d6fa5a61162191774b13fb5be8c7c7"} Apr 20 19:23:30.430688 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:30.430646 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-hmzfb" podStartSLOduration=2.564919807 podStartE2EDuration="3.430633968s" podCreationTimestamp="2026-04-20 19:23:27 +0000 UTC" firstStartedPulling="2026-04-20 19:23:28.111151129 +0000 UTC m=+171.730833395" lastFinishedPulling="2026-04-20 19:23:28.976865271 +0000 UTC m=+172.596547556" observedRunningTime="2026-04-20 19:23:30.430426303 +0000 UTC m=+174.050108589" watchObservedRunningTime="2026-04-20 19:23:30.430633968 +0000 UTC m=+174.050316256" Apr 20 19:23:31.593437 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.593407 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7d465989d-274nv"] Apr 20 19:23:31.596504 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.596482 2564 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7d465989d-274nv" Apr 20 19:23:31.598851 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.598820 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 20 19:23:31.599939 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.599919 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 20 19:23:31.600093 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.599935 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 20 19:23:31.600093 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.600083 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-hj8xk\"" Apr 20 19:23:31.600093 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.599935 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 20 19:23:31.600285 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.599945 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-b10r91v9o2p4n\"" Apr 20 19:23:31.604642 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.604612 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7d465989d-274nv"] Apr 20 19:23:31.767304 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.767253 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/15a24773-49ae-4109-a012-83724a1e5f19-audit-log\") pod \"metrics-server-7d465989d-274nv\" (UID: \"15a24773-49ae-4109-a012-83724a1e5f19\") " pod="openshift-monitoring/metrics-server-7d465989d-274nv" Apr 20 19:23:31.767304 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.767306 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/15a24773-49ae-4109-a012-83724a1e5f19-secret-metrics-server-client-certs\") pod \"metrics-server-7d465989d-274nv\" (UID: \"15a24773-49ae-4109-a012-83724a1e5f19\") " pod="openshift-monitoring/metrics-server-7d465989d-274nv" Apr 20 19:23:31.767543 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.767401 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/15a24773-49ae-4109-a012-83724a1e5f19-secret-metrics-server-tls\") pod \"metrics-server-7d465989d-274nv\" (UID: \"15a24773-49ae-4109-a012-83724a1e5f19\") " pod="openshift-monitoring/metrics-server-7d465989d-274nv" Apr 20 19:23:31.767543 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.767450 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15a24773-49ae-4109-a012-83724a1e5f19-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7d465989d-274nv\" (UID: \"15a24773-49ae-4109-a012-83724a1e5f19\") " pod="openshift-monitoring/metrics-server-7d465989d-274nv" Apr 20 19:23:31.767543 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.767516 2564 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/15a24773-49ae-4109-a012-83724a1e5f19-metrics-server-audit-profiles\") pod \"metrics-server-7d465989d-274nv\" (UID: \"15a24773-49ae-4109-a012-83724a1e5f19\") " pod="openshift-monitoring/metrics-server-7d465989d-274nv" Apr 20 19:23:31.767678 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.767548 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a24773-49ae-4109-a012-83724a1e5f19-client-ca-bundle\") pod \"metrics-server-7d465989d-274nv\" (UID: \"15a24773-49ae-4109-a012-83724a1e5f19\") " pod="openshift-monitoring/metrics-server-7d465989d-274nv" Apr 20 19:23:31.767678 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.767566 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkfcf\" (UniqueName: \"kubernetes.io/projected/15a24773-49ae-4109-a012-83724a1e5f19-kube-api-access-jkfcf\") pod \"metrics-server-7d465989d-274nv\" (UID: \"15a24773-49ae-4109-a012-83724a1e5f19\") " pod="openshift-monitoring/metrics-server-7d465989d-274nv" Apr 20 19:23:31.868428 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.868322 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/15a24773-49ae-4109-a012-83724a1e5f19-metrics-server-audit-profiles\") pod \"metrics-server-7d465989d-274nv\" (UID: \"15a24773-49ae-4109-a012-83724a1e5f19\") " pod="openshift-monitoring/metrics-server-7d465989d-274nv" Apr 20 19:23:31.868428 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.868375 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a24773-49ae-4109-a012-83724a1e5f19-client-ca-bundle\") pod \"metrics-server-7d465989d-274nv\" (UID: \"15a24773-49ae-4109-a012-83724a1e5f19\") " pod="openshift-monitoring/metrics-server-7d465989d-274nv" Apr 20 19:23:31.868428 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.868429 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jkfcf\" (UniqueName: \"kubernetes.io/projected/15a24773-49ae-4109-a012-83724a1e5f19-kube-api-access-jkfcf\") pod \"metrics-server-7d465989d-274nv\" (UID: \"15a24773-49ae-4109-a012-83724a1e5f19\") " pod="openshift-monitoring/metrics-server-7d465989d-274nv" Apr 20 19:23:31.868698 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.868495 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/15a24773-49ae-4109-a012-83724a1e5f19-audit-log\") pod \"metrics-server-7d465989d-274nv\" (UID: \"15a24773-49ae-4109-a012-83724a1e5f19\") " pod="openshift-monitoring/metrics-server-7d465989d-274nv" Apr 20 19:23:31.868698 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.868520 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/15a24773-49ae-4109-a012-83724a1e5f19-secret-metrics-server-client-certs\") pod \"metrics-server-7d465989d-274nv\" (UID: \"15a24773-49ae-4109-a012-83724a1e5f19\") " pod="openshift-monitoring/metrics-server-7d465989d-274nv" Apr 20 19:23:31.868698 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.868586 2564 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/15a24773-49ae-4109-a012-83724a1e5f19-secret-metrics-server-tls\") pod \"metrics-server-7d465989d-274nv\" (UID: \"15a24773-49ae-4109-a012-83724a1e5f19\") " pod="openshift-monitoring/metrics-server-7d465989d-274nv" Apr 20 19:23:31.868698 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.868658 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15a24773-49ae-4109-a012-83724a1e5f19-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7d465989d-274nv\" (UID: \"15a24773-49ae-4109-a012-83724a1e5f19\") " pod="openshift-monitoring/metrics-server-7d465989d-274nv" Apr 20 19:23:31.868977 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.868955 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/15a24773-49ae-4109-a012-83724a1e5f19-audit-log\") pod \"metrics-server-7d465989d-274nv\" (UID: \"15a24773-49ae-4109-a012-83724a1e5f19\") " pod="openshift-monitoring/metrics-server-7d465989d-274nv" Apr 20 19:23:31.869484 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.869461 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/15a24773-49ae-4109-a012-83724a1e5f19-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7d465989d-274nv\" (UID: \"15a24773-49ae-4109-a012-83724a1e5f19\") " pod="openshift-monitoring/metrics-server-7d465989d-274nv" Apr 20 19:23:31.869741 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.869718 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/15a24773-49ae-4109-a012-83724a1e5f19-metrics-server-audit-profiles\") pod \"metrics-server-7d465989d-274nv\" (UID: \"15a24773-49ae-4109-a012-83724a1e5f19\") " pod="openshift-monitoring/metrics-server-7d465989d-274nv" Apr 20 19:23:31.871055 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.871030 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/15a24773-49ae-4109-a012-83724a1e5f19-secret-metrics-server-tls\") pod \"metrics-server-7d465989d-274nv\" (UID: \"15a24773-49ae-4109-a012-83724a1e5f19\") " pod="openshift-monitoring/metrics-server-7d465989d-274nv" Apr 20 19:23:31.871137 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.871067 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/15a24773-49ae-4109-a012-83724a1e5f19-client-ca-bundle\") pod \"metrics-server-7d465989d-274nv\" (UID: \"15a24773-49ae-4109-a012-83724a1e5f19\") " pod="openshift-monitoring/metrics-server-7d465989d-274nv" Apr 20 19:23:31.871137 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.871077 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/15a24773-49ae-4109-a012-83724a1e5f19-secret-metrics-server-client-certs\") pod \"metrics-server-7d465989d-274nv\" (UID: \"15a24773-49ae-4109-a012-83724a1e5f19\") " pod="openshift-monitoring/metrics-server-7d465989d-274nv" Apr 20 19:23:31.876347 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.876322 2564 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-jkfcf\" (UniqueName: \"kubernetes.io/projected/15a24773-49ae-4109-a012-83724a1e5f19-kube-api-access-jkfcf\") pod \"metrics-server-7d465989d-274nv\" (UID: \"15a24773-49ae-4109-a012-83724a1e5f19\") " pod="openshift-monitoring/metrics-server-7d465989d-274nv" Apr 20 19:23:31.905818 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:31.905778 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7d465989d-274nv" Apr 20 19:23:32.028226 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:32.028197 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7d465989d-274nv"] Apr 20 19:23:32.030905 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:23:32.030875 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15a24773_49ae_4109_a012_83724a1e5f19.slice/crio-240031ae1ab33af30f844a18815c70d4b1201ddcea873ac92b119fbd874e9832 WatchSource:0}: Error finding container 240031ae1ab33af30f844a18815c70d4b1201ddcea873ac92b119fbd874e9832: Status 404 returned error can't find the container with id 240031ae1ab33af30f844a18815c70d4b1201ddcea873ac92b119fbd874e9832 Apr 20 19:23:32.418881 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:32.418838 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7d465989d-274nv" event={"ID":"15a24773-49ae-4109-a012-83724a1e5f19","Type":"ContainerStarted","Data":"240031ae1ab33af30f844a18815c70d4b1201ddcea873ac92b119fbd874e9832"} Apr 20 19:23:34.425547 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:34.425514 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7d465989d-274nv" event={"ID":"15a24773-49ae-4109-a012-83724a1e5f19","Type":"ContainerStarted","Data":"ea493d4f297614142b740243561505c2dd0e8acb27b6b3b0157ddc291dc6551a"} Apr 20 19:23:34.442835 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:34.442795 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7d465989d-274nv" podStartSLOduration=1.968527492 podStartE2EDuration="3.442781561s" podCreationTimestamp="2026-04-20 19:23:31 +0000 UTC" firstStartedPulling="2026-04-20 19:23:32.032864345 +0000 UTC m=+175.652546623" lastFinishedPulling="2026-04-20 19:23:33.507118426 +0000 UTC m=+177.126800692" observedRunningTime="2026-04-20 19:23:34.442717542 +0000 UTC m=+178.062399830" watchObservedRunningTime="2026-04-20 19:23:34.442781561 +0000 UTC m=+178.062463848" Apr 20 19:23:35.429445 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:35.429418 2564 generic.go:358] "Generic (PLEG): container finished" podID="6c0da9e1-70c8-43c7-8d59-6d77918b2994" containerID="7768258c54259be5caa8d8ab82de3b5645fe175b802cb894eafd1255157112c3" exitCode=255 Apr 20 19:23:35.429790 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:35.429501 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c4757895-9f8v8" event={"ID":"6c0da9e1-70c8-43c7-8d59-6d77918b2994","Type":"ContainerDied","Data":"7768258c54259be5caa8d8ab82de3b5645fe175b802cb894eafd1255157112c3"} Apr 20 19:23:35.429790 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:35.429549 2564 scope.go:117] "RemoveContainer" containerID="667530dc3c90f9b55cd9bbd3a1919572a87efe3262ea9ca6445c8879008c88d6" Apr 20 19:23:35.429934 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:35.429918 2564 
scope.go:117] "RemoveContainer" containerID="7768258c54259be5caa8d8ab82de3b5645fe175b802cb894eafd1255157112c3" Apr 20 19:23:35.430125 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:23:35.430105 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"addon-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=addon-agent pod=managed-serviceaccount-addon-agent-75c4757895-9f8v8_open-cluster-management-agent-addon(6c0da9e1-70c8-43c7-8d59-6d77918b2994)\"" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c4757895-9f8v8" podUID="6c0da9e1-70c8-43c7-8d59-6d77918b2994" Apr 20 19:23:37.405063 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:37.405032 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-ps5gm" Apr 20 19:23:40.292681 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:40.292644 2564 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c4757895-9f8v8" Apr 20 19:23:40.293064 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:40.292988 2564 scope.go:117] "RemoveContainer" containerID="7768258c54259be5caa8d8ab82de3b5645fe175b802cb894eafd1255157112c3" Apr 20 19:23:40.293198 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:23:40.293181 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"addon-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=addon-agent pod=managed-serviceaccount-addon-agent-75c4757895-9f8v8_open-cluster-management-agent-addon(6c0da9e1-70c8-43c7-8d59-6d77918b2994)\"" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c4757895-9f8v8" podUID="6c0da9e1-70c8-43c7-8d59-6d77918b2994" Apr 20 19:23:51.881287 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:51.881257 2564 scope.go:117] "RemoveContainer" containerID="7768258c54259be5caa8d8ab82de3b5645fe175b802cb894eafd1255157112c3" Apr 20 19:23:51.906696 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:51.906673 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7d465989d-274nv" Apr 20 19:23:51.906784 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:51.906704 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7d465989d-274nv" Apr 20 19:23:52.475928 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:23:52.475897 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-75c4757895-9f8v8" event={"ID":"6c0da9e1-70c8-43c7-8d59-6d77918b2994","Type":"ContainerStarted","Data":"13ccc42d2da3403ac6824311d8e4390882dac30974b61c18d1bbf16315afefc2"} Apr 20 19:24:05.198136 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:24:05.198105 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7d465989d-274nv_15a24773-49ae-4109-a012-83724a1e5f19/metrics-server/0.log" Apr 20 19:24:05.593026 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:24:05.592983 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hmzfb_724ddbfe-066b-4314-9b9b-de1647c2747b/init-textfile/0.log" Apr 20 19:24:05.793577 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:24:05.793551 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_node-exporter-hmzfb_724ddbfe-066b-4314-9b9b-de1647c2747b/node-exporter/0.log" Apr 20 19:24:05.993272 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:24:05.993202 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hmzfb_724ddbfe-066b-4314-9b9b-de1647c2747b/kube-rbac-proxy/0.log" Apr 20 19:24:09.394822 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:24:09.394790 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-m9nwj_15d7f65a-133b-4803-9e61-e6081374a2ef/prometheus-operator/0.log" Apr 20 19:24:09.591922 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:24:09.591889 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-m9nwj_15d7f65a-133b-4803-9e61-e6081374a2ef/kube-rbac-proxy/0.log" Apr 20 19:24:11.911822 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:24:11.911792 2564 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7d465989d-274nv" Apr 20 19:24:11.915712 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:24:11.915690 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7d465989d-274nv" Apr 20 19:24:12.793134 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:24:12.793103 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8m7qk_5ca54d9a-af5f-4f4f-b135-bfc6c5824e75/serve-healthcheck-canary/0.log" Apr 20 19:24:20.301273 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:24:20.301232 2564 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr" podUID="b5f6c98a-2f0a-404a-a009-b3898d447543" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 19:24:30.300725 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:24:30.300684 2564 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr" podUID="b5f6c98a-2f0a-404a-a009-b3898d447543" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 19:24:40.301132 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:24:40.301080 2564 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr" podUID="b5f6c98a-2f0a-404a-a009-b3898d447543" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 19:24:40.301690 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:24:40.301174 2564 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr" Apr 20 19:24:40.301822 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:24:40.301786 2564 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"d394cdfa71589310a7a527e27aa1244a5177c4739266c8ae94ebde290e2dc4ff"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 20 19:24:40.301886 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:24:40.301866 2564 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr" podUID="b5f6c98a-2f0a-404a-a009-b3898d447543" containerName="service-proxy" containerID="cri-o://d394cdfa71589310a7a527e27aa1244a5177c4739266c8ae94ebde290e2dc4ff" gracePeriod=30 Apr 20 19:24:40.598503 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:24:40.598425 2564 generic.go:358] "Generic (PLEG): container finished" podID="b5f6c98a-2f0a-404a-a009-b3898d447543" containerID="d394cdfa71589310a7a527e27aa1244a5177c4739266c8ae94ebde290e2dc4ff" exitCode=2 Apr 20 19:24:40.598503 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:24:40.598448 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr" event={"ID":"b5f6c98a-2f0a-404a-a009-b3898d447543","Type":"ContainerDied","Data":"d394cdfa71589310a7a527e27aa1244a5177c4739266c8ae94ebde290e2dc4ff"} Apr 20 19:24:40.598503 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:24:40.598489 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-7b996cb9-xcqdr" event={"ID":"b5f6c98a-2f0a-404a-a009-b3898d447543","Type":"ContainerStarted","Data":"6ab346e62fa2b17bcc5ee1bc0522d54005272ecd2cb8ef285c53dc3d53b65f22"} Apr 20 19:24:48.609799 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:24:48.609757 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs\") pod \"network-metrics-daemon-zdlvd\" (UID: \"9b4f6ab3-fddd-446f-8cbf-e372e1b901fe\") " pod="openshift-multus/network-metrics-daemon-zdlvd" Apr 20 19:24:48.612259 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:24:48.612236 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9b4f6ab3-fddd-446f-8cbf-e372e1b901fe-metrics-certs\") pod \"network-metrics-daemon-zdlvd\" (UID: \"9b4f6ab3-fddd-446f-8cbf-e372e1b901fe\") " pod="openshift-multus/network-metrics-daemon-zdlvd" Apr 20 19:24:48.788755 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:24:48.788727 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-gxr6p\"" Apr 20 19:24:48.796670 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:24:48.796650 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zdlvd" Apr 20 19:24:48.910845 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:24:48.910816 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zdlvd"] Apr 20 19:24:48.913901 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:24:48.913872 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b4f6ab3_fddd_446f_8cbf_e372e1b901fe.slice/crio-a678bf38a41b7f82a75fb6f8f76c30629b002be72267722c744d8216900dafe3 WatchSource:0}: Error finding container a678bf38a41b7f82a75fb6f8f76c30629b002be72267722c744d8216900dafe3: Status 404 returned error can't find the container with id a678bf38a41b7f82a75fb6f8f76c30629b002be72267722c744d8216900dafe3 Apr 20 19:24:49.622989 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:24:49.622937 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zdlvd" event={"ID":"9b4f6ab3-fddd-446f-8cbf-e372e1b901fe","Type":"ContainerStarted","Data":"a678bf38a41b7f82a75fb6f8f76c30629b002be72267722c744d8216900dafe3"} Apr 20 19:24:50.628674 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:24:50.628639 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zdlvd" event={"ID":"9b4f6ab3-fddd-446f-8cbf-e372e1b901fe","Type":"ContainerStarted","Data":"f1b8ffd010b14c0356a003f318b5895b4266771a4936a295ff5fc48a7d46f943"} Apr 20 19:24:50.628674 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:24:50.628673 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zdlvd" event={"ID":"9b4f6ab3-fddd-446f-8cbf-e372e1b901fe","Type":"ContainerStarted","Data":"1c43d843f225d8bf53600278d4ad1d054a4d80b0dd7139474f289614d6a25e30"} Apr 20 19:24:50.646224 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:24:50.646162 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-zdlvd" podStartSLOduration=253.462669529 podStartE2EDuration="4m14.646144177s" podCreationTimestamp="2026-04-20 19:20:36 +0000 UTC" firstStartedPulling="2026-04-20 19:24:48.915648003 +0000 UTC m=+252.535330268" lastFinishedPulling="2026-04-20 19:24:50.09912265 +0000 UTC m=+253.718804916" observedRunningTime="2026-04-20 19:24:50.644869486 +0000 UTC m=+254.264551787" watchObservedRunningTime="2026-04-20 19:24:50.646144177 +0000 UTC m=+254.265826466" Apr 20 19:25:36.768167 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:25:36.768135 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5z9f_e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7/ovn-acl-logging/0.log" Apr 20 19:25:36.768935 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:25:36.768914 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5z9f_e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7/ovn-acl-logging/0.log" Apr 20 19:25:36.771873 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:25:36.771845 2564 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 19:26:59.887695 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:26:59.887662 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-xszmt"] Apr 20 19:26:59.889478 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:26:59.889462 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-xszmt" Apr 20 19:26:59.892262 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:26:59.892246 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 19:26:59.898596 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:26:59.898571 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xszmt"] Apr 20 19:26:59.944393 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:26:59.944374 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3875e401-7a70-4f30-84ea-3a7119c9272a-dbus\") pod \"global-pull-secret-syncer-xszmt\" (UID: \"3875e401-7a70-4f30-84ea-3a7119c9272a\") " pod="kube-system/global-pull-secret-syncer-xszmt" Apr 20 19:26:59.944511 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:26:59.944411 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3875e401-7a70-4f30-84ea-3a7119c9272a-kubelet-config\") pod \"global-pull-secret-syncer-xszmt\" (UID: \"3875e401-7a70-4f30-84ea-3a7119c9272a\") " pod="kube-system/global-pull-secret-syncer-xszmt" Apr 20 19:26:59.944511 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:26:59.944444 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3875e401-7a70-4f30-84ea-3a7119c9272a-original-pull-secret\") pod \"global-pull-secret-syncer-xszmt\" (UID: \"3875e401-7a70-4f30-84ea-3a7119c9272a\") " pod="kube-system/global-pull-secret-syncer-xszmt" Apr 20 19:27:00.045317 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:27:00.045279 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3875e401-7a70-4f30-84ea-3a7119c9272a-dbus\") pod \"global-pull-secret-syncer-xszmt\" (UID: \"3875e401-7a70-4f30-84ea-3a7119c9272a\") " pod="kube-system/global-pull-secret-syncer-xszmt" Apr 20 19:27:00.045432 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:27:00.045337 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3875e401-7a70-4f30-84ea-3a7119c9272a-kubelet-config\") pod \"global-pull-secret-syncer-xszmt\" (UID: \"3875e401-7a70-4f30-84ea-3a7119c9272a\") " pod="kube-system/global-pull-secret-syncer-xszmt" Apr 20 19:27:00.045432 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:27:00.045395 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3875e401-7a70-4f30-84ea-3a7119c9272a-original-pull-secret\") pod \"global-pull-secret-syncer-xszmt\" (UID: \"3875e401-7a70-4f30-84ea-3a7119c9272a\") " pod="kube-system/global-pull-secret-syncer-xszmt" Apr 20 19:27:00.045501 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:27:00.045481 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3875e401-7a70-4f30-84ea-3a7119c9272a-kubelet-config\") pod \"global-pull-secret-syncer-xszmt\" (UID: \"3875e401-7a70-4f30-84ea-3a7119c9272a\") " pod="kube-system/global-pull-secret-syncer-xszmt" Apr 20 19:27:00.045534 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:27:00.045497 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"dbus\" (UniqueName: \"kubernetes.io/host-path/3875e401-7a70-4f30-84ea-3a7119c9272a-dbus\") pod \"global-pull-secret-syncer-xszmt\" (UID: \"3875e401-7a70-4f30-84ea-3a7119c9272a\") " pod="kube-system/global-pull-secret-syncer-xszmt" Apr 20 19:27:00.047587 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:27:00.047570 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3875e401-7a70-4f30-84ea-3a7119c9272a-original-pull-secret\") pod \"global-pull-secret-syncer-xszmt\" (UID: \"3875e401-7a70-4f30-84ea-3a7119c9272a\") " pod="kube-system/global-pull-secret-syncer-xszmt" Apr 20 19:27:00.198986 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:27:00.198904 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xszmt" Apr 20 19:27:00.311336 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:27:00.311309 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xszmt"] Apr 20 19:27:00.314120 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:27:00.314098 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3875e401_7a70_4f30_84ea_3a7119c9272a.slice/crio-c4c502c1b66ef5ba41f9a8f563d216d39facec0a6212e23b92e44773be031736 WatchSource:0}: Error finding container c4c502c1b66ef5ba41f9a8f563d216d39facec0a6212e23b92e44773be031736: Status 404 returned error can't find the container with id c4c502c1b66ef5ba41f9a8f563d216d39facec0a6212e23b92e44773be031736 Apr 20 19:27:00.315669 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:27:00.315653 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 19:27:00.963620 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:27:00.963580 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xszmt" event={"ID":"3875e401-7a70-4f30-84ea-3a7119c9272a","Type":"ContainerStarted","Data":"c4c502c1b66ef5ba41f9a8f563d216d39facec0a6212e23b92e44773be031736"} Apr 20 19:27:04.975363 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:27:04.975324 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xszmt" event={"ID":"3875e401-7a70-4f30-84ea-3a7119c9272a","Type":"ContainerStarted","Data":"cbda9a72ece2a1259d123005a7b059f7b85bafbb6a1346613e644ae61b34fed2"} Apr 20 19:27:04.994611 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:27:04.994562 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-xszmt" podStartSLOduration=2.146698551 podStartE2EDuration="5.994548048s" podCreationTimestamp="2026-04-20 19:26:59 +0000 UTC" firstStartedPulling="2026-04-20 19:27:00.315776217 +0000 UTC m=+383.935458481" lastFinishedPulling="2026-04-20 19:27:04.163625701 +0000 UTC m=+387.783307978" observedRunningTime="2026-04-20 19:27:04.993266984 +0000 UTC m=+388.612949273" watchObservedRunningTime="2026-04-20 19:27:04.994548048 +0000 UTC m=+388.614230335" Apr 20 19:28:01.594686 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:01.594611 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-hcfv5"] Apr 20 19:28:01.596437 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:01.596421 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-hcfv5" Apr 20 19:28:01.598874 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:01.598850 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 19:28:01.598958 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:01.598872 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:28:01.599020 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:01.598979 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-hw4nn\"" Apr 20 19:28:01.604900 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:01.604879 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-hcfv5"] Apr 20 19:28:01.760173 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:01.760144 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhn9d\" (UniqueName: \"kubernetes.io/projected/a1e35ae5-5204-49e5-9d53-789b656366d2-kube-api-access-nhn9d\") pod \"openshift-lws-operator-bfc7f696d-hcfv5\" (UID: \"a1e35ae5-5204-49e5-9d53-789b656366d2\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-hcfv5" Apr 20 19:28:01.760329 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:01.760186 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a1e35ae5-5204-49e5-9d53-789b656366d2-tmp\") pod \"openshift-lws-operator-bfc7f696d-hcfv5\" (UID: \"a1e35ae5-5204-49e5-9d53-789b656366d2\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-hcfv5" Apr 20 19:28:01.861495 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:01.861411 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a1e35ae5-5204-49e5-9d53-789b656366d2-tmp\") pod \"openshift-lws-operator-bfc7f696d-hcfv5\" (UID: \"a1e35ae5-5204-49e5-9d53-789b656366d2\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-hcfv5" Apr 20 19:28:01.861495 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:01.861476 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhn9d\" (UniqueName: \"kubernetes.io/projected/a1e35ae5-5204-49e5-9d53-789b656366d2-kube-api-access-nhn9d\") pod \"openshift-lws-operator-bfc7f696d-hcfv5\" (UID: \"a1e35ae5-5204-49e5-9d53-789b656366d2\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-hcfv5" Apr 20 19:28:01.861775 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:01.861756 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a1e35ae5-5204-49e5-9d53-789b656366d2-tmp\") pod \"openshift-lws-operator-bfc7f696d-hcfv5\" (UID: \"a1e35ae5-5204-49e5-9d53-789b656366d2\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-hcfv5" Apr 20 19:28:01.869158 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:01.869129 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhn9d\" (UniqueName: \"kubernetes.io/projected/a1e35ae5-5204-49e5-9d53-789b656366d2-kube-api-access-nhn9d\") pod \"openshift-lws-operator-bfc7f696d-hcfv5\" (UID: \"a1e35ae5-5204-49e5-9d53-789b656366d2\") " 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-hcfv5" Apr 20 19:28:01.905213 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:01.905189 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-hcfv5" Apr 20 19:28:02.024965 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:02.024936 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-hcfv5"] Apr 20 19:28:02.028004 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:28:02.027964 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda1e35ae5_5204_49e5_9d53_789b656366d2.slice/crio-94303a6c038666dd1dd5ad236a8dc3c367a10d1c1ab0083a8f8c875d27c2250b WatchSource:0}: Error finding container 94303a6c038666dd1dd5ad236a8dc3c367a10d1c1ab0083a8f8c875d27c2250b: Status 404 returned error can't find the container with id 94303a6c038666dd1dd5ad236a8dc3c367a10d1c1ab0083a8f8c875d27c2250b Apr 20 19:28:02.117895 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:02.117829 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-hcfv5" event={"ID":"a1e35ae5-5204-49e5-9d53-789b656366d2","Type":"ContainerStarted","Data":"94303a6c038666dd1dd5ad236a8dc3c367a10d1c1ab0083a8f8c875d27c2250b"} Apr 20 19:28:06.131087 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:06.130981 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-hcfv5" event={"ID":"a1e35ae5-5204-49e5-9d53-789b656366d2","Type":"ContainerStarted","Data":"365b5456be357e6d4665b1f95c8b58d5adcf577cb8fc2bf09c1ddf886b7dc9b9"} Apr 20 19:28:06.149438 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:06.149392 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-hcfv5" podStartSLOduration=1.300244582 podStartE2EDuration="5.149378127s" podCreationTimestamp="2026-04-20 19:28:01 +0000 UTC" firstStartedPulling="2026-04-20 19:28:02.029349411 +0000 UTC m=+445.649031676" lastFinishedPulling="2026-04-20 19:28:05.878482953 +0000 UTC m=+449.498165221" observedRunningTime="2026-04-20 19:28:06.147441753 +0000 UTC m=+449.767124042" watchObservedRunningTime="2026-04-20 19:28:06.149378127 +0000 UTC m=+449.769060413" Apr 20 19:28:21.279575 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:21.279534 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7875d57869-vt2rf"] Apr 20 19:28:21.282761 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:21.282727 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-vt2rf" Apr 20 19:28:21.285593 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:21.285573 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-pkszx\"" Apr 20 19:28:21.285778 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:21.285759 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 19:28:21.285955 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:21.285944 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 19:28:21.286059 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:21.286037 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 19:28:21.286334 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:21.286315 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 19:28:21.297673 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:21.297645 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7875d57869-vt2rf"] Apr 20 19:28:21.383783 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:21.383755 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bbfc9f71-287b-42f9-a950-0903e2b4cd98-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7875d57869-vt2rf\" (UID: \"bbfc9f71-287b-42f9-a950-0903e2b4cd98\") " pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-vt2rf" Apr 20 19:28:21.383889 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:21.383791 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bbfc9f71-287b-42f9-a950-0903e2b4cd98-webhook-cert\") pod \"opendatahub-operator-controller-manager-7875d57869-vt2rf\" (UID: \"bbfc9f71-287b-42f9-a950-0903e2b4cd98\") " pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-vt2rf" Apr 20 19:28:21.383950 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:21.383885 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sl9q2\" (UniqueName: \"kubernetes.io/projected/bbfc9f71-287b-42f9-a950-0903e2b4cd98-kube-api-access-sl9q2\") pod \"opendatahub-operator-controller-manager-7875d57869-vt2rf\" (UID: \"bbfc9f71-287b-42f9-a950-0903e2b4cd98\") " pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-vt2rf" Apr 20 19:28:21.484451 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:21.484427 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sl9q2\" (UniqueName: \"kubernetes.io/projected/bbfc9f71-287b-42f9-a950-0903e2b4cd98-kube-api-access-sl9q2\") pod \"opendatahub-operator-controller-manager-7875d57869-vt2rf\" (UID: \"bbfc9f71-287b-42f9-a950-0903e2b4cd98\") " pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-vt2rf" Apr 20 19:28:21.484539 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:21.484461 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/bbfc9f71-287b-42f9-a950-0903e2b4cd98-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7875d57869-vt2rf\" (UID: \"bbfc9f71-287b-42f9-a950-0903e2b4cd98\") " pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-vt2rf" Apr 20 19:28:21.484539 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:21.484480 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bbfc9f71-287b-42f9-a950-0903e2b4cd98-webhook-cert\") pod \"opendatahub-operator-controller-manager-7875d57869-vt2rf\" (UID: \"bbfc9f71-287b-42f9-a950-0903e2b4cd98\") " pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-vt2rf" Apr 20 19:28:21.486985 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:21.486963 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bbfc9f71-287b-42f9-a950-0903e2b4cd98-webhook-cert\") pod \"opendatahub-operator-controller-manager-7875d57869-vt2rf\" (UID: \"bbfc9f71-287b-42f9-a950-0903e2b4cd98\") " pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-vt2rf" Apr 20 19:28:21.487097 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:21.487056 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bbfc9f71-287b-42f9-a950-0903e2b4cd98-apiservice-cert\") pod \"opendatahub-operator-controller-manager-7875d57869-vt2rf\" (UID: \"bbfc9f71-287b-42f9-a950-0903e2b4cd98\") " pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-vt2rf" Apr 20 19:28:21.493460 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:21.493439 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sl9q2\" (UniqueName: \"kubernetes.io/projected/bbfc9f71-287b-42f9-a950-0903e2b4cd98-kube-api-access-sl9q2\") pod \"opendatahub-operator-controller-manager-7875d57869-vt2rf\" (UID: \"bbfc9f71-287b-42f9-a950-0903e2b4cd98\") " pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-vt2rf" Apr 20 19:28:21.592241 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:21.592220 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-vt2rf" Apr 20 19:28:21.715071 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:21.715043 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-7875d57869-vt2rf"] Apr 20 19:28:21.718424 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:28:21.718395 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbfc9f71_287b_42f9_a950_0903e2b4cd98.slice/crio-1df376f0f303f3db4139293ae9efdbcf65bd59512bb6d48af81a6d261d220e98 WatchSource:0}: Error finding container 1df376f0f303f3db4139293ae9efdbcf65bd59512bb6d48af81a6d261d220e98: Status 404 returned error can't find the container with id 1df376f0f303f3db4139293ae9efdbcf65bd59512bb6d48af81a6d261d220e98 Apr 20 19:28:22.174335 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:22.174293 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-vt2rf" event={"ID":"bbfc9f71-287b-42f9-a950-0903e2b4cd98","Type":"ContainerStarted","Data":"1df376f0f303f3db4139293ae9efdbcf65bd59512bb6d48af81a6d261d220e98"} Apr 20 19:28:24.181287 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:24.181253 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-vt2rf" event={"ID":"bbfc9f71-287b-42f9-a950-0903e2b4cd98","Type":"ContainerStarted","Data":"ffe5fecc25a169045961eb730b99aab5f1bf85b912dfd94f898c3aecbef9d8a6"} Apr 20 19:28:24.181584 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:24.181396 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-vt2rf" Apr 20 19:28:24.205836 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:24.205788 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-vt2rf" podStartSLOduration=0.817619785 podStartE2EDuration="3.205774868s" podCreationTimestamp="2026-04-20 19:28:21 +0000 UTC" firstStartedPulling="2026-04-20 19:28:21.720139639 +0000 UTC m=+465.339821904" lastFinishedPulling="2026-04-20 19:28:24.108294711 +0000 UTC m=+467.727976987" observedRunningTime="2026-04-20 19:28:24.202838768 +0000 UTC m=+467.822521055" watchObservedRunningTime="2026-04-20 19:28:24.205774868 +0000 UTC m=+467.825457155" Apr 20 19:28:35.186835 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:35.186803 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-7875d57869-vt2rf" Apr 20 19:28:39.595338 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:39.595301 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-65b68d668c-qqcmw"] Apr 20 19:28:39.602357 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:39.602330 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-65b68d668c-qqcmw" Apr 20 19:28:39.604762 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:39.604742 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 19:28:39.605767 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:39.605738 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-s5rpw\"" Apr 20 19:28:39.605869 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:39.605809 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 19:28:39.605869 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:39.605812 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 20 19:28:39.606432 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:39.606409 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 20 19:28:39.608166 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:39.608147 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-65b68d668c-qqcmw"] Apr 20 19:28:39.713040 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:39.712979 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b637c838-b115-4a7f-8513-b59711277667-tmp\") pod \"kube-auth-proxy-65b68d668c-qqcmw\" (UID: \"b637c838-b115-4a7f-8513-b59711277667\") " pod="openshift-ingress/kube-auth-proxy-65b68d668c-qqcmw" Apr 20 19:28:39.713040 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:39.713046 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plknw\" (UniqueName: \"kubernetes.io/projected/b637c838-b115-4a7f-8513-b59711277667-kube-api-access-plknw\") pod \"kube-auth-proxy-65b68d668c-qqcmw\" (UID: \"b637c838-b115-4a7f-8513-b59711277667\") " pod="openshift-ingress/kube-auth-proxy-65b68d668c-qqcmw" Apr 20 19:28:39.713275 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:39.713083 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b637c838-b115-4a7f-8513-b59711277667-tls-certs\") pod \"kube-auth-proxy-65b68d668c-qqcmw\" (UID: \"b637c838-b115-4a7f-8513-b59711277667\") " pod="openshift-ingress/kube-auth-proxy-65b68d668c-qqcmw" Apr 20 19:28:39.814414 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:39.814381 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b637c838-b115-4a7f-8513-b59711277667-tls-certs\") pod \"kube-auth-proxy-65b68d668c-qqcmw\" (UID: \"b637c838-b115-4a7f-8513-b59711277667\") " pod="openshift-ingress/kube-auth-proxy-65b68d668c-qqcmw" Apr 20 19:28:39.814537 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:39.814441 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b637c838-b115-4a7f-8513-b59711277667-tmp\") pod \"kube-auth-proxy-65b68d668c-qqcmw\" (UID: \"b637c838-b115-4a7f-8513-b59711277667\") " pod="openshift-ingress/kube-auth-proxy-65b68d668c-qqcmw" Apr 20 19:28:39.814537 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:39.814465 2564 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-plknw\" (UniqueName: \"kubernetes.io/projected/b637c838-b115-4a7f-8513-b59711277667-kube-api-access-plknw\") pod \"kube-auth-proxy-65b68d668c-qqcmw\" (UID: \"b637c838-b115-4a7f-8513-b59711277667\") " pod="openshift-ingress/kube-auth-proxy-65b68d668c-qqcmw" Apr 20 19:28:39.816684 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:39.816662 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/b637c838-b115-4a7f-8513-b59711277667-tmp\") pod \"kube-auth-proxy-65b68d668c-qqcmw\" (UID: \"b637c838-b115-4a7f-8513-b59711277667\") " pod="openshift-ingress/kube-auth-proxy-65b68d668c-qqcmw" Apr 20 19:28:39.816910 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:39.816888 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b637c838-b115-4a7f-8513-b59711277667-tls-certs\") pod \"kube-auth-proxy-65b68d668c-qqcmw\" (UID: \"b637c838-b115-4a7f-8513-b59711277667\") " pod="openshift-ingress/kube-auth-proxy-65b68d668c-qqcmw" Apr 20 19:28:39.822924 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:39.822900 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-plknw\" (UniqueName: \"kubernetes.io/projected/b637c838-b115-4a7f-8513-b59711277667-kube-api-access-plknw\") pod \"kube-auth-proxy-65b68d668c-qqcmw\" (UID: \"b637c838-b115-4a7f-8513-b59711277667\") " pod="openshift-ingress/kube-auth-proxy-65b68d668c-qqcmw" Apr 20 19:28:39.912345 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:39.912269 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-65b68d668c-qqcmw" Apr 20 19:28:40.032147 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:40.032119 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-65b68d668c-qqcmw"] Apr 20 19:28:40.032653 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:28:40.032620 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb637c838_b115_4a7f_8513_b59711277667.slice/crio-67ee930f36cc5db33c6d49b962957c50a0565e152a847a4dab25383a4ec32d08 WatchSource:0}: Error finding container 67ee930f36cc5db33c6d49b962957c50a0565e152a847a4dab25383a4ec32d08: Status 404 returned error can't find the container with id 67ee930f36cc5db33c6d49b962957c50a0565e152a847a4dab25383a4ec32d08 Apr 20 19:28:40.226347 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:40.226273 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-65b68d668c-qqcmw" event={"ID":"b637c838-b115-4a7f-8513-b59711277667","Type":"ContainerStarted","Data":"67ee930f36cc5db33c6d49b962957c50a0565e152a847a4dab25383a4ec32d08"} Apr 20 19:28:42.289486 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:42.289444 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-pxsf7"] Apr 20 19:28:42.292575 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:42.292554 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-pxsf7" Apr 20 19:28:42.295170 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:42.295149 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-lbwwn\"" Apr 20 19:28:42.295686 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:42.295670 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 20 19:28:42.303254 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:42.303227 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-pxsf7"] Apr 20 19:28:42.333978 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:42.333942 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f1ce413-5690-4f7b-b12c-bf1b2d1805be-cert\") pod \"odh-model-controller-858dbf95b8-pxsf7\" (UID: \"6f1ce413-5690-4f7b-b12c-bf1b2d1805be\") " pod="opendatahub/odh-model-controller-858dbf95b8-pxsf7" Apr 20 19:28:42.334184 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:42.334111 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xw4kn\" (UniqueName: \"kubernetes.io/projected/6f1ce413-5690-4f7b-b12c-bf1b2d1805be-kube-api-access-xw4kn\") pod \"odh-model-controller-858dbf95b8-pxsf7\" (UID: \"6f1ce413-5690-4f7b-b12c-bf1b2d1805be\") " pod="opendatahub/odh-model-controller-858dbf95b8-pxsf7" Apr 20 19:28:42.435409 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:42.435378 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xw4kn\" (UniqueName: \"kubernetes.io/projected/6f1ce413-5690-4f7b-b12c-bf1b2d1805be-kube-api-access-xw4kn\") pod \"odh-model-controller-858dbf95b8-pxsf7\" (UID: \"6f1ce413-5690-4f7b-b12c-bf1b2d1805be\") " pod="opendatahub/odh-model-controller-858dbf95b8-pxsf7" Apr 20 19:28:42.435567 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:42.435415 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f1ce413-5690-4f7b-b12c-bf1b2d1805be-cert\") pod \"odh-model-controller-858dbf95b8-pxsf7\" (UID: \"6f1ce413-5690-4f7b-b12c-bf1b2d1805be\") " pod="opendatahub/odh-model-controller-858dbf95b8-pxsf7" Apr 20 19:28:42.435567 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:28:42.435529 2564 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 20 19:28:42.435673 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:28:42.435583 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6f1ce413-5690-4f7b-b12c-bf1b2d1805be-cert podName:6f1ce413-5690-4f7b-b12c-bf1b2d1805be nodeName:}" failed. No retries permitted until 2026-04-20 19:28:42.935566567 +0000 UTC m=+486.555248832 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6f1ce413-5690-4f7b-b12c-bf1b2d1805be-cert") pod "odh-model-controller-858dbf95b8-pxsf7" (UID: "6f1ce413-5690-4f7b-b12c-bf1b2d1805be") : secret "odh-model-controller-webhook-cert" not found Apr 20 19:28:42.447337 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:42.447305 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xw4kn\" (UniqueName: \"kubernetes.io/projected/6f1ce413-5690-4f7b-b12c-bf1b2d1805be-kube-api-access-xw4kn\") pod \"odh-model-controller-858dbf95b8-pxsf7\" (UID: \"6f1ce413-5690-4f7b-b12c-bf1b2d1805be\") " pod="opendatahub/odh-model-controller-858dbf95b8-pxsf7" Apr 20 19:28:42.939961 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:42.939926 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f1ce413-5690-4f7b-b12c-bf1b2d1805be-cert\") pod \"odh-model-controller-858dbf95b8-pxsf7\" (UID: \"6f1ce413-5690-4f7b-b12c-bf1b2d1805be\") " pod="opendatahub/odh-model-controller-858dbf95b8-pxsf7" Apr 20 19:28:42.943122 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:42.943092 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6f1ce413-5690-4f7b-b12c-bf1b2d1805be-cert\") pod \"odh-model-controller-858dbf95b8-pxsf7\" (UID: \"6f1ce413-5690-4f7b-b12c-bf1b2d1805be\") " pod="opendatahub/odh-model-controller-858dbf95b8-pxsf7" Apr 20 19:28:43.205060 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:43.204961 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-pxsf7" Apr 20 19:28:43.611854 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:43.611740 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-pxsf7"] Apr 20 19:28:43.614247 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:28:43.614220 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f1ce413_5690_4f7b_b12c_bf1b2d1805be.slice/crio-961eef7d80d8343bee0d2e3be65c63a761398fa0299d462c3bb9c8ed2cc9422d WatchSource:0}: Error finding container 961eef7d80d8343bee0d2e3be65c63a761398fa0299d462c3bb9c8ed2cc9422d: Status 404 returned error can't find the container with id 961eef7d80d8343bee0d2e3be65c63a761398fa0299d462c3bb9c8ed2cc9422d Apr 20 19:28:44.247043 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:44.246981 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-pxsf7" event={"ID":"6f1ce413-5690-4f7b-b12c-bf1b2d1805be","Type":"ContainerStarted","Data":"961eef7d80d8343bee0d2e3be65c63a761398fa0299d462c3bb9c8ed2cc9422d"} Apr 20 19:28:44.249185 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:44.249130 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-65b68d668c-qqcmw" event={"ID":"b637c838-b115-4a7f-8513-b59711277667","Type":"ContainerStarted","Data":"0fe061cbbc52433c9d27531a673c3a260bda2e48d7c8423f972ce75e0a8fbf86"} Apr 20 19:28:44.267838 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:44.266353 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-65b68d668c-qqcmw" podStartSLOduration=1.760474758 podStartE2EDuration="5.266337121s" podCreationTimestamp="2026-04-20 19:28:39 +0000 UTC" firstStartedPulling="2026-04-20 19:28:40.035235254 +0000 UTC 
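
The 19:28:42.435 failure above shows how the kubelet handles a volume whose backing secret does not exist yet: the SetUp attempt fails, and nestedpendingoperations schedules a retry with exponential backoff, starting at the durationBeforeRetry of 500ms seen in the entry. Once odh-model-controller-webhook-cert appears, the retried mount succeeds at 19:28:42.943. A sketch of that schedule; the doubling and the roughly-two-minute cap mirror upstream kubelet defaults and are assumptions here:

    package main

    import (
        "fmt"
        "time"
    )

    // retryDelay is an illustrative schedule: 500ms after the first failure,
    // doubling per subsequent failure. Only the initial 500ms is visible in
    // this log; the ~2m cap is an assumed upstream default.
    func retryDelay(failures int) time.Duration {
        d := 500 * time.Millisecond
        for i := 1; i < failures; i++ {
            d *= 2
            if max := 2*time.Minute + 2*time.Second; d > max {
                return max
            }
        }
        return d
    }

    func main() {
        for n := 1; n <= 6; n++ {
            fmt.Printf("after failure %d: retry in %v\n", n, retryDelay(n))
        }
        // after failure 1: retry in 500ms  <- the durationBeforeRetry 500ms above
    }
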
m=+483.654917519" lastFinishedPulling="2026-04-20 19:28:43.5410976 +0000 UTC m=+487.160779882" observedRunningTime="2026-04-20 19:28:44.2661964 +0000 UTC m=+487.885878688" watchObservedRunningTime="2026-04-20 19:28:44.266337121 +0000 UTC m=+487.886019410" Apr 20 19:28:47.259440 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:47.259401 2564 generic.go:358] "Generic (PLEG): container finished" podID="6f1ce413-5690-4f7b-b12c-bf1b2d1805be" containerID="266388d151cb4dabc278b76a650356b47a63dbf25ffbd989958c3dba67c53ad1" exitCode=1 Apr 20 19:28:47.259843 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:47.259485 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-pxsf7" event={"ID":"6f1ce413-5690-4f7b-b12c-bf1b2d1805be","Type":"ContainerDied","Data":"266388d151cb4dabc278b76a650356b47a63dbf25ffbd989958c3dba67c53ad1"} Apr 20 19:28:47.259843 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:47.259697 2564 scope.go:117] "RemoveContainer" containerID="266388d151cb4dabc278b76a650356b47a63dbf25ffbd989958c3dba67c53ad1" Apr 20 19:28:48.266256 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:48.266223 2564 generic.go:358] "Generic (PLEG): container finished" podID="6f1ce413-5690-4f7b-b12c-bf1b2d1805be" containerID="f244721b8a15a643cb6bfda32a43e3a901fd2ecce16fdf49e2462a16d9356c84" exitCode=1 Apr 20 19:28:48.266685 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:48.266304 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-pxsf7" event={"ID":"6f1ce413-5690-4f7b-b12c-bf1b2d1805be","Type":"ContainerDied","Data":"f244721b8a15a643cb6bfda32a43e3a901fd2ecce16fdf49e2462a16d9356c84"} Apr 20 19:28:48.266685 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:48.266359 2564 scope.go:117] "RemoveContainer" containerID="266388d151cb4dabc278b76a650356b47a63dbf25ffbd989958c3dba67c53ad1" Apr 20 19:28:48.266685 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:48.266540 2564 scope.go:117] "RemoveContainer" containerID="f244721b8a15a643cb6bfda32a43e3a901fd2ecce16fdf49e2462a16d9356c84" Apr 20 19:28:48.266804 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:28:48.266733 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-pxsf7_opendatahub(6f1ce413-5690-4f7b-b12c-bf1b2d1805be)\"" pod="opendatahub/odh-model-controller-858dbf95b8-pxsf7" podUID="6f1ce413-5690-4f7b-b12c-bf1b2d1805be" Apr 20 19:28:49.270625 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:49.270592 2564 scope.go:117] "RemoveContainer" containerID="f244721b8a15a643cb6bfda32a43e3a901fd2ecce16fdf49e2462a16d9356c84" Apr 20 19:28:49.271033 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:28:49.270782 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-pxsf7_opendatahub(6f1ce413-5690-4f7b-b12c-bf1b2d1805be)\"" pod="opendatahub/odh-model-controller-858dbf95b8-pxsf7" podUID="6f1ce413-5690-4f7b-b12c-bf1b2d1805be" Apr 20 19:28:49.661466 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:49.661429 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-th4qn"] Apr 20 19:28:49.666175 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:49.666149 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-th4qn" Apr 20 19:28:49.669489 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:49.669455 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 20 19:28:49.670505 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:49.670484 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-m9jdk\"" Apr 20 19:28:49.670617 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:49.670595 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 20 19:28:49.695908 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:49.695885 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-th4qn"] Apr 20 19:28:49.792561 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:49.792530 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chsx8\" (UniqueName: \"kubernetes.io/projected/5fa1b0e7-2083-4b46-8aeb-8c6aea4aa0a6-kube-api-access-chsx8\") pod \"servicemesh-operator3-55f49c5f94-th4qn\" (UID: \"5fa1b0e7-2083-4b46-8aeb-8c6aea4aa0a6\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-th4qn" Apr 20 19:28:49.792709 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:49.792567 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/5fa1b0e7-2083-4b46-8aeb-8c6aea4aa0a6-operator-config\") pod \"servicemesh-operator3-55f49c5f94-th4qn\" (UID: \"5fa1b0e7-2083-4b46-8aeb-8c6aea4aa0a6\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-th4qn" Apr 20 19:28:49.893253 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:49.893223 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-chsx8\" (UniqueName: \"kubernetes.io/projected/5fa1b0e7-2083-4b46-8aeb-8c6aea4aa0a6-kube-api-access-chsx8\") pod \"servicemesh-operator3-55f49c5f94-th4qn\" (UID: \"5fa1b0e7-2083-4b46-8aeb-8c6aea4aa0a6\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-th4qn" Apr 20 19:28:49.893363 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:49.893262 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/5fa1b0e7-2083-4b46-8aeb-8c6aea4aa0a6-operator-config\") pod \"servicemesh-operator3-55f49c5f94-th4qn\" (UID: \"5fa1b0e7-2083-4b46-8aeb-8c6aea4aa0a6\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-th4qn" Apr 20 19:28:49.895683 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:49.895653 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/5fa1b0e7-2083-4b46-8aeb-8c6aea4aa0a6-operator-config\") pod \"servicemesh-operator3-55f49c5f94-th4qn\" (UID: \"5fa1b0e7-2083-4b46-8aeb-8c6aea4aa0a6\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-th4qn" Apr 20 19:28:49.905562 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:49.905540 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-chsx8\" (UniqueName: \"kubernetes.io/projected/5fa1b0e7-2083-4b46-8aeb-8c6aea4aa0a6-kube-api-access-chsx8\") pod \"servicemesh-operator3-55f49c5f94-th4qn\" (UID: 
\"5fa1b0e7-2083-4b46-8aeb-8c6aea4aa0a6\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-th4qn" Apr 20 19:28:49.975121 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:49.975069 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-th4qn" Apr 20 19:28:50.120914 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:50.120819 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-th4qn"] Apr 20 19:28:50.123085 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:28:50.123058 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fa1b0e7_2083_4b46_8aeb_8c6aea4aa0a6.slice/crio-0c7d2bb0f6e21ede3fc16548543594146d59931c4b941d85c02f796ecab1f168 WatchSource:0}: Error finding container 0c7d2bb0f6e21ede3fc16548543594146d59931c4b941d85c02f796ecab1f168: Status 404 returned error can't find the container with id 0c7d2bb0f6e21ede3fc16548543594146d59931c4b941d85c02f796ecab1f168 Apr 20 19:28:50.275208 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:50.275140 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-th4qn" event={"ID":"5fa1b0e7-2083-4b46-8aeb-8c6aea4aa0a6","Type":"ContainerStarted","Data":"0c7d2bb0f6e21ede3fc16548543594146d59931c4b941d85c02f796ecab1f168"} Apr 20 19:28:50.312284 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:50.312258 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-mtf2s"] Apr 20 19:28:50.316648 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:50.316623 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-mtf2s" Apr 20 19:28:50.320197 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:50.320175 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-pwcbw\"" Apr 20 19:28:50.320440 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:50.320397 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 20 19:28:50.331840 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:50.331818 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-mtf2s"] Apr 20 19:28:50.397299 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:50.397273 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9b6fe48f-a6fd-414a-bc81-d38839becb12-cert\") pod \"kserve-controller-manager-856948b99f-mtf2s\" (UID: \"9b6fe48f-a6fd-414a-bc81-d38839becb12\") " pod="opendatahub/kserve-controller-manager-856948b99f-mtf2s" Apr 20 19:28:50.397410 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:50.397324 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbcm8\" (UniqueName: \"kubernetes.io/projected/9b6fe48f-a6fd-414a-bc81-d38839becb12-kube-api-access-sbcm8\") pod \"kserve-controller-manager-856948b99f-mtf2s\" (UID: \"9b6fe48f-a6fd-414a-bc81-d38839becb12\") " pod="opendatahub/kserve-controller-manager-856948b99f-mtf2s" Apr 20 19:28:50.498276 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:50.498249 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"cert\" (UniqueName: \"kubernetes.io/secret/9b6fe48f-a6fd-414a-bc81-d38839becb12-cert\") pod \"kserve-controller-manager-856948b99f-mtf2s\" (UID: \"9b6fe48f-a6fd-414a-bc81-d38839becb12\") " pod="opendatahub/kserve-controller-manager-856948b99f-mtf2s" Apr 20 19:28:50.498373 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:50.498316 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbcm8\" (UniqueName: \"kubernetes.io/projected/9b6fe48f-a6fd-414a-bc81-d38839becb12-kube-api-access-sbcm8\") pod \"kserve-controller-manager-856948b99f-mtf2s\" (UID: \"9b6fe48f-a6fd-414a-bc81-d38839becb12\") " pod="opendatahub/kserve-controller-manager-856948b99f-mtf2s" Apr 20 19:28:50.498418 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:28:50.498387 2564 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found Apr 20 19:28:50.498456 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:28:50.498442 2564 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b6fe48f-a6fd-414a-bc81-d38839becb12-cert podName:9b6fe48f-a6fd-414a-bc81-d38839becb12 nodeName:}" failed. No retries permitted until 2026-04-20 19:28:50.998428875 +0000 UTC m=+494.618111140 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/9b6fe48f-a6fd-414a-bc81-d38839becb12-cert") pod "kserve-controller-manager-856948b99f-mtf2s" (UID: "9b6fe48f-a6fd-414a-bc81-d38839becb12") : secret "kserve-webhook-server-cert" not found Apr 20 19:28:50.523473 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:50.523450 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbcm8\" (UniqueName: \"kubernetes.io/projected/9b6fe48f-a6fd-414a-bc81-d38839becb12-kube-api-access-sbcm8\") pod \"kserve-controller-manager-856948b99f-mtf2s\" (UID: \"9b6fe48f-a6fd-414a-bc81-d38839becb12\") " pod="opendatahub/kserve-controller-manager-856948b99f-mtf2s" Apr 20 19:28:51.002489 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:51.002424 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9b6fe48f-a6fd-414a-bc81-d38839becb12-cert\") pod \"kserve-controller-manager-856948b99f-mtf2s\" (UID: \"9b6fe48f-a6fd-414a-bc81-d38839becb12\") " pod="opendatahub/kserve-controller-manager-856948b99f-mtf2s" Apr 20 19:28:51.005372 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:51.005336 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9b6fe48f-a6fd-414a-bc81-d38839becb12-cert\") pod \"kserve-controller-manager-856948b99f-mtf2s\" (UID: \"9b6fe48f-a6fd-414a-bc81-d38839becb12\") " pod="opendatahub/kserve-controller-manager-856948b99f-mtf2s" Apr 20 19:28:51.229500 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:51.229451 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-mtf2s" Apr 20 19:28:51.378851 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:51.378814 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-mtf2s"] Apr 20 19:28:52.283908 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:52.283872 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-mtf2s" event={"ID":"9b6fe48f-a6fd-414a-bc81-d38839becb12","Type":"ContainerStarted","Data":"a590a80b9f1e31669e779516a6889a646900f7b5ad5eeaf697b98546a2f0ff1b"} Apr 20 19:28:53.205674 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:53.205637 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-pxsf7" Apr 20 19:28:53.206045 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:53.206026 2564 scope.go:117] "RemoveContainer" containerID="f244721b8a15a643cb6bfda32a43e3a901fd2ecce16fdf49e2462a16d9356c84" Apr 20 19:28:53.206200 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:28:53.206184 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-pxsf7_opendatahub(6f1ce413-5690-4f7b-b12c-bf1b2d1805be)\"" pod="opendatahub/odh-model-controller-858dbf95b8-pxsf7" podUID="6f1ce413-5690-4f7b-b12c-bf1b2d1805be" Apr 20 19:28:53.289128 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:53.289094 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-th4qn" event={"ID":"5fa1b0e7-2083-4b46-8aeb-8c6aea4aa0a6","Type":"ContainerStarted","Data":"eacca44f2a098cf573dc60a1fa0467d70bfaf3c69f9b239a0695bb058f416ea1"} Apr 20 19:28:53.289278 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:53.289227 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-th4qn" Apr 20 19:28:53.314170 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:53.314111 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-th4qn" podStartSLOduration=1.9123963000000002 podStartE2EDuration="4.314092227s" podCreationTimestamp="2026-04-20 19:28:49 +0000 UTC" firstStartedPulling="2026-04-20 19:28:50.126082842 +0000 UTC m=+493.745765112" lastFinishedPulling="2026-04-20 19:28:52.527778771 +0000 UTC m=+496.147461039" observedRunningTime="2026-04-20 19:28:53.31241298 +0000 UTC m=+496.932095271" watchObservedRunningTime="2026-04-20 19:28:53.314092227 +0000 UTC m=+496.933774517" Apr 20 19:28:55.297253 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:55.297220 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-mtf2s" event={"ID":"9b6fe48f-a6fd-414a-bc81-d38839becb12","Type":"ContainerStarted","Data":"b0460947d7ca0374aca6ac0c2fd8da6bbab807244f658d6ee4f68f86d10e1b16"} Apr 20 19:28:55.297619 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:55.297289 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-mtf2s" Apr 20 19:28:55.314027 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:28:55.313957 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-mtf2s" 
podStartSLOduration=2.164299555 podStartE2EDuration="5.31393648s" podCreationTimestamp="2026-04-20 19:28:50 +0000 UTC" firstStartedPulling="2026-04-20 19:28:51.384829204 +0000 UTC m=+495.004511469" lastFinishedPulling="2026-04-20 19:28:54.534466129 +0000 UTC m=+498.154148394" observedRunningTime="2026-04-20 19:28:55.313572057 +0000 UTC m=+498.933254344" watchObservedRunningTime="2026-04-20 19:28:55.31393648 +0000 UTC m=+498.933618767" Apr 20 19:29:03.205634 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:03.205597 2564 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-pxsf7" Apr 20 19:29:03.205983 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:03.205945 2564 scope.go:117] "RemoveContainer" containerID="f244721b8a15a643cb6bfda32a43e3a901fd2ecce16fdf49e2462a16d9356c84" Apr 20 19:29:04.295594 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:04.295561 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-th4qn" Apr 20 19:29:04.328083 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:04.328041 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-pxsf7" event={"ID":"6f1ce413-5690-4f7b-b12c-bf1b2d1805be","Type":"ContainerStarted","Data":"edd5e8f421d1a64392df4d18566f6bf18194e59271e279226af21053cfd822c1"} Apr 20 19:29:04.328396 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:04.328378 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-pxsf7" Apr 20 19:29:04.349390 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:04.349315 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-pxsf7" podStartSLOduration=2.500422652 podStartE2EDuration="22.349297565s" podCreationTimestamp="2026-04-20 19:28:42 +0000 UTC" firstStartedPulling="2026-04-20 19:28:43.615516012 +0000 UTC m=+487.235198279" lastFinishedPulling="2026-04-20 19:29:03.464390927 +0000 UTC m=+507.084073192" observedRunningTime="2026-04-20 19:29:04.347792458 +0000 UTC m=+507.967474747" watchObservedRunningTime="2026-04-20 19:29:04.349297565 +0000 UTC m=+507.968979852" Apr 20 19:29:13.951465 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:13.951428 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8"] Apr 20 19:29:13.954822 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:13.954803 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" Apr 20 19:29:13.957486 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:13.957443 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 20 19:29:13.957486 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:13.957453 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 20 19:29:13.957659 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:13.957467 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 20 19:29:13.957711 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:13.957696 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 20 19:29:13.957852 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:13.957835 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-zczvk\"" Apr 20 19:29:13.969817 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:13.969799 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8"] Apr 20 19:29:14.075292 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:14.075265 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/4afa2c06-bd15-4b40-8ad0-bf2401e8b782-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-xv4n8\" (UID: \"4afa2c06-bd15-4b40-8ad0-bf2401e8b782\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" Apr 20 19:29:14.075428 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:14.075313 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/4afa2c06-bd15-4b40-8ad0-bf2401e8b782-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-xv4n8\" (UID: \"4afa2c06-bd15-4b40-8ad0-bf2401e8b782\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" Apr 20 19:29:14.075428 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:14.075336 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4afa2c06-bd15-4b40-8ad0-bf2401e8b782-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-xv4n8\" (UID: \"4afa2c06-bd15-4b40-8ad0-bf2401e8b782\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" Apr 20 19:29:14.075428 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:14.075357 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/4afa2c06-bd15-4b40-8ad0-bf2401e8b782-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-xv4n8\" (UID: \"4afa2c06-bd15-4b40-8ad0-bf2401e8b782\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" Apr 20 19:29:14.075428 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:14.075390 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/4afa2c06-bd15-4b40-8ad0-bf2401e8b782-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-xv4n8\" (UID: 
\"4afa2c06-bd15-4b40-8ad0-bf2401e8b782\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" Apr 20 19:29:14.075428 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:14.075421 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/4afa2c06-bd15-4b40-8ad0-bf2401e8b782-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-xv4n8\" (UID: \"4afa2c06-bd15-4b40-8ad0-bf2401e8b782\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" Apr 20 19:29:14.075580 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:14.075439 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qrt6\" (UniqueName: \"kubernetes.io/projected/4afa2c06-bd15-4b40-8ad0-bf2401e8b782-kube-api-access-2qrt6\") pod \"istiod-openshift-gateway-55ff986f96-xv4n8\" (UID: \"4afa2c06-bd15-4b40-8ad0-bf2401e8b782\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" Apr 20 19:29:14.176883 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:14.176840 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/4afa2c06-bd15-4b40-8ad0-bf2401e8b782-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-xv4n8\" (UID: \"4afa2c06-bd15-4b40-8ad0-bf2401e8b782\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" Apr 20 19:29:14.177062 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:14.176910 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/4afa2c06-bd15-4b40-8ad0-bf2401e8b782-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-xv4n8\" (UID: \"4afa2c06-bd15-4b40-8ad0-bf2401e8b782\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" Apr 20 19:29:14.177062 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:14.176943 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2qrt6\" (UniqueName: \"kubernetes.io/projected/4afa2c06-bd15-4b40-8ad0-bf2401e8b782-kube-api-access-2qrt6\") pod \"istiod-openshift-gateway-55ff986f96-xv4n8\" (UID: \"4afa2c06-bd15-4b40-8ad0-bf2401e8b782\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" Apr 20 19:29:14.177062 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:14.177031 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/4afa2c06-bd15-4b40-8ad0-bf2401e8b782-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-xv4n8\" (UID: \"4afa2c06-bd15-4b40-8ad0-bf2401e8b782\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" Apr 20 19:29:14.177234 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:14.177080 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/4afa2c06-bd15-4b40-8ad0-bf2401e8b782-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-xv4n8\" (UID: \"4afa2c06-bd15-4b40-8ad0-bf2401e8b782\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" Apr 20 19:29:14.177234 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:14.177107 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4afa2c06-bd15-4b40-8ad0-bf2401e8b782-istio-kubeconfig\") pod 
\"istiod-openshift-gateway-55ff986f96-xv4n8\" (UID: \"4afa2c06-bd15-4b40-8ad0-bf2401e8b782\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" Apr 20 19:29:14.177234 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:14.177135 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/4afa2c06-bd15-4b40-8ad0-bf2401e8b782-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-xv4n8\" (UID: \"4afa2c06-bd15-4b40-8ad0-bf2401e8b782\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" Apr 20 19:29:14.177629 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:14.177593 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/4afa2c06-bd15-4b40-8ad0-bf2401e8b782-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-xv4n8\" (UID: \"4afa2c06-bd15-4b40-8ad0-bf2401e8b782\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" Apr 20 19:29:14.179273 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:14.179248 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/4afa2c06-bd15-4b40-8ad0-bf2401e8b782-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-xv4n8\" (UID: \"4afa2c06-bd15-4b40-8ad0-bf2401e8b782\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" Apr 20 19:29:14.179479 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:14.179459 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/4afa2c06-bd15-4b40-8ad0-bf2401e8b782-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-xv4n8\" (UID: \"4afa2c06-bd15-4b40-8ad0-bf2401e8b782\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" Apr 20 19:29:14.179553 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:14.179519 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/4afa2c06-bd15-4b40-8ad0-bf2401e8b782-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-xv4n8\" (UID: \"4afa2c06-bd15-4b40-8ad0-bf2401e8b782\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" Apr 20 19:29:14.179639 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:14.179621 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/4afa2c06-bd15-4b40-8ad0-bf2401e8b782-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-xv4n8\" (UID: \"4afa2c06-bd15-4b40-8ad0-bf2401e8b782\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" Apr 20 19:29:14.185387 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:14.185362 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/4afa2c06-bd15-4b40-8ad0-bf2401e8b782-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-xv4n8\" (UID: \"4afa2c06-bd15-4b40-8ad0-bf2401e8b782\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" Apr 20 19:29:14.185474 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:14.185459 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qrt6\" (UniqueName: \"kubernetes.io/projected/4afa2c06-bd15-4b40-8ad0-bf2401e8b782-kube-api-access-2qrt6\") pod \"istiod-openshift-gateway-55ff986f96-xv4n8\" (UID: 
\"4afa2c06-bd15-4b40-8ad0-bf2401e8b782\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" Apr 20 19:29:14.264747 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:14.264680 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" Apr 20 19:29:14.391400 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:14.391372 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8"] Apr 20 19:29:14.394506 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:29:14.394476 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4afa2c06_bd15_4b40_8ad0_bf2401e8b782.slice/crio-bb31f2ef2b98af7ac45149e2ee4a1d17708e0766f24f670eef0d89929984d614 WatchSource:0}: Error finding container bb31f2ef2b98af7ac45149e2ee4a1d17708e0766f24f670eef0d89929984d614: Status 404 returned error can't find the container with id bb31f2ef2b98af7ac45149e2ee4a1d17708e0766f24f670eef0d89929984d614 Apr 20 19:29:15.334225 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:15.334191 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-pxsf7" Apr 20 19:29:15.365653 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:15.365610 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" event={"ID":"4afa2c06-bd15-4b40-8ad0-bf2401e8b782","Type":"ContainerStarted","Data":"bb31f2ef2b98af7ac45149e2ee4a1d17708e0766f24f670eef0d89929984d614"} Apr 20 19:29:18.564437 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:18.564399 2564 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 20 19:29:18.564828 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:18.564478 2564 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236220Ki","pods":"250"} Apr 20 19:29:19.382310 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:19.382276 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" event={"ID":"4afa2c06-bd15-4b40-8ad0-bf2401e8b782","Type":"ContainerStarted","Data":"68494ea9023319dbc8259f99febb4bffa8c41a074723eb768382a2431822d292"} Apr 20 19:29:19.382505 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:19.382451 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" Apr 20 19:29:19.383797 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:19.383762 2564 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-xv4n8 container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 20 19:29:19.383895 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:19.383819 2564 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" podUID="4afa2c06-bd15-4b40-8ad0-bf2401e8b782" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 19:29:19.403534 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:19.403494 2564 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" podStartSLOduration=2.235701131 podStartE2EDuration="6.403480555s" podCreationTimestamp="2026-04-20 19:29:13 +0000 UTC" firstStartedPulling="2026-04-20 19:29:14.396430767 +0000 UTC m=+518.016113031" lastFinishedPulling="2026-04-20 19:29:18.564210188 +0000 UTC m=+522.183892455" observedRunningTime="2026-04-20 19:29:19.401855986 +0000 UTC m=+523.021538272" watchObservedRunningTime="2026-04-20 19:29:19.403480555 +0000 UTC m=+523.023162843" Apr 20 19:29:20.386516 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:20.386484 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-xv4n8" Apr 20 19:29:26.306317 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:26.306241 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-mtf2s" Apr 20 19:29:39.330233 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:39.330200 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-c5b24"] Apr 20 19:29:39.333235 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:39.333215 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-c5b24" Apr 20 19:29:39.335681 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:39.335652 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 19:29:39.335681 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:39.335672 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 19:29:39.336516 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:39.336494 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-s8bfx\"" Apr 20 19:29:39.340285 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:39.340259 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-c5b24"] Apr 20 19:29:39.348486 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:39.348461 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s68xd\" (UniqueName: \"kubernetes.io/projected/4142cec9-8159-4d51-9a2a-b6d3fbcf0b33-kube-api-access-s68xd\") pod \"kuadrant-operator-catalog-c5b24\" (UID: \"4142cec9-8159-4d51-9a2a-b6d3fbcf0b33\") " pod="kuadrant-system/kuadrant-operator-catalog-c5b24" Apr 20 19:29:39.449161 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:39.449129 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s68xd\" (UniqueName: \"kubernetes.io/projected/4142cec9-8159-4d51-9a2a-b6d3fbcf0b33-kube-api-access-s68xd\") pod \"kuadrant-operator-catalog-c5b24\" (UID: \"4142cec9-8159-4d51-9a2a-b6d3fbcf0b33\") " pod="kuadrant-system/kuadrant-operator-catalog-c5b24" Apr 20 19:29:39.457395 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:39.457369 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s68xd\" (UniqueName: \"kubernetes.io/projected/4142cec9-8159-4d51-9a2a-b6d3fbcf0b33-kube-api-access-s68xd\") pod \"kuadrant-operator-catalog-c5b24\" (UID: \"4142cec9-8159-4d51-9a2a-b6d3fbcf0b33\") " pod="kuadrant-system/kuadrant-operator-catalog-c5b24" Apr 20 19:29:39.643538 ip-10-0-129-98 
kubenswrapper[2564]: I0420 19:29:39.643452 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-c5b24" Apr 20 19:29:39.699482 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:39.699449 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-c5b24"] Apr 20 19:29:39.762697 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:39.762612 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-c5b24"] Apr 20 19:29:39.764956 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:29:39.764921 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4142cec9_8159_4d51_9a2a_b6d3fbcf0b33.slice/crio-521d103fc2c7e0d6f13e6402c29aec84c503f34ffe314432e56d3601b4973675 WatchSource:0}: Error finding container 521d103fc2c7e0d6f13e6402c29aec84c503f34ffe314432e56d3601b4973675: Status 404 returned error can't find the container with id 521d103fc2c7e0d6f13e6402c29aec84c503f34ffe314432e56d3601b4973675 Apr 20 19:29:40.449842 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:40.449811 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-c5b24" event={"ID":"4142cec9-8159-4d51-9a2a-b6d3fbcf0b33","Type":"ContainerStarted","Data":"521d103fc2c7e0d6f13e6402c29aec84c503f34ffe314432e56d3601b4973675"} Apr 20 19:29:42.458254 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:42.458219 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-c5b24" event={"ID":"4142cec9-8159-4d51-9a2a-b6d3fbcf0b33","Type":"ContainerStarted","Data":"bba89e7f53b7d1407da5797b7673b5380f3dd2c5ceb03f2ea69124806c313c59"} Apr 20 19:29:42.458656 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:42.458342 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-catalog-c5b24" podUID="4142cec9-8159-4d51-9a2a-b6d3fbcf0b33" containerName="registry-server" containerID="cri-o://bba89e7f53b7d1407da5797b7673b5380f3dd2c5ceb03f2ea69124806c313c59" gracePeriod=2 Apr 20 19:29:42.472852 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:42.472807 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-c5b24" podStartSLOduration=1.302819742 podStartE2EDuration="3.472790458s" podCreationTimestamp="2026-04-20 19:29:39 +0000 UTC" firstStartedPulling="2026-04-20 19:29:39.766793602 +0000 UTC m=+543.386475867" lastFinishedPulling="2026-04-20 19:29:41.936764319 +0000 UTC m=+545.556446583" observedRunningTime="2026-04-20 19:29:42.471959541 +0000 UTC m=+546.091641865" watchObservedRunningTime="2026-04-20 19:29:42.472790458 +0000 UTC m=+546.092472748" Apr 20 19:29:42.694069 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:42.694040 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-c5b24" Apr 20 19:29:42.774891 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:42.774807 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s68xd\" (UniqueName: \"kubernetes.io/projected/4142cec9-8159-4d51-9a2a-b6d3fbcf0b33-kube-api-access-s68xd\") pod \"4142cec9-8159-4d51-9a2a-b6d3fbcf0b33\" (UID: \"4142cec9-8159-4d51-9a2a-b6d3fbcf0b33\") " Apr 20 19:29:42.777101 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:42.777072 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4142cec9-8159-4d51-9a2a-b6d3fbcf0b33-kube-api-access-s68xd" (OuterVolumeSpecName: "kube-api-access-s68xd") pod "4142cec9-8159-4d51-9a2a-b6d3fbcf0b33" (UID: "4142cec9-8159-4d51-9a2a-b6d3fbcf0b33"). InnerVolumeSpecName "kube-api-access-s68xd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:29:42.876249 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:42.876216 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-s68xd\" (UniqueName: \"kubernetes.io/projected/4142cec9-8159-4d51-9a2a-b6d3fbcf0b33-kube-api-access-s68xd\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 20 19:29:43.461842 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:43.461805 2564 generic.go:358] "Generic (PLEG): container finished" podID="4142cec9-8159-4d51-9a2a-b6d3fbcf0b33" containerID="bba89e7f53b7d1407da5797b7673b5380f3dd2c5ceb03f2ea69124806c313c59" exitCode=0 Apr 20 19:29:43.462267 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:43.461889 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-c5b24" Apr 20 19:29:43.462267 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:43.461888 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-c5b24" event={"ID":"4142cec9-8159-4d51-9a2a-b6d3fbcf0b33","Type":"ContainerDied","Data":"bba89e7f53b7d1407da5797b7673b5380f3dd2c5ceb03f2ea69124806c313c59"} Apr 20 19:29:43.462267 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:43.461930 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-c5b24" event={"ID":"4142cec9-8159-4d51-9a2a-b6d3fbcf0b33","Type":"ContainerDied","Data":"521d103fc2c7e0d6f13e6402c29aec84c503f34ffe314432e56d3601b4973675"} Apr 20 19:29:43.462267 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:43.461945 2564 scope.go:117] "RemoveContainer" containerID="bba89e7f53b7d1407da5797b7673b5380f3dd2c5ceb03f2ea69124806c313c59" Apr 20 19:29:43.470586 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:43.470566 2564 scope.go:117] "RemoveContainer" containerID="bba89e7f53b7d1407da5797b7673b5380f3dd2c5ceb03f2ea69124806c313c59" Apr 20 19:29:43.470828 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:29:43.470811 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bba89e7f53b7d1407da5797b7673b5380f3dd2c5ceb03f2ea69124806c313c59\": container with ID starting with bba89e7f53b7d1407da5797b7673b5380f3dd2c5ceb03f2ea69124806c313c59 not found: ID does not exist" containerID="bba89e7f53b7d1407da5797b7673b5380f3dd2c5ceb03f2ea69124806c313c59" Apr 20 19:29:43.470872 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:43.470837 2564 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bba89e7f53b7d1407da5797b7673b5380f3dd2c5ceb03f2ea69124806c313c59"} err="failed to get container status \"bba89e7f53b7d1407da5797b7673b5380f3dd2c5ceb03f2ea69124806c313c59\": rpc error: code = NotFound desc = could not find container \"bba89e7f53b7d1407da5797b7673b5380f3dd2c5ceb03f2ea69124806c313c59\": container with ID starting with bba89e7f53b7d1407da5797b7673b5380f3dd2c5ceb03f2ea69124806c313c59 not found: ID does not exist" Apr 20 19:29:43.476528 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:43.476507 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-c5b24"] Apr 20 19:29:43.479811 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:43.479789 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-c5b24"] Apr 20 19:29:44.885287 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:29:44.885253 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4142cec9-8159-4d51-9a2a-b6d3fbcf0b33" path="/var/lib/kubelet/pods/4142cec9-8159-4d51-9a2a-b6d3fbcf0b33/volumes" Apr 20 19:30:10.731689 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:10.731649 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-kxn6j"] Apr 20 19:30:10.732167 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:10.731930 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4142cec9-8159-4d51-9a2a-b6d3fbcf0b33" containerName="registry-server" Apr 20 19:30:10.732167 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:10.731942 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="4142cec9-8159-4d51-9a2a-b6d3fbcf0b33" containerName="registry-server" Apr 20 19:30:10.732167 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:10.732014 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="4142cec9-8159-4d51-9a2a-b6d3fbcf0b33" containerName="registry-server" Apr 20 19:30:10.734892 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:10.734875 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-kxn6j" Apr 20 19:30:10.739468 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:10.739447 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-xc42s\"" Apr 20 19:30:10.739468 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:10.739456 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 19:30:10.740342 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:10.740323 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 20 19:30:10.740435 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:10.740343 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 19:30:10.746798 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:10.746776 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-kxn6j"] Apr 20 19:30:10.782037 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:10.781980 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4ggl\" (UniqueName: \"kubernetes.io/projected/304440ae-0748-4cd1-864a-2fecc74a4a70-kube-api-access-l4ggl\") pod \"dns-operator-controller-manager-648d5c98bc-kxn6j\" (UID: \"304440ae-0748-4cd1-864a-2fecc74a4a70\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-kxn6j" Apr 20 19:30:10.882596 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:10.882562 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l4ggl\" (UniqueName: \"kubernetes.io/projected/304440ae-0748-4cd1-864a-2fecc74a4a70-kube-api-access-l4ggl\") pod \"dns-operator-controller-manager-648d5c98bc-kxn6j\" (UID: \"304440ae-0748-4cd1-864a-2fecc74a4a70\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-kxn6j" Apr 20 19:30:10.894468 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:10.894435 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l4ggl\" (UniqueName: \"kubernetes.io/projected/304440ae-0748-4cd1-864a-2fecc74a4a70-kube-api-access-l4ggl\") pod \"dns-operator-controller-manager-648d5c98bc-kxn6j\" (UID: \"304440ae-0748-4cd1-864a-2fecc74a4a70\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-kxn6j" Apr 20 19:30:11.044486 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:11.044457 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-kxn6j" Apr 20 19:30:11.160915 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:11.160849 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-kxn6j"] Apr 20 19:30:11.163806 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:30:11.163778 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod304440ae_0748_4cd1_864a_2fecc74a4a70.slice/crio-d7fb7cc7ab1e1519d43ad6d3d2e2b14e23bd3187916821a1de4a1caad2965b9a WatchSource:0}: Error finding container d7fb7cc7ab1e1519d43ad6d3d2e2b14e23bd3187916821a1de4a1caad2965b9a: Status 404 returned error can't find the container with id d7fb7cc7ab1e1519d43ad6d3d2e2b14e23bd3187916821a1de4a1caad2965b9a Apr 20 19:30:11.549113 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:11.549078 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-kxn6j" event={"ID":"304440ae-0748-4cd1-864a-2fecc74a4a70","Type":"ContainerStarted","Data":"d7fb7cc7ab1e1519d43ad6d3d2e2b14e23bd3187916821a1de4a1caad2965b9a"} Apr 20 19:30:14.560658 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:14.560622 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-kxn6j" event={"ID":"304440ae-0748-4cd1-864a-2fecc74a4a70","Type":"ContainerStarted","Data":"3896f173f26442fe2941baf143d5d023c04612ccc4fe5bebc46cd839e8f2d110"} Apr 20 19:30:14.561040 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:14.560682 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-kxn6j" Apr 20 19:30:14.578900 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:14.578855 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-kxn6j" podStartSLOduration=2.073895607 podStartE2EDuration="4.578842012s" podCreationTimestamp="2026-04-20 19:30:10 +0000 UTC" firstStartedPulling="2026-04-20 19:30:11.166165232 +0000 UTC m=+574.785847496" lastFinishedPulling="2026-04-20 19:30:13.671111631 +0000 UTC m=+577.290793901" observedRunningTime="2026-04-20 19:30:14.577270051 +0000 UTC m=+578.196952335" watchObservedRunningTime="2026-04-20 19:30:14.578842012 +0000 UTC m=+578.198524299" Apr 20 19:30:14.843814 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:14.843751 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-jq4p7"] Apr 20 19:30:14.847127 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:14.847109 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-jq4p7" Apr 20 19:30:14.850112 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:14.850094 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-t8sfq\"" Apr 20 19:30:14.866577 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:14.866549 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-jq4p7"] Apr 20 19:30:14.913874 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:14.913848 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n7k5w\" (UniqueName: \"kubernetes.io/projected/fba780a4-2deb-431d-8bad-3c9f2c0a3b5c-kube-api-access-n7k5w\") pod \"authorino-operator-657f44b778-jq4p7\" (UID: \"fba780a4-2deb-431d-8bad-3c9f2c0a3b5c\") " pod="kuadrant-system/authorino-operator-657f44b778-jq4p7" Apr 20 19:30:15.015276 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:15.015249 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n7k5w\" (UniqueName: \"kubernetes.io/projected/fba780a4-2deb-431d-8bad-3c9f2c0a3b5c-kube-api-access-n7k5w\") pod \"authorino-operator-657f44b778-jq4p7\" (UID: \"fba780a4-2deb-431d-8bad-3c9f2c0a3b5c\") " pod="kuadrant-system/authorino-operator-657f44b778-jq4p7" Apr 20 19:30:15.035395 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:15.035373 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n7k5w\" (UniqueName: \"kubernetes.io/projected/fba780a4-2deb-431d-8bad-3c9f2c0a3b5c-kube-api-access-n7k5w\") pod \"authorino-operator-657f44b778-jq4p7\" (UID: \"fba780a4-2deb-431d-8bad-3c9f2c0a3b5c\") " pod="kuadrant-system/authorino-operator-657f44b778-jq4p7" Apr 20 19:30:15.157385 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:15.157320 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-jq4p7" Apr 20 19:30:15.293004 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:15.292859 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-jq4p7"] Apr 20 19:30:15.295547 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:30:15.295518 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfba780a4_2deb_431d_8bad_3c9f2c0a3b5c.slice/crio-430b9c36a40817d17a69f4e77ed499bb46e5e4a8e2dca885eb66b7e06d4d112e WatchSource:0}: Error finding container 430b9c36a40817d17a69f4e77ed499bb46e5e4a8e2dca885eb66b7e06d4d112e: Status 404 returned error can't find the container with id 430b9c36a40817d17a69f4e77ed499bb46e5e4a8e2dca885eb66b7e06d4d112e Apr 20 19:30:15.564368 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:15.564336 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-jq4p7" event={"ID":"fba780a4-2deb-431d-8bad-3c9f2c0a3b5c","Type":"ContainerStarted","Data":"430b9c36a40817d17a69f4e77ed499bb46e5e4a8e2dca885eb66b7e06d4d112e"} Apr 20 19:30:17.571089 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:17.571053 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-jq4p7" event={"ID":"fba780a4-2deb-431d-8bad-3c9f2c0a3b5c","Type":"ContainerStarted","Data":"22ca9f498908cb9b9eed67c7a81175462ef4ef26424240427a0c351d80c367d7"} Apr 20 19:30:17.575696 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:17.571690 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-jq4p7" Apr 20 19:30:17.591298 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:17.591250 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-jq4p7" podStartSLOduration=2.033003554 podStartE2EDuration="3.591236494s" podCreationTimestamp="2026-04-20 19:30:14 +0000 UTC" firstStartedPulling="2026-04-20 19:30:15.297510983 +0000 UTC m=+578.917193251" lastFinishedPulling="2026-04-20 19:30:16.855743923 +0000 UTC m=+580.475426191" observedRunningTime="2026-04-20 19:30:17.590374417 +0000 UTC m=+581.210056707" watchObservedRunningTime="2026-04-20 19:30:17.591236494 +0000 UTC m=+581.210918780" Apr 20 19:30:22.712785 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:22.712744 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rvsr5"] Apr 20 19:30:22.716115 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:22.716089 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rvsr5" Apr 20 19:30:22.718632 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:22.718607 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-6jrbg\"" Apr 20 19:30:22.727212 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:22.727187 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rvsr5"] Apr 20 19:30:22.773734 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:22.773705 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j4x5h\" (UniqueName: \"kubernetes.io/projected/7580a483-cb2c-453f-8375-268383cdad11-kube-api-access-j4x5h\") pod \"limitador-operator-controller-manager-85c4996f8c-rvsr5\" (UID: \"7580a483-cb2c-453f-8375-268383cdad11\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rvsr5" Apr 20 19:30:22.874429 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:22.874402 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j4x5h\" (UniqueName: \"kubernetes.io/projected/7580a483-cb2c-453f-8375-268383cdad11-kube-api-access-j4x5h\") pod \"limitador-operator-controller-manager-85c4996f8c-rvsr5\" (UID: \"7580a483-cb2c-453f-8375-268383cdad11\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rvsr5" Apr 20 19:30:22.883290 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:22.883264 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j4x5h\" (UniqueName: \"kubernetes.io/projected/7580a483-cb2c-453f-8375-268383cdad11-kube-api-access-j4x5h\") pod \"limitador-operator-controller-manager-85c4996f8c-rvsr5\" (UID: \"7580a483-cb2c-453f-8375-268383cdad11\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rvsr5" Apr 20 19:30:23.026912 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:23.026849 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rvsr5" Apr 20 19:30:23.143348 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:23.143315 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rvsr5"] Apr 20 19:30:23.146951 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:30:23.146924 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7580a483_cb2c_453f_8375_268383cdad11.slice/crio-8cb2f38a805d6a8481a0b14ad1802b5812a04dc0d8ad39ced411c1123b21871c WatchSource:0}: Error finding container 8cb2f38a805d6a8481a0b14ad1802b5812a04dc0d8ad39ced411c1123b21871c: Status 404 returned error can't find the container with id 8cb2f38a805d6a8481a0b14ad1802b5812a04dc0d8ad39ced411c1123b21871c Apr 20 19:30:23.592747 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:23.592711 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rvsr5" event={"ID":"7580a483-cb2c-453f-8375-268383cdad11","Type":"ContainerStarted","Data":"8cb2f38a805d6a8481a0b14ad1802b5812a04dc0d8ad39ced411c1123b21871c"} Apr 20 19:30:25.566555 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:25.566529 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-kxn6j" Apr 20 19:30:25.600716 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:25.600682 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rvsr5" event={"ID":"7580a483-cb2c-453f-8375-268383cdad11","Type":"ContainerStarted","Data":"9bf97af2f1facf163804307f72018fff9931a69df5232affd2dc26c43654515a"} Apr 20 19:30:25.600895 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:25.600734 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rvsr5" Apr 20 19:30:25.626316 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:25.626259 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rvsr5" podStartSLOduration=1.662379922 podStartE2EDuration="3.626241395s" podCreationTimestamp="2026-04-20 19:30:22 +0000 UTC" firstStartedPulling="2026-04-20 19:30:23.14871067 +0000 UTC m=+586.768392934" lastFinishedPulling="2026-04-20 19:30:25.112572138 +0000 UTC m=+588.732254407" observedRunningTime="2026-04-20 19:30:25.624857877 +0000 UTC m=+589.244540163" watchObservedRunningTime="2026-04-20 19:30:25.626241395 +0000 UTC m=+589.245923681" Apr 20 19:30:28.576725 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:28.576688 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-jq4p7" Apr 20 19:30:36.606782 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:36.606752 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rvsr5" Apr 20 19:30:36.791102 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:36.790821 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5z9f_e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7/ovn-acl-logging/0.log" Apr 20 19:30:36.791102 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:36.790827 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5z9f_e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7/ovn-acl-logging/0.log" Apr 20 19:30:38.509201 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:38.509169 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rvsr5"] Apr 20 19:30:38.509591 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:38.509393 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rvsr5" podUID="7580a483-cb2c-453f-8375-268383cdad11" containerName="manager" containerID="cri-o://9bf97af2f1facf163804307f72018fff9931a69df5232affd2dc26c43654515a" gracePeriod=2 Apr 20 19:30:38.520218 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:38.520190 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rvsr5"] Apr 20 19:30:38.539698 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:38.539674 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-57q25"] Apr 20 19:30:38.539980 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:38.539967 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7580a483-cb2c-453f-8375-268383cdad11" containerName="manager" Apr 20 19:30:38.540046 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:38.539984 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="7580a483-cb2c-453f-8375-268383cdad11" containerName="manager" Apr 20 19:30:38.540082 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:38.540063 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="7580a483-cb2c-453f-8375-268383cdad11" containerName="manager" Apr 20 19:30:38.542870 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:38.542853 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-57q25" Apr 20 19:30:38.555912 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:38.555880 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-57q25"] Apr 20 19:30:38.646620 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:38.646588 2564 generic.go:358] "Generic (PLEG): container finished" podID="7580a483-cb2c-453f-8375-268383cdad11" containerID="9bf97af2f1facf163804307f72018fff9931a69df5232affd2dc26c43654515a" exitCode=0 Apr 20 19:30:38.686713 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:38.686688 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmxpb\" (UniqueName: \"kubernetes.io/projected/15c73260-2002-469a-8b4f-457a8253449f-kube-api-access-qmxpb\") pod \"limitador-operator-controller-manager-85c4996f8c-57q25\" (UID: \"15c73260-2002-469a-8b4f-457a8253449f\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-57q25" Apr 20 19:30:38.736288 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:38.736267 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rvsr5" Apr 20 19:30:38.738942 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:38.738917 2564 status_manager.go:895] "Failed to get status for pod" podUID="7580a483-cb2c-453f-8375-268383cdad11" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rvsr5" err="pods \"limitador-operator-controller-manager-85c4996f8c-rvsr5\" is forbidden: User \"system:node:ip-10-0-129-98.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-129-98.ec2.internal' and this object" Apr 20 19:30:38.788145 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:38.788067 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qmxpb\" (UniqueName: \"kubernetes.io/projected/15c73260-2002-469a-8b4f-457a8253449f-kube-api-access-qmxpb\") pod \"limitador-operator-controller-manager-85c4996f8c-57q25\" (UID: \"15c73260-2002-469a-8b4f-457a8253449f\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-57q25" Apr 20 19:30:38.806072 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:38.806042 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmxpb\" (UniqueName: \"kubernetes.io/projected/15c73260-2002-469a-8b4f-457a8253449f-kube-api-access-qmxpb\") pod \"limitador-operator-controller-manager-85c4996f8c-57q25\" (UID: \"15c73260-2002-469a-8b4f-457a8253449f\") " pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-57q25" Apr 20 19:30:38.888843 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:38.888825 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j4x5h\" (UniqueName: \"kubernetes.io/projected/7580a483-cb2c-453f-8375-268383cdad11-kube-api-access-j4x5h\") pod \"7580a483-cb2c-453f-8375-268383cdad11\" (UID: \"7580a483-cb2c-453f-8375-268383cdad11\") " Apr 20 19:30:38.890657 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:38.890632 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7580a483-cb2c-453f-8375-268383cdad11-kube-api-access-j4x5h" (OuterVolumeSpecName: "kube-api-access-j4x5h") pod "7580a483-cb2c-453f-8375-268383cdad11" (UID: "7580a483-cb2c-453f-8375-268383cdad11"). InnerVolumeSpecName "kube-api-access-j4x5h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:30:38.896775 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:38.896755 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-57q25" Apr 20 19:30:38.989601 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:38.989569 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j4x5h\" (UniqueName: \"kubernetes.io/projected/7580a483-cb2c-453f-8375-268383cdad11-kube-api-access-j4x5h\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 20 19:30:39.030848 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:39.030819 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-85c4996f8c-57q25"] Apr 20 19:30:39.034370 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:30:39.034340 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15c73260_2002_469a_8b4f_457a8253449f.slice/crio-3add924c654a40b03b24bc6c500f09ac061f18066ec9a2c8152c3bc4266d164d WatchSource:0}: Error finding container 3add924c654a40b03b24bc6c500f09ac061f18066ec9a2c8152c3bc4266d164d: Status 404 returned error can't find the container with id 3add924c654a40b03b24bc6c500f09ac061f18066ec9a2c8152c3bc4266d164d Apr 20 19:30:39.656027 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:39.655969 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-57q25" event={"ID":"15c73260-2002-469a-8b4f-457a8253449f","Type":"ContainerStarted","Data":"031b3778a78ec62279bd3b76fd220d771dd71f11e2396f613470ac7648720ab8"} Apr 20 19:30:39.656027 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:39.656031 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-57q25" event={"ID":"15c73260-2002-469a-8b4f-457a8253449f","Type":"ContainerStarted","Data":"3add924c654a40b03b24bc6c500f09ac061f18066ec9a2c8152c3bc4266d164d"} Apr 20 19:30:39.656503 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:39.656093 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-57q25" Apr 20 19:30:39.657163 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:39.657147 2564 scope.go:117] "RemoveContainer" containerID="9bf97af2f1facf163804307f72018fff9931a69df5232affd2dc26c43654515a" Apr 20 19:30:39.657242 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:39.657175 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-rvsr5" Apr 20 19:30:39.679735 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:39.679696 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-57q25" podStartSLOduration=1.679685275 podStartE2EDuration="1.679685275s" podCreationTimestamp="2026-04-20 19:30:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:30:39.677781557 +0000 UTC m=+603.297463869" watchObservedRunningTime="2026-04-20 19:30:39.679685275 +0000 UTC m=+603.299367562" Apr 20 19:30:40.886123 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:40.886090 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7580a483-cb2c-453f-8375-268383cdad11" path="/var/lib/kubelet/pods/7580a483-cb2c-453f-8375-268383cdad11/volumes" Apr 20 19:30:50.664055 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:30:50.664017 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-85c4996f8c-57q25" Apr 20 19:31:20.688138 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:20.688104 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-lkttm"] Apr 20 19:31:20.691378 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:20.691359 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-lkttm" Apr 20 19:31:20.694050 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:20.694024 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-psz6s\"" Apr 20 19:31:20.698546 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:20.698523 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-lkttm"] Apr 20 19:31:20.794235 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:20.794206 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g2dpv\" (UniqueName: \"kubernetes.io/projected/8182c804-0d5b-496f-83fe-7c368281ae0e-kube-api-access-g2dpv\") pod \"authorino-f99f4b5cd-lkttm\" (UID: \"8182c804-0d5b-496f-83fe-7c368281ae0e\") " pod="kuadrant-system/authorino-f99f4b5cd-lkttm" Apr 20 19:31:20.895329 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:20.895296 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g2dpv\" (UniqueName: \"kubernetes.io/projected/8182c804-0d5b-496f-83fe-7c368281ae0e-kube-api-access-g2dpv\") pod \"authorino-f99f4b5cd-lkttm\" (UID: \"8182c804-0d5b-496f-83fe-7c368281ae0e\") " pod="kuadrant-system/authorino-f99f4b5cd-lkttm" Apr 20 19:31:20.902731 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:20.902706 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g2dpv\" (UniqueName: \"kubernetes.io/projected/8182c804-0d5b-496f-83fe-7c368281ae0e-kube-api-access-g2dpv\") pod \"authorino-f99f4b5cd-lkttm\" (UID: \"8182c804-0d5b-496f-83fe-7c368281ae0e\") " pod="kuadrant-system/authorino-f99f4b5cd-lkttm" Apr 20 19:31:21.001957 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:21.001905 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-lkttm" Apr 20 19:31:21.118163 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:21.118133 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-lkttm"] Apr 20 19:31:21.121357 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:31:21.121311 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8182c804_0d5b_496f_83fe_7c368281ae0e.slice/crio-3a67ee43a45649e840a1e9e11a705702a52693ba05f1fa2b112ebc96561255e3 WatchSource:0}: Error finding container 3a67ee43a45649e840a1e9e11a705702a52693ba05f1fa2b112ebc96561255e3: Status 404 returned error can't find the container with id 3a67ee43a45649e840a1e9e11a705702a52693ba05f1fa2b112ebc96561255e3 Apr 20 19:31:21.797324 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:21.797272 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-lkttm" event={"ID":"8182c804-0d5b-496f-83fe-7c368281ae0e","Type":"ContainerStarted","Data":"3a67ee43a45649e840a1e9e11a705702a52693ba05f1fa2b112ebc96561255e3"} Apr 20 19:31:24.810195 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:24.810150 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-lkttm" event={"ID":"8182c804-0d5b-496f-83fe-7c368281ae0e","Type":"ContainerStarted","Data":"eca7621a2bf885ea3931e2c0f6284514a1d3c72ee60a717a7b33b2915b886263"} Apr 20 19:31:24.824303 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:24.824245 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-lkttm" podStartSLOduration=1.716461354 podStartE2EDuration="4.824226586s" podCreationTimestamp="2026-04-20 19:31:20 +0000 UTC" firstStartedPulling="2026-04-20 19:31:21.122851486 +0000 UTC m=+644.742533751" lastFinishedPulling="2026-04-20 19:31:24.230616716 +0000 UTC m=+647.850298983" observedRunningTime="2026-04-20 19:31:24.824188907 +0000 UTC m=+648.443871196" watchObservedRunningTime="2026-04-20 19:31:24.824226586 +0000 UTC m=+648.443908872" Apr 20 19:31:26.006475 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:26.006433 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-lkttm"] Apr 20 19:31:26.816956 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:26.816917 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-lkttm" podUID="8182c804-0d5b-496f-83fe-7c368281ae0e" containerName="authorino" containerID="cri-o://eca7621a2bf885ea3931e2c0f6284514a1d3c72ee60a717a7b33b2915b886263" gracePeriod=30 Apr 20 19:31:27.059361 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:27.059339 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-lkttm" Apr 20 19:31:27.141968 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:27.141906 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g2dpv\" (UniqueName: \"kubernetes.io/projected/8182c804-0d5b-496f-83fe-7c368281ae0e-kube-api-access-g2dpv\") pod \"8182c804-0d5b-496f-83fe-7c368281ae0e\" (UID: \"8182c804-0d5b-496f-83fe-7c368281ae0e\") " Apr 20 19:31:27.143798 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:27.143774 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8182c804-0d5b-496f-83fe-7c368281ae0e-kube-api-access-g2dpv" (OuterVolumeSpecName: "kube-api-access-g2dpv") pod "8182c804-0d5b-496f-83fe-7c368281ae0e" (UID: "8182c804-0d5b-496f-83fe-7c368281ae0e"). InnerVolumeSpecName "kube-api-access-g2dpv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:31:27.242552 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:27.242527 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g2dpv\" (UniqueName: \"kubernetes.io/projected/8182c804-0d5b-496f-83fe-7c368281ae0e-kube-api-access-g2dpv\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 20 19:31:27.822142 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:27.822104 2564 generic.go:358] "Generic (PLEG): container finished" podID="8182c804-0d5b-496f-83fe-7c368281ae0e" containerID="eca7621a2bf885ea3931e2c0f6284514a1d3c72ee60a717a7b33b2915b886263" exitCode=0 Apr 20 19:31:27.822279 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:27.822162 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-lkttm" Apr 20 19:31:27.822279 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:27.822191 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-lkttm" event={"ID":"8182c804-0d5b-496f-83fe-7c368281ae0e","Type":"ContainerDied","Data":"eca7621a2bf885ea3931e2c0f6284514a1d3c72ee60a717a7b33b2915b886263"} Apr 20 19:31:27.822279 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:27.822233 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-lkttm" event={"ID":"8182c804-0d5b-496f-83fe-7c368281ae0e","Type":"ContainerDied","Data":"3a67ee43a45649e840a1e9e11a705702a52693ba05f1fa2b112ebc96561255e3"} Apr 20 19:31:27.822279 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:27.822251 2564 scope.go:117] "RemoveContainer" containerID="eca7621a2bf885ea3931e2c0f6284514a1d3c72ee60a717a7b33b2915b886263" Apr 20 19:31:27.831522 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:27.831498 2564 scope.go:117] "RemoveContainer" containerID="eca7621a2bf885ea3931e2c0f6284514a1d3c72ee60a717a7b33b2915b886263" Apr 20 19:31:27.831834 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:31:27.831813 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eca7621a2bf885ea3931e2c0f6284514a1d3c72ee60a717a7b33b2915b886263\": container with ID starting with eca7621a2bf885ea3931e2c0f6284514a1d3c72ee60a717a7b33b2915b886263 not found: ID does not exist" containerID="eca7621a2bf885ea3931e2c0f6284514a1d3c72ee60a717a7b33b2915b886263" Apr 20 19:31:27.831872 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:27.831845 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eca7621a2bf885ea3931e2c0f6284514a1d3c72ee60a717a7b33b2915b886263"} err="failed 
to get container status \"eca7621a2bf885ea3931e2c0f6284514a1d3c72ee60a717a7b33b2915b886263\": rpc error: code = NotFound desc = could not find container \"eca7621a2bf885ea3931e2c0f6284514a1d3c72ee60a717a7b33b2915b886263\": container with ID starting with eca7621a2bf885ea3931e2c0f6284514a1d3c72ee60a717a7b33b2915b886263 not found: ID does not exist" Apr 20 19:31:27.843418 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:27.843388 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-lkttm"] Apr 20 19:31:27.846251 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:27.846227 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-lkttm"] Apr 20 19:31:28.885427 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:28.885394 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8182c804-0d5b-496f-83fe-7c368281ae0e" path="/var/lib/kubelet/pods/8182c804-0d5b-496f-83fe-7c368281ae0e/volumes" Apr 20 19:31:42.801920 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:42.801884 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-q6dhq"] Apr 20 19:31:42.802479 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:42.802392 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8182c804-0d5b-496f-83fe-7c368281ae0e" containerName="authorino" Apr 20 19:31:42.802479 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:42.802411 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="8182c804-0d5b-496f-83fe-7c368281ae0e" containerName="authorino" Apr 20 19:31:42.802562 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:42.802484 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="8182c804-0d5b-496f-83fe-7c368281ae0e" containerName="authorino" Apr 20 19:31:42.810183 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:42.810154 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-q6dhq" Apr 20 19:31:42.812801 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:42.812777 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-v7rsj\"" Apr 20 19:31:42.813030 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:42.812803 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\"" Apr 20 19:31:42.813773 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:42.813253 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-q6dhq"] Apr 20 19:31:42.849578 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:42.849549 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/487c6d21-b807-4a61-ac2a-86b244e00818-data\") pod \"postgres-868db5846d-q6dhq\" (UID: \"487c6d21-b807-4a61-ac2a-86b244e00818\") " pod="opendatahub/postgres-868db5846d-q6dhq" Apr 20 19:31:42.849700 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:42.849586 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cfsj\" (UniqueName: \"kubernetes.io/projected/487c6d21-b807-4a61-ac2a-86b244e00818-kube-api-access-7cfsj\") pod \"postgres-868db5846d-q6dhq\" (UID: \"487c6d21-b807-4a61-ac2a-86b244e00818\") " pod="opendatahub/postgres-868db5846d-q6dhq" Apr 20 19:31:42.950389 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:42.950363 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7cfsj\" (UniqueName: \"kubernetes.io/projected/487c6d21-b807-4a61-ac2a-86b244e00818-kube-api-access-7cfsj\") pod \"postgres-868db5846d-q6dhq\" (UID: \"487c6d21-b807-4a61-ac2a-86b244e00818\") " pod="opendatahub/postgres-868db5846d-q6dhq" Apr 20 19:31:42.950536 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:42.950465 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/487c6d21-b807-4a61-ac2a-86b244e00818-data\") pod \"postgres-868db5846d-q6dhq\" (UID: \"487c6d21-b807-4a61-ac2a-86b244e00818\") " pod="opendatahub/postgres-868db5846d-q6dhq" Apr 20 19:31:42.950828 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:42.950811 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/487c6d21-b807-4a61-ac2a-86b244e00818-data\") pod \"postgres-868db5846d-q6dhq\" (UID: \"487c6d21-b807-4a61-ac2a-86b244e00818\") " pod="opendatahub/postgres-868db5846d-q6dhq" Apr 20 19:31:42.958811 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:42.958784 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cfsj\" (UniqueName: \"kubernetes.io/projected/487c6d21-b807-4a61-ac2a-86b244e00818-kube-api-access-7cfsj\") pod \"postgres-868db5846d-q6dhq\" (UID: \"487c6d21-b807-4a61-ac2a-86b244e00818\") " pod="opendatahub/postgres-868db5846d-q6dhq" Apr 20 19:31:43.124049 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:43.123959 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/postgres-868db5846d-q6dhq" Apr 20 19:31:43.239828 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:43.239801 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-q6dhq"] Apr 20 19:31:43.242713 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:31:43.242684 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod487c6d21_b807_4a61_ac2a_86b244e00818.slice/crio-09218c45e8e91b3ee3dd0920e032e00c2654848a2d09bce3c891489a0e811a43 WatchSource:0}: Error finding container 09218c45e8e91b3ee3dd0920e032e00c2654848a2d09bce3c891489a0e811a43: Status 404 returned error can't find the container with id 09218c45e8e91b3ee3dd0920e032e00c2654848a2d09bce3c891489a0e811a43 Apr 20 19:31:43.877341 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:43.877301 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-q6dhq" event={"ID":"487c6d21-b807-4a61-ac2a-86b244e00818","Type":"ContainerStarted","Data":"09218c45e8e91b3ee3dd0920e032e00c2654848a2d09bce3c891489a0e811a43"} Apr 20 19:31:48.895407 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:48.895369 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-q6dhq" event={"ID":"487c6d21-b807-4a61-ac2a-86b244e00818","Type":"ContainerStarted","Data":"461d7e24a99d12c3165fca7e279813f048c3c48e3679626e7a51ce61c9b9d91a"} Apr 20 19:31:48.895843 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:48.895474 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-q6dhq" Apr 20 19:31:54.926901 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:54.926869 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-q6dhq" Apr 20 19:31:54.942854 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:54.942791 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-q6dhq" podStartSLOduration=8.024378537 podStartE2EDuration="12.942775978s" podCreationTimestamp="2026-04-20 19:31:42 +0000 UTC" firstStartedPulling="2026-04-20 19:31:43.244087536 +0000 UTC m=+666.863769801" lastFinishedPulling="2026-04-20 19:31:48.162484968 +0000 UTC m=+671.782167242" observedRunningTime="2026-04-20 19:31:48.909600616 +0000 UTC m=+672.529282903" watchObservedRunningTime="2026-04-20 19:31:54.942775978 +0000 UTC m=+678.562458264" Apr 20 19:31:55.775737 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:55.775698 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7b7f49b5fd-nwdld"] Apr 20 19:31:55.779229 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:55.779208 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7b7f49b5fd-nwdld" Apr 20 19:31:55.781647 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:55.781627 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-psz6s\"" Apr 20 19:31:55.784468 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:55.784443 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7b7f49b5fd-nwdld"] Apr 20 19:31:55.851577 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:55.851542 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b24rp\" (UniqueName: \"kubernetes.io/projected/50f01997-49ce-4d88-b486-2a15457e57c2-kube-api-access-b24rp\") pod \"authorino-7b7f49b5fd-nwdld\" (UID: \"50f01997-49ce-4d88-b486-2a15457e57c2\") " pod="kuadrant-system/authorino-7b7f49b5fd-nwdld" Apr 20 19:31:55.952322 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:55.952284 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b24rp\" (UniqueName: \"kubernetes.io/projected/50f01997-49ce-4d88-b486-2a15457e57c2-kube-api-access-b24rp\") pod \"authorino-7b7f49b5fd-nwdld\" (UID: \"50f01997-49ce-4d88-b486-2a15457e57c2\") " pod="kuadrant-system/authorino-7b7f49b5fd-nwdld" Apr 20 19:31:55.960889 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:55.960858 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b24rp\" (UniqueName: \"kubernetes.io/projected/50f01997-49ce-4d88-b486-2a15457e57c2-kube-api-access-b24rp\") pod \"authorino-7b7f49b5fd-nwdld\" (UID: \"50f01997-49ce-4d88-b486-2a15457e57c2\") " pod="kuadrant-system/authorino-7b7f49b5fd-nwdld" Apr 20 19:31:56.089450 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:56.089418 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7b7f49b5fd-nwdld" Apr 20 19:31:56.090347 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:56.090012 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7b7f49b5fd-nwdld"] Apr 20 19:31:56.118471 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:56.118435 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-775b5c4d9d-6z6sq"] Apr 20 19:31:56.122721 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:56.122702 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-775b5c4d9d-6z6sq" Apr 20 19:31:56.125214 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:56.125186 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 20 19:31:56.128845 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:56.128822 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-775b5c4d9d-6z6sq"] Apr 20 19:31:56.210445 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:56.210372 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7b7f49b5fd-nwdld"] Apr 20 19:31:56.213166 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:31:56.213132 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod50f01997_49ce_4d88_b486_2a15457e57c2.slice/crio-42a302a89dcb283ca0af87fcf98cbd6062ac0311e81af813504c6a88ea876669 WatchSource:0}: Error finding container 42a302a89dcb283ca0af87fcf98cbd6062ac0311e81af813504c6a88ea876669: Status 404 returned error can't find the container with id 42a302a89dcb283ca0af87fcf98cbd6062ac0311e81af813504c6a88ea876669 Apr 20 19:31:56.254791 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:56.254762 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q9rvf\" (UniqueName: \"kubernetes.io/projected/9a35f91c-3af7-4a77-a9c2-20aee010d9b3-kube-api-access-q9rvf\") pod \"authorino-775b5c4d9d-6z6sq\" (UID: \"9a35f91c-3af7-4a77-a9c2-20aee010d9b3\") " pod="kuadrant-system/authorino-775b5c4d9d-6z6sq" Apr 20 19:31:56.254903 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:56.254808 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/9a35f91c-3af7-4a77-a9c2-20aee010d9b3-tls-cert\") pod \"authorino-775b5c4d9d-6z6sq\" (UID: \"9a35f91c-3af7-4a77-a9c2-20aee010d9b3\") " pod="kuadrant-system/authorino-775b5c4d9d-6z6sq" Apr 20 19:31:56.355970 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:56.355912 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q9rvf\" (UniqueName: \"kubernetes.io/projected/9a35f91c-3af7-4a77-a9c2-20aee010d9b3-kube-api-access-q9rvf\") pod \"authorino-775b5c4d9d-6z6sq\" (UID: \"9a35f91c-3af7-4a77-a9c2-20aee010d9b3\") " pod="kuadrant-system/authorino-775b5c4d9d-6z6sq" Apr 20 19:31:56.355970 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:56.355951 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/9a35f91c-3af7-4a77-a9c2-20aee010d9b3-tls-cert\") pod \"authorino-775b5c4d9d-6z6sq\" (UID: \"9a35f91c-3af7-4a77-a9c2-20aee010d9b3\") " pod="kuadrant-system/authorino-775b5c4d9d-6z6sq" Apr 20 19:31:56.358256 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:56.358235 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/9a35f91c-3af7-4a77-a9c2-20aee010d9b3-tls-cert\") pod \"authorino-775b5c4d9d-6z6sq\" (UID: \"9a35f91c-3af7-4a77-a9c2-20aee010d9b3\") " pod="kuadrant-system/authorino-775b5c4d9d-6z6sq" Apr 20 19:31:56.363240 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:56.363221 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q9rvf\" (UniqueName: \"kubernetes.io/projected/9a35f91c-3af7-4a77-a9c2-20aee010d9b3-kube-api-access-q9rvf\") pod 
\"authorino-775b5c4d9d-6z6sq\" (UID: \"9a35f91c-3af7-4a77-a9c2-20aee010d9b3\") " pod="kuadrant-system/authorino-775b5c4d9d-6z6sq" Apr 20 19:31:56.433393 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:56.433371 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-775b5c4d9d-6z6sq" Apr 20 19:31:56.550742 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:56.550714 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-775b5c4d9d-6z6sq"] Apr 20 19:31:56.553173 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:31:56.553144 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9a35f91c_3af7_4a77_a9c2_20aee010d9b3.slice/crio-ceaa2b46934ecac99a088c0d136e1cb1b4383f9947c2876dea7f739064f1e11e WatchSource:0}: Error finding container ceaa2b46934ecac99a088c0d136e1cb1b4383f9947c2876dea7f739064f1e11e: Status 404 returned error can't find the container with id ceaa2b46934ecac99a088c0d136e1cb1b4383f9947c2876dea7f739064f1e11e Apr 20 19:31:56.922167 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:56.922143 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-775b5c4d9d-6z6sq" event={"ID":"9a35f91c-3af7-4a77-a9c2-20aee010d9b3","Type":"ContainerStarted","Data":"ceaa2b46934ecac99a088c0d136e1cb1b4383f9947c2876dea7f739064f1e11e"} Apr 20 19:31:56.923362 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:56.923341 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7b7f49b5fd-nwdld" event={"ID":"50f01997-49ce-4d88-b486-2a15457e57c2","Type":"ContainerStarted","Data":"1fbdc9e082e84df5346febd8fb97542908fa92f8cdfd79cc484de8885b80b2e0"} Apr 20 19:31:56.923462 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:56.923368 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7b7f49b5fd-nwdld" event={"ID":"50f01997-49ce-4d88-b486-2a15457e57c2","Type":"ContainerStarted","Data":"42a302a89dcb283ca0af87fcf98cbd6062ac0311e81af813504c6a88ea876669"} Apr 20 19:31:56.923519 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:56.923458 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7b7f49b5fd-nwdld" podUID="50f01997-49ce-4d88-b486-2a15457e57c2" containerName="authorino" containerID="cri-o://1fbdc9e082e84df5346febd8fb97542908fa92f8cdfd79cc484de8885b80b2e0" gracePeriod=30 Apr 20 19:31:56.941131 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:56.941089 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7b7f49b5fd-nwdld" podStartSLOduration=1.522505489 podStartE2EDuration="1.941075859s" podCreationTimestamp="2026-04-20 19:31:55 +0000 UTC" firstStartedPulling="2026-04-20 19:31:56.214456433 +0000 UTC m=+679.834138701" lastFinishedPulling="2026-04-20 19:31:56.633026806 +0000 UTC m=+680.252709071" observedRunningTime="2026-04-20 19:31:56.939527039 +0000 UTC m=+680.559209327" watchObservedRunningTime="2026-04-20 19:31:56.941075859 +0000 UTC m=+680.560758145" Apr 20 19:31:57.165508 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:57.165486 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7b7f49b5fd-nwdld" Apr 20 19:31:57.263235 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:57.263212 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b24rp\" (UniqueName: \"kubernetes.io/projected/50f01997-49ce-4d88-b486-2a15457e57c2-kube-api-access-b24rp\") pod \"50f01997-49ce-4d88-b486-2a15457e57c2\" (UID: \"50f01997-49ce-4d88-b486-2a15457e57c2\") " Apr 20 19:31:57.265160 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:57.265135 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/50f01997-49ce-4d88-b486-2a15457e57c2-kube-api-access-b24rp" (OuterVolumeSpecName: "kube-api-access-b24rp") pod "50f01997-49ce-4d88-b486-2a15457e57c2" (UID: "50f01997-49ce-4d88-b486-2a15457e57c2"). InnerVolumeSpecName "kube-api-access-b24rp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:31:57.364195 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:57.364168 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-b24rp\" (UniqueName: \"kubernetes.io/projected/50f01997-49ce-4d88-b486-2a15457e57c2-kube-api-access-b24rp\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 20 19:31:57.927806 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:57.927770 2564 generic.go:358] "Generic (PLEG): container finished" podID="50f01997-49ce-4d88-b486-2a15457e57c2" containerID="1fbdc9e082e84df5346febd8fb97542908fa92f8cdfd79cc484de8885b80b2e0" exitCode=0 Apr 20 19:31:57.927989 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:57.927817 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7b7f49b5fd-nwdld" Apr 20 19:31:57.927989 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:57.927851 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7b7f49b5fd-nwdld" event={"ID":"50f01997-49ce-4d88-b486-2a15457e57c2","Type":"ContainerDied","Data":"1fbdc9e082e84df5346febd8fb97542908fa92f8cdfd79cc484de8885b80b2e0"} Apr 20 19:31:57.927989 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:57.927885 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7b7f49b5fd-nwdld" event={"ID":"50f01997-49ce-4d88-b486-2a15457e57c2","Type":"ContainerDied","Data":"42a302a89dcb283ca0af87fcf98cbd6062ac0311e81af813504c6a88ea876669"} Apr 20 19:31:57.927989 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:57.927902 2564 scope.go:117] "RemoveContainer" containerID="1fbdc9e082e84df5346febd8fb97542908fa92f8cdfd79cc484de8885b80b2e0" Apr 20 19:31:57.929324 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:57.929299 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-775b5c4d9d-6z6sq" event={"ID":"9a35f91c-3af7-4a77-a9c2-20aee010d9b3","Type":"ContainerStarted","Data":"d4334ef3aaa6b411062cd56abcff38224855601b72415ff976d4c580bea9cd3e"} Apr 20 19:31:57.936114 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:57.936096 2564 scope.go:117] "RemoveContainer" containerID="1fbdc9e082e84df5346febd8fb97542908fa92f8cdfd79cc484de8885b80b2e0" Apr 20 19:31:57.936369 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:31:57.936352 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fbdc9e082e84df5346febd8fb97542908fa92f8cdfd79cc484de8885b80b2e0\": container with ID starting with 1fbdc9e082e84df5346febd8fb97542908fa92f8cdfd79cc484de8885b80b2e0 not found: ID does not exist" 
containerID="1fbdc9e082e84df5346febd8fb97542908fa92f8cdfd79cc484de8885b80b2e0" Apr 20 19:31:57.936416 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:57.936376 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fbdc9e082e84df5346febd8fb97542908fa92f8cdfd79cc484de8885b80b2e0"} err="failed to get container status \"1fbdc9e082e84df5346febd8fb97542908fa92f8cdfd79cc484de8885b80b2e0\": rpc error: code = NotFound desc = could not find container \"1fbdc9e082e84df5346febd8fb97542908fa92f8cdfd79cc484de8885b80b2e0\": container with ID starting with 1fbdc9e082e84df5346febd8fb97542908fa92f8cdfd79cc484de8885b80b2e0 not found: ID does not exist" Apr 20 19:31:57.947600 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:57.947544 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-775b5c4d9d-6z6sq" podStartSLOduration=1.6351380309999999 podStartE2EDuration="1.947530263s" podCreationTimestamp="2026-04-20 19:31:56 +0000 UTC" firstStartedPulling="2026-04-20 19:31:56.55446928 +0000 UTC m=+680.174151545" lastFinishedPulling="2026-04-20 19:31:56.866861509 +0000 UTC m=+680.486543777" observedRunningTime="2026-04-20 19:31:57.945390284 +0000 UTC m=+681.565072570" watchObservedRunningTime="2026-04-20 19:31:57.947530263 +0000 UTC m=+681.567212551" Apr 20 19:31:57.965200 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:57.963487 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7b7f49b5fd-nwdld"] Apr 20 19:31:57.967808 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:57.967393 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7b7f49b5fd-nwdld"] Apr 20 19:31:58.885973 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:31:58.885940 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="50f01997-49ce-4d88-b486-2a15457e57c2" path="/var/lib/kubelet/pods/50f01997-49ce-4d88-b486-2a15457e57c2/volumes" Apr 20 19:33:02.579396 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.579355 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr"] Apr 20 19:33:02.579783 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.579678 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="50f01997-49ce-4d88-b486-2a15457e57c2" containerName="authorino" Apr 20 19:33:02.579783 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.579689 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="50f01997-49ce-4d88-b486-2a15457e57c2" containerName="authorino" Apr 20 19:33:02.579783 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.579764 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="50f01997-49ce-4d88-b486-2a15457e57c2" containerName="authorino" Apr 20 19:33:02.582946 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.582925 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr" Apr 20 19:33:02.585466 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.585443 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 20 19:33:02.585572 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.585488 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 20 19:33:02.586461 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.586445 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-9dkrq\"" Apr 20 19:33:02.586527 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.586450 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 20 19:33:02.593715 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.593694 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr"] Apr 20 19:33:02.649168 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.649138 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c258eebe-2258-45d6-ad25-1081afc3434e-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr\" (UID: \"c258eebe-2258-45d6-ad25-1081afc3434e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr" Apr 20 19:33:02.649168 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.649169 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c258eebe-2258-45d6-ad25-1081afc3434e-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr\" (UID: \"c258eebe-2258-45d6-ad25-1081afc3434e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr" Apr 20 19:33:02.649330 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.649189 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c258eebe-2258-45d6-ad25-1081afc3434e-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr\" (UID: \"c258eebe-2258-45d6-ad25-1081afc3434e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr" Apr 20 19:33:02.649330 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.649218 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c258eebe-2258-45d6-ad25-1081afc3434e-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr\" (UID: \"c258eebe-2258-45d6-ad25-1081afc3434e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr" Apr 20 19:33:02.649330 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.649259 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c258eebe-2258-45d6-ad25-1081afc3434e-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr\" (UID: \"c258eebe-2258-45d6-ad25-1081afc3434e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr" Apr 20 19:33:02.649330 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.649302 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-rstxf\" (UniqueName: \"kubernetes.io/projected/c258eebe-2258-45d6-ad25-1081afc3434e-kube-api-access-rstxf\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr\" (UID: \"c258eebe-2258-45d6-ad25-1081afc3434e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr" Apr 20 19:33:02.749748 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.749719 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c258eebe-2258-45d6-ad25-1081afc3434e-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr\" (UID: \"c258eebe-2258-45d6-ad25-1081afc3434e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr" Apr 20 19:33:02.749748 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.749750 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c258eebe-2258-45d6-ad25-1081afc3434e-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr\" (UID: \"c258eebe-2258-45d6-ad25-1081afc3434e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr" Apr 20 19:33:02.749921 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.749769 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c258eebe-2258-45d6-ad25-1081afc3434e-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr\" (UID: \"c258eebe-2258-45d6-ad25-1081afc3434e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr" Apr 20 19:33:02.749921 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.749790 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c258eebe-2258-45d6-ad25-1081afc3434e-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr\" (UID: \"c258eebe-2258-45d6-ad25-1081afc3434e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr" Apr 20 19:33:02.749921 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.749818 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c258eebe-2258-45d6-ad25-1081afc3434e-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr\" (UID: \"c258eebe-2258-45d6-ad25-1081afc3434e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr" Apr 20 19:33:02.749921 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.749866 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rstxf\" (UniqueName: \"kubernetes.io/projected/c258eebe-2258-45d6-ad25-1081afc3434e-kube-api-access-rstxf\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr\" (UID: \"c258eebe-2258-45d6-ad25-1081afc3434e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr" Apr 20 19:33:02.750190 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.750174 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c258eebe-2258-45d6-ad25-1081afc3434e-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr\" (UID: \"c258eebe-2258-45d6-ad25-1081afc3434e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr" Apr 20 19:33:02.750331 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.750308 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/c258eebe-2258-45d6-ad25-1081afc3434e-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr\" (UID: \"c258eebe-2258-45d6-ad25-1081afc3434e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr" Apr 20 19:33:02.750479 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.750369 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c258eebe-2258-45d6-ad25-1081afc3434e-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr\" (UID: \"c258eebe-2258-45d6-ad25-1081afc3434e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr" Apr 20 19:33:02.752378 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.752357 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c258eebe-2258-45d6-ad25-1081afc3434e-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr\" (UID: \"c258eebe-2258-45d6-ad25-1081afc3434e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr" Apr 20 19:33:02.752766 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.752747 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c258eebe-2258-45d6-ad25-1081afc3434e-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr\" (UID: \"c258eebe-2258-45d6-ad25-1081afc3434e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr" Apr 20 19:33:02.757742 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.757720 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rstxf\" (UniqueName: \"kubernetes.io/projected/c258eebe-2258-45d6-ad25-1081afc3434e-kube-api-access-rstxf\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr\" (UID: \"c258eebe-2258-45d6-ad25-1081afc3434e\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr" Apr 20 19:33:02.893185 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:02.893129 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr" Apr 20 19:33:03.015035 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:03.014984 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr"] Apr 20 19:33:03.018358 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:33:03.018324 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc258eebe_2258_45d6_ad25_1081afc3434e.slice/crio-8dfc545e6baa9bde9eae18986d53f382518c8bf5f7738b5ac622ecf8e4f4a17e WatchSource:0}: Error finding container 8dfc545e6baa9bde9eae18986d53f382518c8bf5f7738b5ac622ecf8e4f4a17e: Status 404 returned error can't find the container with id 8dfc545e6baa9bde9eae18986d53f382518c8bf5f7738b5ac622ecf8e4f4a17e Apr 20 19:33:03.020087 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:03.020068 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 19:33:03.150518 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:03.150437 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr" event={"ID":"c258eebe-2258-45d6-ad25-1081afc3434e","Type":"ContainerStarted","Data":"8dfc545e6baa9bde9eae18986d53f382518c8bf5f7738b5ac622ecf8e4f4a17e"} Apr 20 19:33:08.172850 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:08.172817 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr" event={"ID":"c258eebe-2258-45d6-ad25-1081afc3434e","Type":"ContainerStarted","Data":"7a6515565373db00d05b55abc5df2bb051c9639fcedd83aef71022cd80f1fcf1"} Apr 20 19:33:14.193805 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:14.193769 2564 generic.go:358] "Generic (PLEG): container finished" podID="c258eebe-2258-45d6-ad25-1081afc3434e" containerID="7a6515565373db00d05b55abc5df2bb051c9639fcedd83aef71022cd80f1fcf1" exitCode=0 Apr 20 19:33:14.194169 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:14.193840 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr" event={"ID":"c258eebe-2258-45d6-ad25-1081afc3434e","Type":"ContainerDied","Data":"7a6515565373db00d05b55abc5df2bb051c9639fcedd83aef71022cd80f1fcf1"} Apr 20 19:33:16.201529 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:16.201495 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr" event={"ID":"c258eebe-2258-45d6-ad25-1081afc3434e","Type":"ContainerStarted","Data":"8e2dfa99265c43ff0ecfd76b6ea1b58444fd0551c0836fa285f1e01988e2a1ba"} Apr 20 19:33:16.201889 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:16.201704 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr" Apr 20 19:33:16.220648 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:16.220604 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr" podStartSLOduration=1.961456855 podStartE2EDuration="14.220589171s" podCreationTimestamp="2026-04-20 19:33:02 +0000 UTC" firstStartedPulling="2026-04-20 19:33:03.020196114 +0000 UTC m=+746.639878380" lastFinishedPulling="2026-04-20 19:33:15.279328429 +0000 UTC m=+758.899010696" observedRunningTime="2026-04-20 19:33:16.219260419 +0000 UTC m=+759.838942719" watchObservedRunningTime="2026-04-20 19:33:16.220589171 +0000 UTC m=+759.840271461" Apr 
20 19:33:27.217623 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:33:27.217590 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr" Apr 20 19:34:10.167848 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:10.167754 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6"] Apr 20 19:34:10.171373 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:10.171351 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6" Apr 20 19:34:10.173929 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:10.173903 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\"" Apr 20 19:34:10.181513 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:10.181492 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6"] Apr 20 19:34:10.266359 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:10.266327 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f8734a13-7145-4cb7-8d66-f37f8c5b86e0-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tlrm6\" (UID: \"f8734a13-7145-4cb7-8d66-f37f8c5b86e0\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6" Apr 20 19:34:10.266500 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:10.266366 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f8734a13-7145-4cb7-8d66-f37f8c5b86e0-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tlrm6\" (UID: \"f8734a13-7145-4cb7-8d66-f37f8c5b86e0\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6" Apr 20 19:34:10.266500 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:10.266455 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8734a13-7145-4cb7-8d66-f37f8c5b86e0-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tlrm6\" (UID: \"f8734a13-7145-4cb7-8d66-f37f8c5b86e0\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6" Apr 20 19:34:10.266612 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:10.266501 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f8734a13-7145-4cb7-8d66-f37f8c5b86e0-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tlrm6\" (UID: \"f8734a13-7145-4cb7-8d66-f37f8c5b86e0\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6" Apr 20 19:34:10.266612 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:10.266556 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-np95m\" (UniqueName: \"kubernetes.io/projected/f8734a13-7145-4cb7-8d66-f37f8c5b86e0-kube-api-access-np95m\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tlrm6\" (UID: \"f8734a13-7145-4cb7-8d66-f37f8c5b86e0\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6" Apr 20 19:34:10.266612 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:10.266594 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/f8734a13-7145-4cb7-8d66-f37f8c5b86e0-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tlrm6\" (UID: \"f8734a13-7145-4cb7-8d66-f37f8c5b86e0\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6" Apr 20 19:34:10.367282 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:10.367253 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8734a13-7145-4cb7-8d66-f37f8c5b86e0-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tlrm6\" (UID: \"f8734a13-7145-4cb7-8d66-f37f8c5b86e0\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6" Apr 20 19:34:10.367438 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:10.367287 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f8734a13-7145-4cb7-8d66-f37f8c5b86e0-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tlrm6\" (UID: \"f8734a13-7145-4cb7-8d66-f37f8c5b86e0\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6" Apr 20 19:34:10.367438 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:10.367307 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-np95m\" (UniqueName: \"kubernetes.io/projected/f8734a13-7145-4cb7-8d66-f37f8c5b86e0-kube-api-access-np95m\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tlrm6\" (UID: \"f8734a13-7145-4cb7-8d66-f37f8c5b86e0\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6" Apr 20 19:34:10.367438 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:10.367324 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f8734a13-7145-4cb7-8d66-f37f8c5b86e0-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tlrm6\" (UID: \"f8734a13-7145-4cb7-8d66-f37f8c5b86e0\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6" Apr 20 19:34:10.367605 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:10.367453 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f8734a13-7145-4cb7-8d66-f37f8c5b86e0-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tlrm6\" (UID: \"f8734a13-7145-4cb7-8d66-f37f8c5b86e0\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6" Apr 20 19:34:10.367605 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:10.367508 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f8734a13-7145-4cb7-8d66-f37f8c5b86e0-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tlrm6\" (UID: \"f8734a13-7145-4cb7-8d66-f37f8c5b86e0\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6" Apr 20 19:34:10.367704 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:10.367671 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f8734a13-7145-4cb7-8d66-f37f8c5b86e0-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tlrm6\" (UID: \"f8734a13-7145-4cb7-8d66-f37f8c5b86e0\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6" Apr 20 19:34:10.367704 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:10.367678 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/f8734a13-7145-4cb7-8d66-f37f8c5b86e0-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tlrm6\" (UID: \"f8734a13-7145-4cb7-8d66-f37f8c5b86e0\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6" Apr 20 19:34:10.367852 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:10.367832 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f8734a13-7145-4cb7-8d66-f37f8c5b86e0-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tlrm6\" (UID: \"f8734a13-7145-4cb7-8d66-f37f8c5b86e0\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6" Apr 20 19:34:10.369488 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:10.369463 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f8734a13-7145-4cb7-8d66-f37f8c5b86e0-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tlrm6\" (UID: \"f8734a13-7145-4cb7-8d66-f37f8c5b86e0\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6" Apr 20 19:34:10.369761 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:10.369745 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f8734a13-7145-4cb7-8d66-f37f8c5b86e0-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tlrm6\" (UID: \"f8734a13-7145-4cb7-8d66-f37f8c5b86e0\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6" Apr 20 19:34:10.375128 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:10.375106 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-np95m\" (UniqueName: \"kubernetes.io/projected/f8734a13-7145-4cb7-8d66-f37f8c5b86e0-kube-api-access-np95m\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-tlrm6\" (UID: \"f8734a13-7145-4cb7-8d66-f37f8c5b86e0\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6" Apr 20 19:34:10.481976 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:10.481920 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6" Apr 20 19:34:10.604959 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:10.604934 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6"] Apr 20 19:34:10.607022 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:34:10.606980 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8734a13_7145_4cb7_8d66_f37f8c5b86e0.slice/crio-04122d6037bc588482408f103e2f8dba9f7282f9626816bb112697d9750fe4ed WatchSource:0}: Error finding container 04122d6037bc588482408f103e2f8dba9f7282f9626816bb112697d9750fe4ed: Status 404 returned error can't find the container with id 04122d6037bc588482408f103e2f8dba9f7282f9626816bb112697d9750fe4ed Apr 20 19:34:11.386664 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:11.386624 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6" event={"ID":"f8734a13-7145-4cb7-8d66-f37f8c5b86e0","Type":"ContainerStarted","Data":"085247c683d94f9c3a80c4179da2cea90f677e20394e1f7fb8e6ea5ed588097e"} Apr 20 19:34:11.387043 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:11.386669 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6" event={"ID":"f8734a13-7145-4cb7-8d66-f37f8c5b86e0","Type":"ContainerStarted","Data":"04122d6037bc588482408f103e2f8dba9f7282f9626816bb112697d9750fe4ed"} Apr 20 19:34:16.406037 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:16.406004 2564 generic.go:358] "Generic (PLEG): container finished" podID="f8734a13-7145-4cb7-8d66-f37f8c5b86e0" containerID="085247c683d94f9c3a80c4179da2cea90f677e20394e1f7fb8e6ea5ed588097e" exitCode=0 Apr 20 19:34:16.406037 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:16.406018 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6" event={"ID":"f8734a13-7145-4cb7-8d66-f37f8c5b86e0","Type":"ContainerDied","Data":"085247c683d94f9c3a80c4179da2cea90f677e20394e1f7fb8e6ea5ed588097e"} Apr 20 19:34:17.411185 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:17.411149 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6" event={"ID":"f8734a13-7145-4cb7-8d66-f37f8c5b86e0","Type":"ContainerStarted","Data":"75b637ca176311c1f541a8f9b8e26e3abdb6629516be4dd24361835366bfa5c5"} Apr 20 19:34:17.411571 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:17.411364 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6" Apr 20 19:34:17.430594 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:17.430546 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6" podStartSLOduration=7.218266857 podStartE2EDuration="7.430530326s" podCreationTimestamp="2026-04-20 19:34:10 +0000 UTC" firstStartedPulling="2026-04-20 19:34:16.406586821 +0000 UTC m=+820.026269086" lastFinishedPulling="2026-04-20 19:34:16.61885029 +0000 UTC m=+820.238532555" observedRunningTime="2026-04-20 19:34:17.428537335 +0000 UTC m=+821.048219623" watchObservedRunningTime="2026-04-20 19:34:17.430530326 +0000 UTC m=+821.050212613" Apr 20 19:34:28.427364 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:28.427334 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-tlrm6" Apr 20 19:34:38.839113 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:38.839078 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-775b5c4d9d-6z6sq"] Apr 20 19:34:38.839531 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:38.839280 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-775b5c4d9d-6z6sq" podUID="9a35f91c-3af7-4a77-a9c2-20aee010d9b3" containerName="authorino" containerID="cri-o://d4334ef3aaa6b411062cd56abcff38224855601b72415ff976d4c580bea9cd3e" gracePeriod=30 Apr 20 19:34:39.091558 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:39.091502 2564 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-775b5c4d9d-6z6sq" Apr 20 19:34:39.196749 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:39.196723 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q9rvf\" (UniqueName: \"kubernetes.io/projected/9a35f91c-3af7-4a77-a9c2-20aee010d9b3-kube-api-access-q9rvf\") pod \"9a35f91c-3af7-4a77-a9c2-20aee010d9b3\" (UID: \"9a35f91c-3af7-4a77-a9c2-20aee010d9b3\") " Apr 20 19:34:39.196866 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:39.196782 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/9a35f91c-3af7-4a77-a9c2-20aee010d9b3-tls-cert\") pod \"9a35f91c-3af7-4a77-a9c2-20aee010d9b3\" (UID: \"9a35f91c-3af7-4a77-a9c2-20aee010d9b3\") " Apr 20 19:34:39.198787 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:39.198758 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a35f91c-3af7-4a77-a9c2-20aee010d9b3-kube-api-access-q9rvf" (OuterVolumeSpecName: "kube-api-access-q9rvf") pod "9a35f91c-3af7-4a77-a9c2-20aee010d9b3" (UID: "9a35f91c-3af7-4a77-a9c2-20aee010d9b3"). InnerVolumeSpecName "kube-api-access-q9rvf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:34:39.207919 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:39.207893 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a35f91c-3af7-4a77-a9c2-20aee010d9b3-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "9a35f91c-3af7-4a77-a9c2-20aee010d9b3" (UID: "9a35f91c-3af7-4a77-a9c2-20aee010d9b3"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 19:34:39.297897 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:39.297865 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-q9rvf\" (UniqueName: \"kubernetes.io/projected/9a35f91c-3af7-4a77-a9c2-20aee010d9b3-kube-api-access-q9rvf\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 20 19:34:39.297897 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:39.297893 2564 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/9a35f91c-3af7-4a77-a9c2-20aee010d9b3-tls-cert\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 20 19:34:39.485412 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:39.485324 2564 generic.go:358] "Generic (PLEG): container finished" podID="9a35f91c-3af7-4a77-a9c2-20aee010d9b3" containerID="d4334ef3aaa6b411062cd56abcff38224855601b72415ff976d4c580bea9cd3e" exitCode=0 Apr 20 19:34:39.485412 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:39.485375 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-775b5c4d9d-6z6sq" Apr 20 19:34:39.485412 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:39.485385 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-775b5c4d9d-6z6sq" event={"ID":"9a35f91c-3af7-4a77-a9c2-20aee010d9b3","Type":"ContainerDied","Data":"d4334ef3aaa6b411062cd56abcff38224855601b72415ff976d4c580bea9cd3e"} Apr 20 19:34:39.485632 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:39.485420 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-775b5c4d9d-6z6sq" event={"ID":"9a35f91c-3af7-4a77-a9c2-20aee010d9b3","Type":"ContainerDied","Data":"ceaa2b46934ecac99a088c0d136e1cb1b4383f9947c2876dea7f739064f1e11e"} Apr 20 19:34:39.485632 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:39.485440 2564 scope.go:117] "RemoveContainer" containerID="d4334ef3aaa6b411062cd56abcff38224855601b72415ff976d4c580bea9cd3e" Apr 20 19:34:39.495503 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:39.495480 2564 scope.go:117] "RemoveContainer" containerID="d4334ef3aaa6b411062cd56abcff38224855601b72415ff976d4c580bea9cd3e" Apr 20 19:34:39.495937 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:34:39.495912 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4334ef3aaa6b411062cd56abcff38224855601b72415ff976d4c580bea9cd3e\": container with ID starting with d4334ef3aaa6b411062cd56abcff38224855601b72415ff976d4c580bea9cd3e not found: ID does not exist" containerID="d4334ef3aaa6b411062cd56abcff38224855601b72415ff976d4c580bea9cd3e" Apr 20 19:34:39.496053 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:39.495944 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4334ef3aaa6b411062cd56abcff38224855601b72415ff976d4c580bea9cd3e"} err="failed to get container status \"d4334ef3aaa6b411062cd56abcff38224855601b72415ff976d4c580bea9cd3e\": rpc error: code = NotFound desc = could not find container \"d4334ef3aaa6b411062cd56abcff38224855601b72415ff976d4c580bea9cd3e\": container with ID starting with d4334ef3aaa6b411062cd56abcff38224855601b72415ff976d4c580bea9cd3e not found: ID does not exist" Apr 20 19:34:39.508272 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:39.508246 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-775b5c4d9d-6z6sq"] Apr 20 19:34:39.512759 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:39.512740 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-775b5c4d9d-6z6sq"] Apr 20 19:34:40.891050 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:34:40.890989 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a35f91c-3af7-4a77-a9c2-20aee010d9b3" path="/var/lib/kubelet/pods/9a35f91c-3af7-4a77-a9c2-20aee010d9b3/volumes" Apr 20 19:35:36.811494 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:35:36.811420 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5z9f_e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7/ovn-acl-logging/0.log" Apr 20 19:35:36.813353 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:35:36.813323 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5z9f_e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7/ovn-acl-logging/0.log" Apr 20 19:40:36.833197 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:40:36.833166 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5z9f_e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7/ovn-acl-logging/0.log" Apr 20 19:40:36.836475 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:40:36.836454 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5z9f_e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7/ovn-acl-logging/0.log" Apr 20 19:45:00.142700 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:00.142670 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29611905-gv6m8"] Apr 20 19:45:00.144983 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:00.143005 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a35f91c-3af7-4a77-a9c2-20aee010d9b3" containerName="authorino" Apr 20 19:45:00.144983 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:00.143017 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a35f91c-3af7-4a77-a9c2-20aee010d9b3" containerName="authorino" Apr 20 19:45:00.144983 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:00.143077 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a35f91c-3af7-4a77-a9c2-20aee010d9b3" containerName="authorino" Apr 20 19:45:00.145795 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:00.145779 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611905-gv6m8" Apr 20 19:45:00.148198 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:00.148173 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-ssfr9\"" Apr 20 19:45:00.159853 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:00.159828 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611905-gv6m8"] Apr 20 19:45:00.298634 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:00.298606 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk8q7\" (UniqueName: \"kubernetes.io/projected/cb021f40-fc6e-44c8-b241-5680d703830a-kube-api-access-wk8q7\") pod \"maas-api-key-cleanup-29611905-gv6m8\" (UID: \"cb021f40-fc6e-44c8-b241-5680d703830a\") " pod="opendatahub/maas-api-key-cleanup-29611905-gv6m8" Apr 20 19:45:00.399736 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:00.399675 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wk8q7\" (UniqueName: \"kubernetes.io/projected/cb021f40-fc6e-44c8-b241-5680d703830a-kube-api-access-wk8q7\") pod \"maas-api-key-cleanup-29611905-gv6m8\" (UID: \"cb021f40-fc6e-44c8-b241-5680d703830a\") " pod="opendatahub/maas-api-key-cleanup-29611905-gv6m8" Apr 20 19:45:00.407393 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:00.407372 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk8q7\" (UniqueName: \"kubernetes.io/projected/cb021f40-fc6e-44c8-b241-5680d703830a-kube-api-access-wk8q7\") pod \"maas-api-key-cleanup-29611905-gv6m8\" (UID: \"cb021f40-fc6e-44c8-b241-5680d703830a\") " pod="opendatahub/maas-api-key-cleanup-29611905-gv6m8" Apr 20 19:45:00.455721 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:00.455686 2564 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611905-gv6m8" Apr 20 19:45:00.574448 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:00.574421 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611905-gv6m8"] Apr 20 19:45:00.577232 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:45:00.577200 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb021f40_fc6e_44c8_b241_5680d703830a.slice/crio-742a1772c9e8668d6206a101370a1774a13778387d127667b3d6ec40548283dd WatchSource:0}: Error finding container 742a1772c9e8668d6206a101370a1774a13778387d127667b3d6ec40548283dd: Status 404 returned error can't find the container with id 742a1772c9e8668d6206a101370a1774a13778387d127667b3d6ec40548283dd Apr 20 19:45:00.579366 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:00.579348 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 19:45:01.549690 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:01.549654 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611905-gv6m8" event={"ID":"cb021f40-fc6e-44c8-b241-5680d703830a","Type":"ContainerStarted","Data":"742a1772c9e8668d6206a101370a1774a13778387d127667b3d6ec40548283dd"} Apr 20 19:45:03.558589 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:03.558552 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611905-gv6m8" event={"ID":"cb021f40-fc6e-44c8-b241-5680d703830a","Type":"ContainerStarted","Data":"2bd59db16c5a4ffbe654500471d8d95b7afbed0adc315cf8e2b4aa958a21f8c9"} Apr 20 19:45:03.578761 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:03.578708 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29611905-gv6m8" podStartSLOduration=1.6006725579999999 podStartE2EDuration="3.578694459s" podCreationTimestamp="2026-04-20 19:45:00 +0000 UTC" firstStartedPulling="2026-04-20 19:45:00.579473118 +0000 UTC m=+1464.199155383" lastFinishedPulling="2026-04-20 19:45:02.557495018 +0000 UTC m=+1466.177177284" observedRunningTime="2026-04-20 19:45:03.576466 +0000 UTC m=+1467.196148289" watchObservedRunningTime="2026-04-20 19:45:03.578694459 +0000 UTC m=+1467.198376746" Apr 20 19:45:23.630607 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:23.630567 2564 generic.go:358] "Generic (PLEG): container finished" podID="cb021f40-fc6e-44c8-b241-5680d703830a" containerID="2bd59db16c5a4ffbe654500471d8d95b7afbed0adc315cf8e2b4aa958a21f8c9" exitCode=6 Apr 20 19:45:23.630972 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:23.630643 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611905-gv6m8" event={"ID":"cb021f40-fc6e-44c8-b241-5680d703830a","Type":"ContainerDied","Data":"2bd59db16c5a4ffbe654500471d8d95b7afbed0adc315cf8e2b4aa958a21f8c9"} Apr 20 19:45:23.630972 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:23.630921 2564 scope.go:117] "RemoveContainer" containerID="2bd59db16c5a4ffbe654500471d8d95b7afbed0adc315cf8e2b4aa958a21f8c9" Apr 20 19:45:24.634940 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:24.634903 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611905-gv6m8" event={"ID":"cb021f40-fc6e-44c8-b241-5680d703830a","Type":"ContainerStarted","Data":"7de7a19592d047b54ae01df6a9434e52097e8b9ab8adc102ef369b1aaeb28085"} Apr 20 19:45:36.858431 ip-10-0-129-98 
kubenswrapper[2564]: I0420 19:45:36.858402 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5z9f_e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7/ovn-acl-logging/0.log" Apr 20 19:45:36.863749 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:36.863728 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5z9f_e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7/ovn-acl-logging/0.log" Apr 20 19:45:44.701500 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:44.701461 2564 generic.go:358] "Generic (PLEG): container finished" podID="cb021f40-fc6e-44c8-b241-5680d703830a" containerID="7de7a19592d047b54ae01df6a9434e52097e8b9ab8adc102ef369b1aaeb28085" exitCode=6 Apr 20 19:45:44.701902 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:44.701510 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611905-gv6m8" event={"ID":"cb021f40-fc6e-44c8-b241-5680d703830a","Type":"ContainerDied","Data":"7de7a19592d047b54ae01df6a9434e52097e8b9ab8adc102ef369b1aaeb28085"} Apr 20 19:45:44.701902 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:44.701546 2564 scope.go:117] "RemoveContainer" containerID="2bd59db16c5a4ffbe654500471d8d95b7afbed0adc315cf8e2b4aa958a21f8c9" Apr 20 19:45:44.701902 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:44.701871 2564 scope.go:117] "RemoveContainer" containerID="7de7a19592d047b54ae01df6a9434e52097e8b9ab8adc102ef369b1aaeb28085" Apr 20 19:45:44.702126 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:45:44.702104 2564 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29611905-gv6m8_opendatahub(cb021f40-fc6e-44c8-b241-5680d703830a)\"" pod="opendatahub/maas-api-key-cleanup-29611905-gv6m8" podUID="cb021f40-fc6e-44c8-b241-5680d703830a" Apr 20 19:45:57.881520 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:57.881441 2564 scope.go:117] "RemoveContainer" containerID="7de7a19592d047b54ae01df6a9434e52097e8b9ab8adc102ef369b1aaeb28085" Apr 20 19:45:58.749982 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:58.749946 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611905-gv6m8" event={"ID":"cb021f40-fc6e-44c8-b241-5680d703830a","Type":"ContainerStarted","Data":"e1a222864b801eedd56b3223d9e4147c3de6069e008730aecfe9246b481818ea"} Apr 20 19:45:58.907066 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:58.907037 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611905-gv6m8"] Apr 20 19:45:59.753324 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:45:59.753276 2564 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29611905-gv6m8" podUID="cb021f40-fc6e-44c8-b241-5680d703830a" containerName="cleanup" containerID="cri-o://e1a222864b801eedd56b3223d9e4147c3de6069e008730aecfe9246b481818ea" gracePeriod=30 Apr 20 19:46:18.591555 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:46:18.591526 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611905-gv6m8" Apr 20 19:46:18.710239 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:46:18.710165 2564 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk8q7\" (UniqueName: \"kubernetes.io/projected/cb021f40-fc6e-44c8-b241-5680d703830a-kube-api-access-wk8q7\") pod \"cb021f40-fc6e-44c8-b241-5680d703830a\" (UID: \"cb021f40-fc6e-44c8-b241-5680d703830a\") " Apr 20 19:46:18.712144 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:46:18.712121 2564 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb021f40-fc6e-44c8-b241-5680d703830a-kube-api-access-wk8q7" (OuterVolumeSpecName: "kube-api-access-wk8q7") pod "cb021f40-fc6e-44c8-b241-5680d703830a" (UID: "cb021f40-fc6e-44c8-b241-5680d703830a"). InnerVolumeSpecName "kube-api-access-wk8q7". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 19:46:18.811690 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:46:18.811663 2564 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wk8q7\" (UniqueName: \"kubernetes.io/projected/cb021f40-fc6e-44c8-b241-5680d703830a-kube-api-access-wk8q7\") on node \"ip-10-0-129-98.ec2.internal\" DevicePath \"\"" Apr 20 19:46:18.817883 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:46:18.817858 2564 generic.go:358] "Generic (PLEG): container finished" podID="cb021f40-fc6e-44c8-b241-5680d703830a" containerID="e1a222864b801eedd56b3223d9e4147c3de6069e008730aecfe9246b481818ea" exitCode=6 Apr 20 19:46:18.817979 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:46:18.817890 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611905-gv6m8" event={"ID":"cb021f40-fc6e-44c8-b241-5680d703830a","Type":"ContainerDied","Data":"e1a222864b801eedd56b3223d9e4147c3de6069e008730aecfe9246b481818ea"} Apr 20 19:46:18.817979 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:46:18.817912 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29611905-gv6m8" event={"ID":"cb021f40-fc6e-44c8-b241-5680d703830a","Type":"ContainerDied","Data":"742a1772c9e8668d6206a101370a1774a13778387d127667b3d6ec40548283dd"} Apr 20 19:46:18.817979 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:46:18.817932 2564 scope.go:117] "RemoveContainer" containerID="e1a222864b801eedd56b3223d9e4147c3de6069e008730aecfe9246b481818ea" Apr 20 19:46:18.817979 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:46:18.817941 2564 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29611905-gv6m8" Apr 20 19:46:18.830600 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:46:18.830579 2564 scope.go:117] "RemoveContainer" containerID="7de7a19592d047b54ae01df6a9434e52097e8b9ab8adc102ef369b1aaeb28085" Apr 20 19:46:18.837460 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:46:18.837443 2564 scope.go:117] "RemoveContainer" containerID="e1a222864b801eedd56b3223d9e4147c3de6069e008730aecfe9246b481818ea" Apr 20 19:46:18.837712 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:46:18.837693 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1a222864b801eedd56b3223d9e4147c3de6069e008730aecfe9246b481818ea\": container with ID starting with e1a222864b801eedd56b3223d9e4147c3de6069e008730aecfe9246b481818ea not found: ID does not exist" containerID="e1a222864b801eedd56b3223d9e4147c3de6069e008730aecfe9246b481818ea" Apr 20 19:46:18.837750 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:46:18.837721 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1a222864b801eedd56b3223d9e4147c3de6069e008730aecfe9246b481818ea"} err="failed to get container status \"e1a222864b801eedd56b3223d9e4147c3de6069e008730aecfe9246b481818ea\": rpc error: code = NotFound desc = could not find container \"e1a222864b801eedd56b3223d9e4147c3de6069e008730aecfe9246b481818ea\": container with ID starting with e1a222864b801eedd56b3223d9e4147c3de6069e008730aecfe9246b481818ea not found: ID does not exist" Apr 20 19:46:18.837750 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:46:18.837737 2564 scope.go:117] "RemoveContainer" containerID="7de7a19592d047b54ae01df6a9434e52097e8b9ab8adc102ef369b1aaeb28085" Apr 20 19:46:18.837962 ip-10-0-129-98 kubenswrapper[2564]: E0420 19:46:18.837943 2564 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7de7a19592d047b54ae01df6a9434e52097e8b9ab8adc102ef369b1aaeb28085\": container with ID starting with 7de7a19592d047b54ae01df6a9434e52097e8b9ab8adc102ef369b1aaeb28085 not found: ID does not exist" containerID="7de7a19592d047b54ae01df6a9434e52097e8b9ab8adc102ef369b1aaeb28085" Apr 20 19:46:18.838012 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:46:18.837969 2564 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7de7a19592d047b54ae01df6a9434e52097e8b9ab8adc102ef369b1aaeb28085"} err="failed to get container status \"7de7a19592d047b54ae01df6a9434e52097e8b9ab8adc102ef369b1aaeb28085\": rpc error: code = NotFound desc = could not find container \"7de7a19592d047b54ae01df6a9434e52097e8b9ab8adc102ef369b1aaeb28085\": container with ID starting with 7de7a19592d047b54ae01df6a9434e52097e8b9ab8adc102ef369b1aaeb28085 not found: ID does not exist" Apr 20 19:46:18.845064 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:46:18.845035 2564 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611905-gv6m8"] Apr 20 19:46:18.848950 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:46:18.848930 2564 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29611905-gv6m8"] Apr 20 19:46:18.885801 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:46:18.885774 2564 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb021f40-fc6e-44c8-b241-5680d703830a" path="/var/lib/kubelet/pods/cb021f40-fc6e-44c8-b241-5680d703830a/volumes" Apr 20 19:50:36.879242 ip-10-0-129-98 
Apr 20 19:50:36.885600 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:50:36.885573 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5z9f_e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7/ovn-acl-logging/0.log"
Apr 20 19:55:36.899686 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:55:36.899657 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5z9f_e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7/ovn-acl-logging/0.log"
Apr 20 19:55:36.906106 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:55:36.906087 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5z9f_e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7/ovn-acl-logging/0.log"
Apr 20 19:57:08.175520 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:08.175488 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-mtf2s_9b6fe48f-a6fd-414a-bc81-d38839becb12/manager/0.log"
Apr 20 19:57:08.567813 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:08.567786 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-pxsf7_6f1ce413-5690-4f7b-b12c-bf1b2d1805be/manager/2.log"
Apr 20 19:57:08.815900 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:08.815866 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-7875d57869-vt2rf_bbfc9f71-287b-42f9-a950-0903e2b4cd98/manager/0.log"
Apr 20 19:57:09.057616 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:09.057592 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-q6dhq_487c6d21-b807-4a61-ac2a-86b244e00818/postgres/0.log"
Apr 20 19:57:10.482020 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:10.481969 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-jq4p7_fba780a4-2deb-431d-8bad-3c9f2c0a3b5c/manager/0.log"
Apr 20 19:57:10.598417 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:10.598368 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-kxn6j_304440ae-0748-4cd1-864a-2fecc74a4a70/manager/0.log"
Apr 20 19:57:11.205297 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:11.205266 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-57q25_15c73260-2002-469a-8b4f-457a8253449f/manager/0.log"
Apr 20 19:57:11.687895 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:11.687866 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-xv4n8_4afa2c06-bd15-4b40-8ad0-bf2401e8b782/discovery/0.log"
Apr 20 19:57:11.913519 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:11.913482 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-65b68d668c-qqcmw_b637c838-b115-4a7f-8513-b59711277667/kube-auth-proxy/0.log"
Apr 20 19:57:12.514335 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:12.514309 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr_c258eebe-2258-45d6-ad25-1081afc3434e/storage-initializer/0.log"
Apr 20 19:57:12.521883 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:12.521863 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-lszmr_c258eebe-2258-45d6-ad25-1081afc3434e/main/0.log"
Apr 20 19:57:12.755873 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:12.755846 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-tlrm6_f8734a13-7145-4cb7-8d66-f37f8c5b86e0/main/0.log"
Apr 20 19:57:12.764912 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:12.764830 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-tlrm6_f8734a13-7145-4cb7-8d66-f37f8c5b86e0/storage-initializer/0.log"
Apr 20 19:57:20.741859 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:20.741811 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-xszmt_3875e401-7a70-4f30-84ea-3a7119c9272a/global-pull-secret-syncer/0.log"
Apr 20 19:57:20.814881 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:20.814844 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-nqj8h_e6858c48-14e7-4cc5-a1cd-0554a465f2db/konnectivity-agent/0.log"
Apr 20 19:57:20.862945 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:20.862919 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-129-98.ec2.internal_87dc53c55f73620bf5df44e2826c141e/haproxy/0.log"
Apr 20 19:57:25.977055 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:25.977015 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-jq4p7_fba780a4-2deb-431d-8bad-3c9f2c0a3b5c/manager/0.log"
Apr 20 19:57:26.000778 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:26.000757 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-kxn6j_304440ae-0748-4cd1-864a-2fecc74a4a70/manager/0.log"
Apr 20 19:57:26.230433 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:26.230344 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-85c4996f8c-57q25_15c73260-2002-469a-8b4f-457a8253449f/manager/0.log"
Apr 20 19:57:28.053897 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:28.053856 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7d465989d-274nv_15a24773-49ae-4109-a012-83724a1e5f19/metrics-server/0.log"
Apr 20 19:57:28.102505 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:28.102477 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hmzfb_724ddbfe-066b-4314-9b9b-de1647c2747b/node-exporter/0.log"
Apr 20 19:57:28.120468 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:28.120442 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hmzfb_724ddbfe-066b-4314-9b9b-de1647c2747b/kube-rbac-proxy/0.log"
Apr 20 19:57:28.139314 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:28.139293 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hmzfb_724ddbfe-066b-4314-9b9b-de1647c2747b/init-textfile/0.log"
Apr 20 19:57:28.514531 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:28.514461 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-m9nwj_15d7f65a-133b-4803-9e61-e6081374a2ef/prometheus-operator/0.log"
Apr 20 19:57:28.530398 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:28.530376 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-m9nwj_15d7f65a-133b-4803-9e61-e6081374a2ef/kube-rbac-proxy/0.log"
Apr 20 19:57:29.556171 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.556136 2564 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-r5pdh/perf-node-gather-daemonset-hqrsr"]
Apr 20 19:57:29.556625 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.556535 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb021f40-fc6e-44c8-b241-5680d703830a" containerName="cleanup"
Apr 20 19:57:29.556625 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.556550 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb021f40-fc6e-44c8-b241-5680d703830a" containerName="cleanup"
Apr 20 19:57:29.556625 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.556563 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb021f40-fc6e-44c8-b241-5680d703830a" containerName="cleanup"
Apr 20 19:57:29.556625 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.556572 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb021f40-fc6e-44c8-b241-5680d703830a" containerName="cleanup"
Apr 20 19:57:29.556625 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.556585 2564 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cb021f40-fc6e-44c8-b241-5680d703830a" containerName="cleanup"
Apr 20 19:57:29.556625 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.556594 2564 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb021f40-fc6e-44c8-b241-5680d703830a" containerName="cleanup"
Apr 20 19:57:29.556902 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.556678 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb021f40-fc6e-44c8-b241-5680d703830a" containerName="cleanup"
Apr 20 19:57:29.556902 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.556695 2564 memory_manager.go:356] "RemoveStaleState removing state" podUID="cb021f40-fc6e-44c8-b241-5680d703830a" containerName="cleanup"
Apr 20 19:57:29.559730 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.559711 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-hqrsr"
Need to start a new one" pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-hqrsr" Apr 20 19:57:29.562402 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.562380 2564 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-r5pdh\"/\"default-dockercfg-cwh22\"" Apr 20 19:57:29.562518 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.562419 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-r5pdh\"/\"openshift-service-ca.crt\"" Apr 20 19:57:29.563310 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.563289 2564 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-r5pdh\"/\"kube-root-ca.crt\"" Apr 20 19:57:29.567940 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.567913 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r5pdh/perf-node-gather-daemonset-hqrsr"] Apr 20 19:57:29.729893 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.729861 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/448c7cae-d42d-401e-ab8d-de5eaec8cd9e-podres\") pod \"perf-node-gather-daemonset-hqrsr\" (UID: \"448c7cae-d42d-401e-ab8d-de5eaec8cd9e\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-hqrsr" Apr 20 19:57:29.730044 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.729911 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/448c7cae-d42d-401e-ab8d-de5eaec8cd9e-proc\") pod \"perf-node-gather-daemonset-hqrsr\" (UID: \"448c7cae-d42d-401e-ab8d-de5eaec8cd9e\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-hqrsr" Apr 20 19:57:29.730044 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.729986 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/448c7cae-d42d-401e-ab8d-de5eaec8cd9e-sys\") pod \"perf-node-gather-daemonset-hqrsr\" (UID: \"448c7cae-d42d-401e-ab8d-de5eaec8cd9e\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-hqrsr" Apr 20 19:57:29.730151 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.730062 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/448c7cae-d42d-401e-ab8d-de5eaec8cd9e-lib-modules\") pod \"perf-node-gather-daemonset-hqrsr\" (UID: \"448c7cae-d42d-401e-ab8d-de5eaec8cd9e\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-hqrsr" Apr 20 19:57:29.730151 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.730119 2564 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-srktk\" (UniqueName: \"kubernetes.io/projected/448c7cae-d42d-401e-ab8d-de5eaec8cd9e-kube-api-access-srktk\") pod \"perf-node-gather-daemonset-hqrsr\" (UID: \"448c7cae-d42d-401e-ab8d-de5eaec8cd9e\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-hqrsr" Apr 20 19:57:29.831402 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.831340 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/448c7cae-d42d-401e-ab8d-de5eaec8cd9e-lib-modules\") pod \"perf-node-gather-daemonset-hqrsr\" (UID: \"448c7cae-d42d-401e-ab8d-de5eaec8cd9e\") " 
pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-hqrsr" Apr 20 19:57:29.831402 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.831378 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-srktk\" (UniqueName: \"kubernetes.io/projected/448c7cae-d42d-401e-ab8d-de5eaec8cd9e-kube-api-access-srktk\") pod \"perf-node-gather-daemonset-hqrsr\" (UID: \"448c7cae-d42d-401e-ab8d-de5eaec8cd9e\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-hqrsr" Apr 20 19:57:29.831562 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.831407 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/448c7cae-d42d-401e-ab8d-de5eaec8cd9e-podres\") pod \"perf-node-gather-daemonset-hqrsr\" (UID: \"448c7cae-d42d-401e-ab8d-de5eaec8cd9e\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-hqrsr" Apr 20 19:57:29.831562 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.831445 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/448c7cae-d42d-401e-ab8d-de5eaec8cd9e-proc\") pod \"perf-node-gather-daemonset-hqrsr\" (UID: \"448c7cae-d42d-401e-ab8d-de5eaec8cd9e\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-hqrsr" Apr 20 19:57:29.831562 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.831485 2564 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/448c7cae-d42d-401e-ab8d-de5eaec8cd9e-sys\") pod \"perf-node-gather-daemonset-hqrsr\" (UID: \"448c7cae-d42d-401e-ab8d-de5eaec8cd9e\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-hqrsr" Apr 20 19:57:29.831562 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.831527 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/448c7cae-d42d-401e-ab8d-de5eaec8cd9e-lib-modules\") pod \"perf-node-gather-daemonset-hqrsr\" (UID: \"448c7cae-d42d-401e-ab8d-de5eaec8cd9e\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-hqrsr" Apr 20 19:57:29.831701 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.831568 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/448c7cae-d42d-401e-ab8d-de5eaec8cd9e-proc\") pod \"perf-node-gather-daemonset-hqrsr\" (UID: \"448c7cae-d42d-401e-ab8d-de5eaec8cd9e\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-hqrsr" Apr 20 19:57:29.831701 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.831583 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/448c7cae-d42d-401e-ab8d-de5eaec8cd9e-sys\") pod \"perf-node-gather-daemonset-hqrsr\" (UID: \"448c7cae-d42d-401e-ab8d-de5eaec8cd9e\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-hqrsr" Apr 20 19:57:29.831701 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.831565 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/448c7cae-d42d-401e-ab8d-de5eaec8cd9e-podres\") pod \"perf-node-gather-daemonset-hqrsr\" (UID: \"448c7cae-d42d-401e-ab8d-de5eaec8cd9e\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-hqrsr" Apr 20 19:57:29.839720 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.839694 2564 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-srktk\" (UniqueName: \"kubernetes.io/projected/448c7cae-d42d-401e-ab8d-de5eaec8cd9e-kube-api-access-srktk\") pod \"perf-node-gather-daemonset-hqrsr\" (UID: \"448c7cae-d42d-401e-ab8d-de5eaec8cd9e\") " pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-hqrsr" Apr 20 19:57:29.870775 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.870749 2564 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-hqrsr" Apr 20 19:57:29.989855 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.989826 2564 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-r5pdh/perf-node-gather-daemonset-hqrsr"] Apr 20 19:57:29.992941 ip-10-0-129-98 kubenswrapper[2564]: W0420 19:57:29.992917 2564 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod448c7cae_d42d_401e_ab8d_de5eaec8cd9e.slice/crio-b61292bab4b38afc1c5f29fbdaf96a9f07a000de850a9277762409881591bd3e WatchSource:0}: Error finding container b61292bab4b38afc1c5f29fbdaf96a9f07a000de850a9277762409881591bd3e: Status 404 returned error can't find the container with id b61292bab4b38afc1c5f29fbdaf96a9f07a000de850a9277762409881591bd3e Apr 20 19:57:29.995053 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:29.995036 2564 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 19:57:30.020519 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:30.020498 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-hqrsr" event={"ID":"448c7cae-d42d-401e-ab8d-de5eaec8cd9e","Type":"ContainerStarted","Data":"b61292bab4b38afc1c5f29fbdaf96a9f07a000de850a9277762409881591bd3e"} Apr 20 19:57:31.024634 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:31.024597 2564 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-hqrsr" event={"ID":"448c7cae-d42d-401e-ab8d-de5eaec8cd9e","Type":"ContainerStarted","Data":"d4e0d4e5dd32224910e443e1457d26b204ecdee58bf174db020d851c5d35f65f"} Apr 20 19:57:31.025060 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:31.024751 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-hqrsr" Apr 20 19:57:31.039525 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:31.039479 2564 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-hqrsr" podStartSLOduration=2.039467051 podStartE2EDuration="2.039467051s" podCreationTimestamp="2026-04-20 19:57:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:57:31.038488505 +0000 UTC m=+2214.658170793" watchObservedRunningTime="2026-04-20 19:57:31.039467051 +0000 UTC m=+2214.659149339" Apr 20 19:57:32.055285 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:32.055256 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ps5gm_5759257a-ffe8-4341-a52c-735c321d9f4a/dns/0.log" Apr 20 19:57:32.074876 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:32.074855 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-ps5gm_5759257a-ffe8-4341-a52c-735c321d9f4a/kube-rbac-proxy/0.log" Apr 20 19:57:32.139487 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:32.139459 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_node-resolver-6rnf5_a09cd02b-dfa8-4f51-abdc-9e5a0b219e23/dns-node-resolver/0.log" Apr 20 19:57:32.647383 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:32.647352 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-5c7r9_f57b12eb-90dc-43f3-b677-c16555487307/node-ca/0.log" Apr 20 19:57:33.595659 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:33.595633 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-xv4n8_4afa2c06-bd15-4b40-8ad0-bf2401e8b782/discovery/0.log" Apr 20 19:57:33.665608 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:33.665582 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-65b68d668c-qqcmw_b637c838-b115-4a7f-8513-b59711277667/kube-auth-proxy/0.log" Apr 20 19:57:34.310404 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:34.310370 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-8m7qk_5ca54d9a-af5f-4f4f-b135-bfc6c5824e75/serve-healthcheck-canary/0.log" Apr 20 19:57:34.815516 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:34.815484 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-52pdc_5e519934-7cb4-45a9-9e43-0ada33d7fc5c/kube-rbac-proxy/0.log" Apr 20 19:57:34.833836 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:34.833802 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-52pdc_5e519934-7cb4-45a9-9e43-0ada33d7fc5c/exporter/0.log" Apr 20 19:57:34.851594 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:34.851566 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-52pdc_5e519934-7cb4-45a9-9e43-0ada33d7fc5c/extractor/0.log" Apr 20 19:57:36.858621 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:36.858590 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-mtf2s_9b6fe48f-a6fd-414a-bc81-d38839becb12/manager/0.log" Apr 20 19:57:36.991525 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:36.991495 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-pxsf7_6f1ce413-5690-4f7b-b12c-bf1b2d1805be/manager/1.log" Apr 20 19:57:37.000686 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:37.000667 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-pxsf7_6f1ce413-5690-4f7b-b12c-bf1b2d1805be/manager/2.log" Apr 20 19:57:37.036900 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:37.036880 2564 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-r5pdh/perf-node-gather-daemonset-hqrsr" Apr 20 19:57:37.065527 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:37.065507 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-7875d57869-vt2rf_bbfc9f71-287b-42f9-a950-0903e2b4cd98/manager/0.log" Apr 20 19:57:37.110950 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:37.110904 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-q6dhq_487c6d21-b807-4a61-ac2a-86b244e00818/postgres/0.log" Apr 20 19:57:38.383291 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:38.383219 2564 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-hcfv5_a1e35ae5-5204-49e5-9d53-789b656366d2/openshift-lws-operator/0.log" Apr 20 19:57:44.249454 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:44.249419 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-62ns8_69310308-59c8-4043-9117-c0e3a4104e6e/kube-multus/0.log" Apr 20 19:57:44.271081 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:44.271060 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7stt2_f331ebff-9be1-4254-b31d-7bcbbc5bbf98/kube-multus-additional-cni-plugins/0.log" Apr 20 19:57:44.290005 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:44.289981 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7stt2_f331ebff-9be1-4254-b31d-7bcbbc5bbf98/egress-router-binary-copy/0.log" Apr 20 19:57:44.308901 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:44.308876 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7stt2_f331ebff-9be1-4254-b31d-7bcbbc5bbf98/cni-plugins/0.log" Apr 20 19:57:44.327138 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:44.327117 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7stt2_f331ebff-9be1-4254-b31d-7bcbbc5bbf98/bond-cni-plugin/0.log" Apr 20 19:57:44.346160 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:44.346137 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7stt2_f331ebff-9be1-4254-b31d-7bcbbc5bbf98/routeoverride-cni/0.log" Apr 20 19:57:44.363362 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:44.363342 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7stt2_f331ebff-9be1-4254-b31d-7bcbbc5bbf98/whereabouts-cni-bincopy/0.log" Apr 20 19:57:44.381427 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:44.381406 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-7stt2_f331ebff-9be1-4254-b31d-7bcbbc5bbf98/whereabouts-cni/0.log" Apr 20 19:57:44.823846 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:44.823818 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zdlvd_9b4f6ab3-fddd-446f-8cbf-e372e1b901fe/network-metrics-daemon/0.log" Apr 20 19:57:44.841647 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:44.841617 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zdlvd_9b4f6ab3-fddd-446f-8cbf-e372e1b901fe/kube-rbac-proxy/0.log" Apr 20 19:57:45.949143 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:45.949116 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5z9f_e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7/ovn-controller/0.log" Apr 20 19:57:45.967277 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:45.967255 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5z9f_e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7/ovn-acl-logging/0.log" Apr 20 19:57:45.977815 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:45.977798 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5z9f_e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7/ovn-acl-logging/1.log" Apr 20 19:57:45.995679 ip-10-0-129-98 kubenswrapper[2564]: I0420 
Apr 20 19:57:46.013775 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:46.013753 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5z9f_e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7/kube-rbac-proxy-ovn-metrics/0.log"
Apr 20 19:57:46.029706 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:46.029686 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5z9f_e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7/northd/0.log"
Apr 20 19:57:46.048735 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:46.048721 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5z9f_e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7/nbdb/0.log"
Apr 20 19:57:46.068588 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:46.068562 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5z9f_e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7/sbdb/0.log"
Apr 20 19:57:46.161118 ip-10-0-129-98 kubenswrapper[2564]: I0420 19:57:46.161094 2564 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-f5z9f_e0fd7043-be2f-4ea6-8e8e-0c1ad9b57cf7/ovnkube-controller/0.log"