Apr 20 19:22:53.977512 ip-10-0-132-159 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory Apr 20 19:22:53.977523 ip-10-0-132-159 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory Apr 20 19:22:53.977534 ip-10-0-132-159 systemd[1]: kubelet.service: Failed with result 'resources'. Apr 20 19:22:53.977846 ip-10-0-132-159 systemd[1]: Failed to start Kubernetes Kubelet. Apr 20 19:23:04.164104 ip-10-0-132-159 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found. Apr 20 19:23:04.164133 ip-10-0-132-159 systemd[1]: kubelet.service: Failed with result 'resources'. -- Boot fa725f84eeab49cabb300c9b5b35be66 -- Apr 20 19:25:17.161851 ip-10-0-132-159 systemd[1]: Starting Kubernetes Kubelet... Apr 20 19:25:17.730152 ip-10-0-132-159 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 20 19:25:17.730152 ip-10-0-132-159 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Apr 20 19:25:17.730152 ip-10-0-132-159 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Apr 20 19:25:17.730152 ip-10-0-132-159 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Apr 20 19:25:17.730152 ip-10-0-132-159 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Apr 20 19:25:17.731546 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.731446 2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Apr 20 19:25:17.734613 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734595 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 19:25:17.734613 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734612 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 19:25:17.734613 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734615 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 19:25:17.734702 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734619 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 19:25:17.734702 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734622 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 19:25:17.734702 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734625 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 19:25:17.734702 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734627 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 19:25:17.734702 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734630 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 19:25:17.734702 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734634 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 19:25:17.734702 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734637 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 19:25:17.734702 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734639 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 19:25:17.734702 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734642 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 19:25:17.734702 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734645 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 19:25:17.734702 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734647 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 19:25:17.734702 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734650 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 19:25:17.734702 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734653 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 19:25:17.734702 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734657 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 19:25:17.734702 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734660 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 19:25:17.734702 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734662 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 19:25:17.734702 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734665 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 19:25:17.734702 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734668 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 
19:25:17.734702 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734672 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 19:25:17.735188 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734676 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 19:25:17.735188 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734679 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 19:25:17.735188 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734683 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 19:25:17.735188 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734686 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 19:25:17.735188 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734689 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 19:25:17.735188 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734692 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 19:25:17.735188 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734696 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 19:25:17.735188 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734699 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 19:25:17.735188 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734701 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 19:25:17.735188 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734705 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 19:25:17.735188 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734708 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 19:25:17.735188 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734711 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 19:25:17.735188 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734714 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 19:25:17.735188 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734717 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 19:25:17.735188 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734720 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 19:25:17.735188 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734722 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 19:25:17.735188 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734725 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 19:25:17.735188 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734729 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 19:25:17.735188 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734731 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 19:25:17.735188 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734734 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 19:25:17.735948 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734737 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 20 19:25:17.735948 ip-10-0-132-159 
kubenswrapper[2570]: W0420 19:25:17.734740 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 19:25:17.735948 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734742 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 19:25:17.735948 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734745 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 19:25:17.735948 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734747 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 19:25:17.735948 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734750 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 19:25:17.735948 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734753 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 19:25:17.735948 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734757 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 19:25:17.735948 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734760 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 19:25:17.735948 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734762 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 19:25:17.735948 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734765 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 19:25:17.735948 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734767 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 19:25:17.735948 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734770 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 19:25:17.735948 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734773 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 19:25:17.735948 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734776 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 19:25:17.735948 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734778 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 19:25:17.735948 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734781 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 19:25:17.735948 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734783 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 19:25:17.735948 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734786 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 19:25:17.735948 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734788 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 19:25:17.736440 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734791 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 19:25:17.736440 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734794 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 19:25:17.736440 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734798 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 19:25:17.736440 ip-10-0-132-159 
kubenswrapper[2570]: W0420 19:25:17.734801 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 19:25:17.736440 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734803 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 19:25:17.736440 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734806 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 19:25:17.736440 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734808 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 19:25:17.736440 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734811 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 19:25:17.736440 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734814 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 19:25:17.736440 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734816 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 19:25:17.736440 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734819 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 19:25:17.736440 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734822 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 19:25:17.736440 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734824 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 19:25:17.736440 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734827 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 19:25:17.736440 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734830 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 19:25:17.736440 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734832 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 19:25:17.736440 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734836 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 19:25:17.736440 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734839 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 19:25:17.736440 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734841 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 19:25:17.737016 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734844 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 19:25:17.737016 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734847 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 19:25:17.737016 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734849 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 19:25:17.737016 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734852 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 19:25:17.737016 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.734854 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 19:25:17.737470 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737451 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 19:25:17.737470 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737470 2570 feature_gate.go:328] 
unrecognized feature gate: DNSNameResolver Apr 20 19:25:17.737609 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737476 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 19:25:17.737609 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737482 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 19:25:17.737609 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737486 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 19:25:17.737609 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737490 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 19:25:17.737609 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737494 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 19:25:17.737609 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737498 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 19:25:17.737609 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737502 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 19:25:17.737609 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737506 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 19:25:17.737609 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737511 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 19:25:17.737609 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737515 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 19:25:17.737609 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737519 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 19:25:17.737609 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737523 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 19:25:17.737609 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737527 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 19:25:17.737609 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737531 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 19:25:17.737609 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737535 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 19:25:17.737609 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737540 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 19:25:17.737609 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737544 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 19:25:17.737609 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737548 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 19:25:17.737609 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737552 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 19:25:17.737609 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737570 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 19:25:17.738464 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737575 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 19:25:17.738464 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737579 2570 feature_gate.go:328] unrecognized feature gate: 
NetworkSegmentation Apr 20 19:25:17.738464 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737583 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 19:25:17.738464 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737592 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 19:25:17.738464 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737598 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 19:25:17.738464 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737602 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 19:25:17.738464 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737606 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 19:25:17.738464 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737612 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 20 19:25:17.738464 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737617 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 19:25:17.738464 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737622 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 19:25:17.738464 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737626 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 19:25:17.738464 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737630 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 19:25:17.738464 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737634 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 19:25:17.738464 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737638 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 19:25:17.738464 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737642 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 19:25:17.738464 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737646 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 19:25:17.738464 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737650 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 19:25:17.738464 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737654 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 19:25:17.738464 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737659 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 19:25:17.738464 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737663 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 19:25:17.739361 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737669 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 20 19:25:17.739361 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737676 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 19:25:17.739361 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737681 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 19:25:17.739361 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737685 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 19:25:17.739361 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737690 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 19:25:17.739361 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737694 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 19:25:17.739361 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737699 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 19:25:17.739361 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737704 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 19:25:17.739361 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737708 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 19:25:17.739361 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737712 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 19:25:17.739361 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737716 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 19:25:17.739361 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737720 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 19:25:17.739361 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737725 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 19:25:17.739361 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737730 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 19:25:17.739361 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737734 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 19:25:17.739361 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737738 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 19:25:17.739361 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737743 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 19:25:17.739361 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737747 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 19:25:17.739361 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737751 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 19:25:17.739882 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737755 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 19:25:17.739882 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737760 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 19:25:17.739882 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737766 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 19:25:17.739882 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737771 2570 feature_gate.go:328] unrecognized feature gate: 
RouteAdvertisements Apr 20 19:25:17.739882 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737775 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 19:25:17.739882 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737779 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 19:25:17.739882 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737783 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 19:25:17.739882 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737788 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 19:25:17.739882 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737792 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 19:25:17.739882 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737796 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 19:25:17.739882 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737800 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 19:25:17.739882 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737804 2570 feature_gate.go:328] unrecognized feature gate: Example Apr 20 19:25:17.739882 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737809 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 19:25:17.739882 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737814 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 19:25:17.739882 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737818 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 19:25:17.739882 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737822 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 19:25:17.739882 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737828 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 19:25:17.739882 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737832 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 19:25:17.739882 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737836 2570 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 19:25:17.739882 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737840 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 19:25:17.740460 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737844 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 19:25:17.740460 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737848 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 19:25:17.740460 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737852 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 19:25:17.740460 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737856 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 19:25:17.740460 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.737861 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 19:25:17.740460 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.737979 2570 flags.go:64] FLAG: --address="0.0.0.0" Apr 20 19:25:17.740460 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.737991 2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 20 19:25:17.740460 
ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738003 2570 flags.go:64] FLAG: --anonymous-auth="true" Apr 20 19:25:17.740460 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738009 2570 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 20 19:25:17.740460 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738017 2570 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 20 19:25:17.740460 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738022 2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 20 19:25:17.740460 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738029 2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 20 19:25:17.740460 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738036 2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 20 19:25:17.740460 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738042 2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 20 19:25:17.740460 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738047 2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 20 19:25:17.740460 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738052 2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 20 19:25:17.740460 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738058 2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 20 19:25:17.740460 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738063 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 20 19:25:17.740460 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738068 2570 flags.go:64] FLAG: --cgroup-root="" Apr 20 19:25:17.740460 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738072 2570 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 20 19:25:17.740460 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738077 2570 flags.go:64] FLAG: --client-ca-file="" Apr 20 19:25:17.740460 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738082 2570 flags.go:64] FLAG: --cloud-config="" Apr 20 19:25:17.740460 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738086 2570 flags.go:64] FLAG: --cloud-provider="external" Apr 20 19:25:17.740460 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738091 2570 flags.go:64] FLAG: --cluster-dns="[]" Apr 20 19:25:17.741182 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738098 2570 flags.go:64] FLAG: --cluster-domain="" Apr 20 19:25:17.741182 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738103 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 20 19:25:17.741182 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738108 2570 flags.go:64] FLAG: --config-dir="" Apr 20 19:25:17.741182 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738113 2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 20 19:25:17.741182 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738118 2570 flags.go:64] FLAG: --container-log-max-files="5" Apr 20 19:25:17.741182 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738125 2570 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 20 19:25:17.741182 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738130 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 20 19:25:17.741182 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738136 2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 20 19:25:17.741182 ip-10-0-132-159 
kubenswrapper[2570]: I0420 19:25:17.738141 2570 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 20 19:25:17.741182 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738146 2570 flags.go:64] FLAG: --contention-profiling="false" Apr 20 19:25:17.741182 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738151 2570 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 20 19:25:17.741182 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738155 2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 20 19:25:17.741182 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738160 2570 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 20 19:25:17.741182 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738165 2570 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 20 19:25:17.741182 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738173 2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 20 19:25:17.741182 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738178 2570 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 20 19:25:17.741182 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738182 2570 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 20 19:25:17.741182 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738187 2570 flags.go:64] FLAG: --enable-load-reader="false" Apr 20 19:25:17.741182 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738192 2570 flags.go:64] FLAG: --enable-server="true" Apr 20 19:25:17.741182 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738197 2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 20 19:25:17.741182 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738204 2570 flags.go:64] FLAG: --event-burst="100" Apr 20 19:25:17.741182 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738209 2570 flags.go:64] FLAG: --event-qps="50" Apr 20 19:25:17.741182 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738215 2570 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 20 19:25:17.741182 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738220 2570 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 20 19:25:17.741182 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738225 2570 flags.go:64] FLAG: --eviction-hard="" Apr 20 19:25:17.741965 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738231 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 20 19:25:17.741965 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738236 2570 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 20 19:25:17.741965 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738241 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 20 19:25:17.741965 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738246 2570 flags.go:64] FLAG: --eviction-soft="" Apr 20 19:25:17.741965 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738251 2570 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 20 19:25:17.741965 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738256 2570 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 20 19:25:17.741965 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738260 2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 20 19:25:17.741965 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738265 2570 flags.go:64] FLAG: --experimental-mounter-path="" Apr 20 19:25:17.741965 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738270 2570 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 20 19:25:17.741965 
ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738275 2570 flags.go:64] FLAG: --fail-swap-on="true" Apr 20 19:25:17.741965 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738280 2570 flags.go:64] FLAG: --feature-gates="" Apr 20 19:25:17.741965 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738286 2570 flags.go:64] FLAG: --file-check-frequency="20s" Apr 20 19:25:17.741965 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738291 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 20 19:25:17.741965 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738298 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 20 19:25:17.741965 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738304 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 20 19:25:17.741965 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738309 2570 flags.go:64] FLAG: --healthz-port="10248" Apr 20 19:25:17.741965 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738315 2570 flags.go:64] FLAG: --help="false" Apr 20 19:25:17.741965 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738320 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-132-159.ec2.internal" Apr 20 19:25:17.741965 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738325 2570 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 19:25:17.741965 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738330 2570 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 19:25:17.741965 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738340 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 19:25:17.741965 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738346 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 19:25:17.741965 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738352 2570 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 19:25:17.742517 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738357 2570 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 19:25:17.742517 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738362 2570 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 19:25:17.742517 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738366 2570 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 19:25:17.742517 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738371 2570 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 19:25:17.742517 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738376 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 19:25:17.742517 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738381 2570 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 19:25:17.742517 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738386 2570 flags.go:64] FLAG: --kube-reserved="" Apr 20 19:25:17.742517 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738391 2570 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 19:25:17.742517 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738395 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 19:25:17.742517 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738400 2570 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 19:25:17.742517 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738405 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 19:25:17.742517 
ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738410 2570 flags.go:64] FLAG: --lock-file="" Apr 20 19:25:17.742517 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738414 2570 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 19:25:17.742517 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738419 2570 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 19:25:17.742517 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738424 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 19:25:17.742517 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738434 2570 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 19:25:17.742517 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738439 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 19:25:17.742517 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738443 2570 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 19:25:17.742517 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738448 2570 flags.go:64] FLAG: --logging-format="text" Apr 20 19:25:17.742517 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738453 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 19:25:17.742517 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738458 2570 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 19:25:17.742517 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738463 2570 flags.go:64] FLAG: --manifest-url="" Apr 20 19:25:17.742517 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738468 2570 flags.go:64] FLAG: --manifest-url-header="" Apr 20 19:25:17.742517 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738476 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 19:25:17.742517 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738481 2570 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 19:25:17.743263 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738489 2570 flags.go:64] FLAG: --max-pods="110" Apr 20 19:25:17.743263 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738494 2570 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 19:25:17.743263 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738499 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 19:25:17.743263 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738504 2570 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 19:25:17.743263 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738511 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 19:25:17.743263 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738516 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 19:25:17.743263 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738520 2570 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 19:25:17.743263 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738525 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 19:25:17.743263 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738538 2570 flags.go:64] FLAG: --node-status-max-images="50" Apr 20 19:25:17.743263 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738543 2570 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 19:25:17.743263 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738548 2570 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 19:25:17.743263 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738553 2570 flags.go:64] FLAG: 
--pod-cidr="" Apr 20 19:25:17.743263 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738579 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 19:25:17.743263 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738588 2570 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 19:25:17.743263 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738592 2570 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 19:25:17.743263 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738598 2570 flags.go:64] FLAG: --pods-per-core="0" Apr 20 19:25:17.743263 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738602 2570 flags.go:64] FLAG: --port="10250" Apr 20 19:25:17.743263 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738607 2570 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 19:25:17.743263 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738612 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0872085c74458af53" Apr 20 19:25:17.743263 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738617 2570 flags.go:64] FLAG: --qos-reserved="" Apr 20 19:25:17.743263 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738622 2570 flags.go:64] FLAG: --read-only-port="10255" Apr 20 19:25:17.743263 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738627 2570 flags.go:64] FLAG: --register-node="true" Apr 20 19:25:17.743263 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738632 2570 flags.go:64] FLAG: --register-schedulable="true" Apr 20 19:25:17.743263 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738636 2570 flags.go:64] FLAG: --register-with-taints="" Apr 20 19:25:17.743893 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738642 2570 flags.go:64] FLAG: --registry-burst="10" Apr 20 19:25:17.743893 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738647 2570 flags.go:64] FLAG: --registry-qps="5" Apr 20 19:25:17.743893 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738652 2570 flags.go:64] FLAG: --reserved-cpus="" Apr 20 19:25:17.743893 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738657 2570 flags.go:64] FLAG: --reserved-memory="" Apr 20 19:25:17.743893 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738663 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 20 19:25:17.743893 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738668 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 20 19:25:17.743893 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738673 2570 flags.go:64] FLAG: --rotate-certificates="false" Apr 20 19:25:17.743893 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738679 2570 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 20 19:25:17.743893 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738685 2570 flags.go:64] FLAG: --runonce="false" Apr 20 19:25:17.743893 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738690 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 20 19:25:17.743893 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738695 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 20 19:25:17.743893 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738700 2570 flags.go:64] FLAG: --seccomp-default="false" Apr 20 19:25:17.743893 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738705 2570 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 20 19:25:17.743893 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738712 2570 
flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 20 19:25:17.743893 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738717 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 20 19:25:17.743893 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738722 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 20 19:25:17.743893 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738727 2570 flags.go:64] FLAG: --storage-driver-password="root" Apr 20 19:25:17.743893 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738732 2570 flags.go:64] FLAG: --storage-driver-secure="false" Apr 20 19:25:17.743893 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738736 2570 flags.go:64] FLAG: --storage-driver-table="stats" Apr 20 19:25:17.743893 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738741 2570 flags.go:64] FLAG: --storage-driver-user="root" Apr 20 19:25:17.743893 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738746 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 20 19:25:17.743893 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738751 2570 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 20 19:25:17.743893 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738756 2570 flags.go:64] FLAG: --system-cgroups="" Apr 20 19:25:17.743893 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738760 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 20 19:25:17.743893 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738769 2570 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 20 19:25:17.744514 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738774 2570 flags.go:64] FLAG: --tls-cert-file="" Apr 20 19:25:17.744514 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738778 2570 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 20 19:25:17.744514 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738784 2570 flags.go:64] FLAG: --tls-min-version="" Apr 20 19:25:17.744514 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738789 2570 flags.go:64] FLAG: --tls-private-key-file="" Apr 20 19:25:17.744514 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738794 2570 flags.go:64] FLAG: --topology-manager-policy="none" Apr 20 19:25:17.744514 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738799 2570 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 20 19:25:17.744514 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738804 2570 flags.go:64] FLAG: --topology-manager-scope="container" Apr 20 19:25:17.744514 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738808 2570 flags.go:64] FLAG: --v="2" Apr 20 19:25:17.744514 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738815 2570 flags.go:64] FLAG: --version="false" Apr 20 19:25:17.744514 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738821 2570 flags.go:64] FLAG: --vmodule="" Apr 20 19:25:17.744514 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738829 2570 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 20 19:25:17.744514 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.738835 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 20 19:25:17.744514 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.738977 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 19:25:17.744514 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.738985 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. 
It will be removed in a future release.
Apr 20 19:25:17.744514 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.738992 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 19:25:17.744514 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.738999 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:25:17.744514 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739004 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:25:17.744514 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739008 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 19:25:17.744514 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739013 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 19:25:17.744514 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739018 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:25:17.744514 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739025 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:25:17.744514 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739030 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 19:25:17.745068 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739035 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 19:25:17.745068 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739039 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 19:25:17.745068 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739044 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 19:25:17.745068 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739048 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 19:25:17.745068 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739053 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:25:17.745068 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739057 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 19:25:17.745068 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739061 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 19:25:17.745068 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739065 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 19:25:17.745068 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739070 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:25:17.745068 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739075 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:25:17.745068 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739079 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:25:17.745068 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739083 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:25:17.745068 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739087 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 19:25:17.745068 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739091 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:25:17.745068 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739096 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 19:25:17.745068 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739100 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 19:25:17.745068 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739104 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 19:25:17.745068 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739108 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 19:25:17.745068 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739112 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:25:17.745068 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739117 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:25:17.746694 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739121 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 19:25:17.746694 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739126 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 19:25:17.746694 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739131 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 19:25:17.746694 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739135 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 19:25:17.746694 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739139 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:25:17.746694 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739144 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 19:25:17.746694 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739149 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:25:17.746694 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739154 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 19:25:17.746694 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739158 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:25:17.746694 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739163 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 19:25:17.746694 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739169 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 19:25:17.746694 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739173 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:25:17.746694 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739177 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:25:17.746694 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739181 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:25:17.746694 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739186 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 19:25:17.746694 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739190 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:25:17.746694 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739194 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 19:25:17.746694 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739198 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 19:25:17.746694 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739203 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 19:25:17.746694 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739207 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 19:25:17.747545 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739211 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 19:25:17.747545 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739215 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 19:25:17.747545 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739219 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 19:25:17.747545 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739224 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 19:25:17.747545 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739228 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:25:17.747545 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739232 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 19:25:17.747545 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739236 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:25:17.747545 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739241 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 19:25:17.747545 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739245 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 19:25:17.747545 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739249 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 19:25:17.747545 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739254 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 19:25:17.747545 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739258 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 19:25:17.747545 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739262 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 19:25:17.747545 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739266 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:25:17.747545 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739270 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:25:17.747545 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739274 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 19:25:17.747545 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739278 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 19:25:17.747545 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739286 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 19:25:17.747545 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739293 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 19:25:17.748289 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739298 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:25:17.748289 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739303 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:25:17.748289 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739307 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 19:25:17.748289 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739313 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:25:17.748289 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739317 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 19:25:17.748289 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739321 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:25:17.748289 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739325 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:25:17.748289 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739330 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 19:25:17.748289 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739334 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 19:25:17.748289 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739338 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:25:17.748289 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739343 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:25:17.748289 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739347 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:25:17.748289 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739352 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 19:25:17.748289 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739356 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 19:25:17.748289 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739360 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 19:25:17.748289 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739364 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 19:25:17.748289 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.739368 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 19:25:17.748834 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.740158 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 19:25:17.748834 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.748715 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 19:25:17.748834 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.748740 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 19:25:17.748834 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748824 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:25:17.748834 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748833 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 19:25:17.748834 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748837 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 19:25:17.749119 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748842 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 19:25:17.749119 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748847 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 19:25:17.749119 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748851 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 19:25:17.749119 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748856 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 19:25:17.749119 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748860 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 19:25:17.749119 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748865 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 19:25:17.749119 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748869 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:25:17.749119 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748873 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:25:17.749119 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748878 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 19:25:17.749119 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748883 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 19:25:17.749119 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748888 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:25:17.749119 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748892 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:25:17.749119 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748896 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 19:25:17.749119 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748900 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 19:25:17.749119 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748905 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:25:17.749119 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748909 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 19:25:17.749119 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748914 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 19:25:17.749119 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748918 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 19:25:17.749119 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748922 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 19:25:17.749119 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748927 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 19:25:17.750072 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748931 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 19:25:17.750072 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748935 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:25:17.750072 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748939 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 19:25:17.750072 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748943 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:25:17.750072 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748947 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:25:17.750072 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748952 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 19:25:17.750072 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748957 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 19:25:17.750072 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748962 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 19:25:17.750072 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748966 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 19:25:17.750072 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748970 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 19:25:17.750072 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748977 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 19:25:17.750072 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748983 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:25:17.750072 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748987 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:25:17.750072 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748993 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 19:25:17.750072 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.748998 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 19:25:17.750072 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749003 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 19:25:17.750072 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749007 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 19:25:17.750072 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749012 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:25:17.750072 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749017 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:25:17.750540 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749022 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 19:25:17.750540 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749027 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 19:25:17.750540 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749031 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 19:25:17.750540 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749037 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 19:25:17.750540 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749042 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:25:17.750540 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749047 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:25:17.750540 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749051 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:25:17.750540 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749056 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:25:17.750540 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749063 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 19:25:17.750540 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749070 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:25:17.750540 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749074 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:25:17.750540 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749078 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 19:25:17.750540 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749083 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 19:25:17.750540 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749087 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 19:25:17.750540 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749092 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:25:17.750540 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749096 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 19:25:17.750540 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749100 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:25:17.750540 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749104 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 19:25:17.750540 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749108 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:25:17.751085 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749112 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:25:17.751085 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749116 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:25:17.751085 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749121 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 19:25:17.751085 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749125 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:25:17.751085 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749129 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:25:17.751085 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749133 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 19:25:17.751085 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749138 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 19:25:17.751085 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749143 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 19:25:17.751085 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749147 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 19:25:17.751085 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749152 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:25:17.751085 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749156 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:25:17.751085 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749160 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 19:25:17.751085 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749165 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 19:25:17.751085 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749169 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 19:25:17.751085 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749174 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:25:17.751085 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749178 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 19:25:17.751085 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749182 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:25:17.751085 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749188 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 19:25:17.751085 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749193 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 19:25:17.751085 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749197 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 19:25:17.751892 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749202 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:25:17.751892 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749207 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 19:25:17.751892 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749211 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 19:25:17.751892 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749215 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 19:25:17.751892 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749220 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 19:25:17.751892 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.749228 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 19:25:17.751892 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749403 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 19:25:17.751892 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749411 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 19:25:17.751892 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749415 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 19:25:17.751892 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749420 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 19:25:17.751892 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749424 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 19:25:17.751892 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749428 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 19:25:17.751892 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749432 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 19:25:17.751892 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749436 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 19:25:17.751892 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749441 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 19:25:17.752528 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749445 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 19:25:17.752528 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749449 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 19:25:17.752528 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749454 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 19:25:17.752528 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749458 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 19:25:17.752528 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749463 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 19:25:17.752528 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749467 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 19:25:17.752528 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749471 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 19:25:17.752528 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749476 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 19:25:17.752528 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749480 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 19:25:17.752528 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749484 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 19:25:17.752528 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749488 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 19:25:17.752528 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749493 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 19:25:17.752528 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749497 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 19:25:17.752528 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749501 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 19:25:17.752528 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749505 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 19:25:17.752528 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749510 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 19:25:17.752528 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749515 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 19:25:17.752528 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749519 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 19:25:17.752528 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749522 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 19:25:17.752528 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749528 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 19:25:17.753160 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749535 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 19:25:17.753160 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749540 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 19:25:17.753160 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749544 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 19:25:17.753160 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749548 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 19:25:17.753160 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749552 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 19:25:17.753160 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749585 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 19:25:17.753160 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749589 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 19:25:17.753160 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749593 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 19:25:17.753160 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749597 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 19:25:17.753160 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749601 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 19:25:17.753160 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749605 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 19:25:17.753160 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749610 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 19:25:17.753160 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749614 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 19:25:17.753160 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749618 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 19:25:17.753160 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749621 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 19:25:17.753160 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749625 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 19:25:17.753160 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749629 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 19:25:17.753160 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749632 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 19:25:17.753160 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749636 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 19:25:17.753160 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749640 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 19:25:17.753797 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749644 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 19:25:17.753797 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749648 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 19:25:17.753797 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749651 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 19:25:17.753797 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749655 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 19:25:17.753797 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749659 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 19:25:17.753797 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749662 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 19:25:17.753797 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749667 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 19:25:17.753797 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749670 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 19:25:17.753797 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749675 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 19:25:17.753797 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749679 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 19:25:17.753797 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749684 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 19:25:17.753797 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749687 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 19:25:17.753797 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749692 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 19:25:17.753797 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749696 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 19:25:17.753797 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749700 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 19:25:17.753797 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749704 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 19:25:17.753797 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749708 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 19:25:17.753797 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749712 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 19:25:17.753797 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749716 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 19:25:17.753797 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749721 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 19:25:17.754412 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749725 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 19:25:17.754412 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749730 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 19:25:17.754412 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749734 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 19:25:17.754412 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749739 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 19:25:17.754412 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749744 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 19:25:17.754412 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749747 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 19:25:17.754412 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749752 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 19:25:17.754412 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749756 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 19:25:17.754412 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749760 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 19:25:17.754412 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749764 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 19:25:17.754412 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749769 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 19:25:17.754412 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749775 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 19:25:17.754412 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749781 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 19:25:17.754412 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749785 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 19:25:17.754412 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749790 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 19:25:17.754412 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749795 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 19:25:17.754412 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:17.749800 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 19:25:17.754853 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.749808 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 19:25:17.754853 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.750585 2570 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 20 19:25:17.754853 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.753976 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 20 19:25:17.755121 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.755107 2570 server.go:1019] "Starting client certificate rotation"
Apr 20 19:25:17.755224 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.755206 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 19:25:17.755260 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.755246 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 20 19:25:17.786492 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.786460 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 19:25:17.789446 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.789416 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 20 19:25:17.808032 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.807854 2570 log.go:25] "Validated CRI v1 runtime API"
Apr 20 19:25:17.814742 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.814722 2570 log.go:25] "Validated CRI v1 image API"
Apr 20 19:25:17.816007 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.815986 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 20 19:25:17.819436 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.819415 2570 fs.go:135] Filesystem UUIDs: map[3745da6e-be71-47ed-b5d7-91009974a7f2:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 a426382f-bdf1-4e20-ba72-b7124c1df6ff:/dev/nvme0n1p3]
Apr 20 19:25:17.819502 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.819435 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 20 19:25:17.820881 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.820862 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 19:25:17.825478 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.825366 2570 manager.go:217] Machine: {Timestamp:2026-04-20 19:25:17.823219173 +0000 UTC m=+0.515228103 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100294 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec27e8ce7c34286247fc6a970065706b SystemUUID:ec27e8ce-7c34-2862-47fc-6a970065706b BootID:fa725f84-eeab-49ca-bb30-0c9b5b35be66 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:7b:c9:03:6b:3d Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:7b:c9:03:6b:3d Speed:0 Mtu:9001} {Name:ovs-system MacAddress:0e:b6:69:a5:da:77 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 20 19:25:17.825478 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.825471 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 20 19:25:17.825627 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.825554 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 19:25:17.829407 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.829380 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 19:25:17.829553 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.829410 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-159.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 19:25:17.829619 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.829576 2570 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 19:25:17.829619 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.829586 2570 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 19:25:17.829619 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.829599 2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 19:25:17.830601 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.830590 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 19:25:17.832231 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.832221 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 19:25:17.832345 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.832337 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 19:25:17.837113 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.837101 2570 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 19:25:17.837170 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.837117 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 19:25:17.837170 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.837132 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 19:25:17.837170 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.837142 2570 kubelet.go:397] "Adding apiserver pod source"
Apr 20 19:25:17.837170 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.837153 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 19:25:17.839325 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.839312 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 19:25:17.839371 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.839330 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 19:25:17.843225 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.843210 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 19:25:17.845019 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.844988 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 20 19:25:17.847100 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.847088 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 20 19:25:17.847147 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.847106 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 20 19:25:17.847147 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.847112 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 20 19:25:17.847147 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.847118 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 20 19:25:17.847147 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.847124 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 20 19:25:17.847147 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.847129 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 20 19:25:17.847147 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.847135 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 20 19:25:17.847147 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.847141 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 20 19:25:17.847147 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.847147 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 20 19:25:17.847352 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.847153 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 20 19:25:17.847352 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.847167 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 20 19:25:17.847352 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.847176 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 20 19:25:17.848269 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.848259 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 20 19:25:17.848303 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.848270 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 20 19:25:17.850303 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:17.850271 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-159.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 20 19:25:17.850409 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:17.850344 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 20 19:25:17.852074 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.852061 2570 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 20 19:25:17.852141 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.852098 2570 server.go:1295] "Started kubelet"
Apr 20 19:25:17.852243 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.852201 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 20 19:25:17.852344 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.852290 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 20 19:25:17.852427 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.852366 2570 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 20 19:25:17.853009 ip-10-0-132-159 systemd[1]: Started Kubernetes Kubelet.
Apr 20 19:25:17.853639 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.853443 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 20 19:25:17.855422 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.855407 2570 server.go:317] "Adding debug handlers to kubelet server" Apr 20 19:25:17.858681 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.858665 2570 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-132-159.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 20 19:25:17.860381 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.860358 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 20 19:25:17.860496 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:17.858599 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-159.ec2.internal.18a8272ad1598003 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-159.ec2.internal,UID:ip-10-0-132-159.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-132-159.ec2.internal,},FirstTimestamp:2026-04-20 19:25:17.852073987 +0000 UTC m=+0.544082917,LastTimestamp:2026-04-20 19:25:17.852073987 +0000 UTC m=+0.544082917,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-159.ec2.internal,}" Apr 20 19:25:17.861985 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.861959 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 20 19:25:17.865129 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:17.863550 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-159.ec2.internal\" not found" Apr 20 19:25:17.865129 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.863726 2570 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 20 19:25:17.865129 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.863743 2570 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 20 19:25:17.865129 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.863856 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 20 19:25:17.865129 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.863947 2570 reconstruct.go:97] "Volume reconstruction finished" Apr 20 19:25:17.865129 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.863963 2570 reconciler.go:26] "Reconciler: start to sync state" Apr 20 19:25:17.865129 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.864112 2570 factory.go:55] Registering systemd factory Apr 20 19:25:17.865129 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.864136 2570 factory.go:223] Registration of the systemd container factory successfully Apr 20 19:25:17.865129 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.864375 2570 factory.go:153] Registering CRI-O factory Apr 20 19:25:17.865129 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.864390 2570 factory.go:223] Registration of the crio container factory 
Apr 20 19:25:17.865129 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.864447 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 19:25:17.865129 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.864466 2570 factory.go:103] Registering Raw factory
Apr 20 19:25:17.865129 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.864482 2570 manager.go:1196] Started watching for new ooms in manager
Apr 20 19:25:17.865129 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.865024 2570 manager.go:319] Starting recovery of all containers
Apr 20 19:25:17.867146 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.867124 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ftkcb"
Apr 20 19:25:17.869514 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:17.869475 2570 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 19:25:17.869923 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:17.869877 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 20 19:25:17.870011 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:17.869995 2570 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-132-159.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 20 19:25:17.874203 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.874037 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-ftkcb"
Apr 20 19:25:17.875462 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.875443 2570 manager.go:324] Recovery completed
Apr 20 19:25:17.880085 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.880063 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 19:25:17.882785 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.882769 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-159.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 19:25:17.882876 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.882803 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-159.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 19:25:17.882876 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.882816 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-159.ec2.internal" event="NodeHasSufficientPID"
Apr 20 19:25:17.883297 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.883280 2570 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 19:25:17.883297 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.883297 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 19:25:17.883417 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.883315 2570 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 19:25:17.884638 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:17.884555 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-132-159.ec2.internal.18a8272ad32e1ad1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-132-159.ec2.internal,UID:ip-10-0-132-159.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-132-159.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-132-159.ec2.internal,},FirstTimestamp:2026-04-20 19:25:17.882784465 +0000 UTC m=+0.574793395,LastTimestamp:2026-04-20 19:25:17.882784465 +0000 UTC m=+0.574793395,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-132-159.ec2.internal,}"
Apr 20 19:25:17.885548 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.885533 2570 policy_none.go:49] "None policy: Start"
Apr 20 19:25:17.885644 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.885553 2570 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 19:25:17.885644 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.885583 2570 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 19:25:17.911237 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.911220 2570 manager.go:341] "Starting Device Plugin manager"
Apr 20 19:25:17.926414 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:17.911264 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 19:25:17.926414 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.911280 2570 server.go:85] "Starting device plugin registration server"
Apr 20 19:25:17.926414 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.911551 2570 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 19:25:17.926414 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.911579 2570 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 19:25:17.926414 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.911676 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 19:25:17.926414 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.911750 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 19:25:17.926414 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.911758 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 19:25:17.926414 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:17.912392 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 19:25:17.926414 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:17.912442 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-159.ec2.internal\" not found"
Apr 20 19:25:17.956625 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.956586 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 19:25:17.957800 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.957784 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 19:25:17.957874 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.957812 2570 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 19:25:17.957874 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.957831 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 19:25:17.957874 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.957837 2570 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 19:25:17.957874 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:17.957871 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 19:25:17.961130 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:17.961110 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 19:25:18.013191 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.013110 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 19:25:18.014251 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.014235 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-159.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 19:25:18.014354 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.014263 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-159.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 19:25:18.014354 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.014273 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-159.ec2.internal" event="NodeHasSufficientPID"
Apr 20 19:25:18.014354 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.014298 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-159.ec2.internal"
Apr 20 19:25:18.022799 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.022785 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-159.ec2.internal"
Apr 20 19:25:18.022844 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:18.022806 2570 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-159.ec2.internal\": node \"ip-10-0-132-159.ec2.internal\" not found"
Apr 20 19:25:18.039598 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:18.039573 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-159.ec2.internal\" not found"
Apr 20 19:25:18.058284 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.058261 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-159.ec2.internal"]
Apr 20 19:25:18.058363 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.058344 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 19:25:18.059150 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.059133 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-159.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 19:25:18.059227 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.059164 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-159.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 19:25:18.059227 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.059176 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-159.ec2.internal" event="NodeHasSufficientPID"
Apr 20 19:25:18.060320 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.060307 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 19:25:18.060474 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.060458 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal"
Apr 20 19:25:18.060533 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.060496 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 19:25:18.061024 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.061007 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-159.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 19:25:18.061024 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.061019 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-159.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 19:25:18.061130 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.061035 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-159.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 19:25:18.061130 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.061050 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-159.ec2.internal" event="NodeHasSufficientPID"
Apr 20 19:25:18.061190 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.061038 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-159.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 19:25:18.061190 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.061184 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-159.ec2.internal" event="NodeHasSufficientPID"
Apr 20 19:25:18.062651 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.062632 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-159.ec2.internal"
Apr 20 19:25:18.062735 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.062659 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 19:25:18.063302 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.063288 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-159.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 19:25:18.063368 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.063313 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-159.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 19:25:18.063368 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.063325 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-159.ec2.internal" event="NodeHasSufficientPID"
Apr 20 19:25:18.066031 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.065615 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/906f6cca8711ebeed3b778a79317b11c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal\" (UID: \"906f6cca8711ebeed3b778a79317b11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal"
Apr 20 19:25:18.066031 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.065656 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/906f6cca8711ebeed3b778a79317b11c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal\" (UID: \"906f6cca8711ebeed3b778a79317b11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal"
Apr 20 19:25:18.066031 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.065682 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/baec555e5e2b442b2cad3d99698ce3db-config\") pod \"kube-apiserver-proxy-ip-10-0-132-159.ec2.internal\" (UID: \"baec555e5e2b442b2cad3d99698ce3db\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-159.ec2.internal"
Apr 20 19:25:18.089307 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:18.089281 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-159.ec2.internal\" not found" node="ip-10-0-132-159.ec2.internal"
Apr 20 19:25:18.093898 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:18.093880 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-159.ec2.internal\" not found" node="ip-10-0-132-159.ec2.internal"
Apr 20 19:25:18.139968 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:18.139935 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-159.ec2.internal\" not found"
Apr 20 19:25:18.165862 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.165833 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/baec555e5e2b442b2cad3d99698ce3db-config\") pod \"kube-apiserver-proxy-ip-10-0-132-159.ec2.internal\" (UID: \"baec555e5e2b442b2cad3d99698ce3db\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-159.ec2.internal"
Apr 20 19:25:18.165862 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.165862 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/906f6cca8711ebeed3b778a79317b11c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal\" (UID: \"906f6cca8711ebeed3b778a79317b11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal"
Apr 20 19:25:18.166073 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.165878 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/906f6cca8711ebeed3b778a79317b11c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal\" (UID: \"906f6cca8711ebeed3b778a79317b11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal"
Apr 20 19:25:18.166073 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.165927 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/906f6cca8711ebeed3b778a79317b11c-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal\" (UID: \"906f6cca8711ebeed3b778a79317b11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal"
Apr 20 19:25:18.166073 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.165935 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/baec555e5e2b442b2cad3d99698ce3db-config\") pod \"kube-apiserver-proxy-ip-10-0-132-159.ec2.internal\" (UID: \"baec555e5e2b442b2cad3d99698ce3db\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-159.ec2.internal"
Apr 20 19:25:18.166073 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.165961 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/906f6cca8711ebeed3b778a79317b11c-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal\" (UID: \"906f6cca8711ebeed3b778a79317b11c\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal"
Apr 20 19:25:18.241023 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:18.240970 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-159.ec2.internal\" not found"
Apr 20 19:25:18.341665 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:18.341593 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-159.ec2.internal\" not found"
Apr 20 19:25:18.391174 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.391152 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal"
Apr 20 19:25:18.396828 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.396811 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-159.ec2.internal"
Apr 20 19:25:18.442369 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:18.442332 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-159.ec2.internal\" not found"
Apr 20 19:25:18.542932 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:18.542900 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-159.ec2.internal\" not found"
Apr 20 19:25:18.643507 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:18.643416 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-159.ec2.internal\" not found"
Apr 20 19:25:18.744097 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:18.744068 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-159.ec2.internal\" not found"
Apr 20 19:25:18.755248 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.755223 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 19:25:18.755403 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.755386 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 19:25:18.790883 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.790857 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 19:25:18.844490 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:18.844447 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-159.ec2.internal\" not found"
Apr 20 19:25:18.860735 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.860711 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 19:25:18.873551 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.873527 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 19:25:18.876727 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.876674 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 19:20:17 +0000 UTC" deadline="2027-09-18 12:34:26.828950667 +0000 UTC"
Apr 20 19:25:18.876727 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.876717 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12377h9m7.952237819s"
Apr 20 19:25:18.897846 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.897785 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-qn7xr"
Apr 20 19:25:18.906164 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.906141 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-qn7xr"
Apr 20 19:25:18.944677 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:18.944648 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-159.ec2.internal\" not found"
Apr 20 19:25:18.960392 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:18.960361 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaec555e5e2b442b2cad3d99698ce3db.slice/crio-28e8067cc1890a724431460e54fcfd907d1b0bf61c53d2e3119c96f382c406df WatchSource:0}: Error finding container 28e8067cc1890a724431460e54fcfd907d1b0bf61c53d2e3119c96f382c406df: Status 404 returned error can't find the container with id 28e8067cc1890a724431460e54fcfd907d1b0bf61c53d2e3119c96f382c406df
Apr 20 19:25:18.960861 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:18.960840 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod906f6cca8711ebeed3b778a79317b11c.slice/crio-6b2235e9c268e297b963b1019a0c46f754d3373d8affee6172a31160a199a499 WatchSource:0}: Error finding container 6b2235e9c268e297b963b1019a0c46f754d3373d8affee6172a31160a199a499: Status 404 returned error can't find the container with id 6b2235e9c268e297b963b1019a0c46f754d3373d8affee6172a31160a199a499
Apr 20 19:25:18.964588 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:18.964554 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 19:25:19.044827 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:19.044779 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-159.ec2.internal\" not found"
Apr 20 19:25:19.145350 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:19.145312 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-159.ec2.internal\" not found"
Apr 20 19:25:19.236875 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.236793 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 19:25:19.245206 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.245187 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 19:25:19.263195 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.263169 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-159.ec2.internal"
Apr 20 19:25:19.276454 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.276432 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 19:25:19.277729 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.277718 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal"
Apr 20 19:25:19.288542 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.288525 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 19:25:19.838454 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.838422 2570 apiserver.go:52] "Watching apiserver"
Apr 20 19:25:19.846512 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.846479 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
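
The warnings.go:110 lines fire because the mirror pods are named after the node, e.g. kube-apiserver-proxy-ip-10-0-132-159.ec2.internal: a pod name only has to be a DNS-1123 subdomain, which permits dots, but the hostname derived from it must be a DNS-1123 label, which does not. The apimachinery validators show the distinction; a small sketch:

    package main

    import (
        "fmt"

        "k8s.io/apimachinery/pkg/util/validation"
    )

    func main() {
        name := "kube-apiserver-proxy-ip-10-0-132-159.ec2.internal"
        // Valid as an object name: subdomain rules allow dots.
        fmt.Println("subdomain errors:", validation.IsDNS1123Subdomain(name))
        // Invalid as a hostname: label rules forbid dots (and cap length
        // at 63), which is exactly what the kubelet warns about above.
        fmt.Println("label errors:", validation.IsDNS1123Label(name))
    }
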
pods=["openshift-multus/network-metrics-daemon-cvbdc","openshift-network-diagnostics/network-check-target-bnw57","openshift-network-operator/iptables-alerter-v9jk7","kube-system/konnectivity-agent-s2wnw","kube-system/kube-apiserver-proxy-ip-10-0-132-159.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw","openshift-cluster-node-tuning-operator/tuned-tnmw5","openshift-ovn-kubernetes/ovnkube-node-9rxk8","openshift-image-registry/node-ca-d2857","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal","openshift-multus/multus-additional-cni-plugins-p5rrh","openshift-multus/multus-jfcc7"] Apr 20 19:25:19.850608 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.850583 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnw57" Apr 20 19:25:19.850717 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:19.850688 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnw57" podUID="045e6638-b662-4970-88a9-8a02de6ca547" Apr 20 19:25:19.852091 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.852067 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cvbdc" Apr 20 19:25:19.852213 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:19.852146 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cvbdc" podUID="220904c2-fd82-44e5-9904-33aeca86dcee" Apr 20 19:25:19.853223 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.853207 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-v9jk7" Apr 20 19:25:19.853705 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.853395 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-s2wnw" Apr 20 19:25:19.854504 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.854483 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw" Apr 20 19:25:19.856001 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.855673 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.856001 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.855684 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-lr9cl\"" Apr 20 19:25:19.857582 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.857546 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.860227 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.859365 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 19:25:19.860227 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.859407 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 20 19:25:19.860227 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.859407 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 19:25:19.860227 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.859669 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:25:19.860227 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.859704 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 19:25:19.860227 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.859710 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:25:19.860227 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.859810 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 19:25:19.860227 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.859964 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 20 19:25:19.860227 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.860047 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-ds5nn\"" Apr 20 19:25:19.860227 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.860133 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 19:25:19.860761 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.860290 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 19:25:19.861250 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.861229 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wsd5d\"" Apr 20 19:25:19.861348 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.861296 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 19:25:19.862001 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.861979 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-66qhr\"" Apr 20 19:25:19.862636 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.862124 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 19:25:19.862636 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.862259 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-d2857" Apr 20 19:25:19.863181 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.863164 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-xnv4f\"" Apr 20 19:25:19.863550 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.863423 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p5rrh" Apr 20 19:25:19.863550 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.863452 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.864898 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.864881 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 20 19:25:19.874240 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.874219 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 20 19:25:19.874641 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.874622 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 20 19:25:19.874735 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.874622 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 19:25:19.874735 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.874713 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 20 19:25:19.874843 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.874640 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 20 19:25:19.875197 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875176 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qp9k\" (UniqueName: \"kubernetes.io/projected/220904c2-fd82-44e5-9904-33aeca86dcee-kube-api-access-5qp9k\") pod \"network-metrics-daemon-cvbdc\" (UID: \"220904c2-fd82-44e5-9904-33aeca86dcee\") " pod="openshift-multus/network-metrics-daemon-cvbdc" Apr 20 19:25:19.875294 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875205 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-host-kubelet\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.875294 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875229 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-host-slash\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.875294 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875252 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-host-cni-netd\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.875452 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875320 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmhkk\" (UniqueName: \"kubernetes.io/projected/a587ed5a-1c69-439a-a673-9e3e5479ec27-kube-api-access-bmhkk\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.875452 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875359 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5a9f2bb3-2690-4fd5-a45a-562b7167cd70-iptables-alerter-script\") pod \"iptables-alerter-v9jk7\" (UID: \"5a9f2bb3-2690-4fd5-a45a-562b7167cd70\") " pod="openshift-network-operator/iptables-alerter-v9jk7" Apr 20 19:25:19.875452 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875379 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 19:25:19.875452 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875389 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zq2bx\" (UniqueName: \"kubernetes.io/projected/5a9f2bb3-2690-4fd5-a45a-562b7167cd70-kube-api-access-zq2bx\") pod \"iptables-alerter-v9jk7\" (UID: \"5a9f2bb3-2690-4fd5-a45a-562b7167cd70\") " pod="openshift-network-operator/iptables-alerter-v9jk7" Apr 20 19:25:19.875452 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875380 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 19:25:19.875452 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875414 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-run-openvswitch\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.875452 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875428 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-btbzq\"" Apr 20 19:25:19.875452 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875438 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 19:25:19.875830 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875439 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-host-var-lib-cni-multus\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.875830 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875538 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7b91853c-4ad1-4d29-b26c-3742343d8630-agent-certs\") pod \"konnectivity-agent-s2wnw\" (UID: 
\"7b91853c-4ad1-4d29-b26c-3742343d8630\") " pod="kube-system/konnectivity-agent-s2wnw" Apr 20 19:25:19.875830 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875581 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 19:25:19.875830 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875583 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-lib-modules\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.875830 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875628 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tdn8l\" (UniqueName: \"kubernetes.io/projected/fac78af9-e092-4651-832b-71685148f129-kube-api-access-tdn8l\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.875830 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875652 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-log-socket\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.875830 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875678 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 19:25:19.875830 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875687 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-host-cni-bin\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.875830 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875709 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0481b01c-d63f-4503-aacf-fcdb030d79e9-env-overrides\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.875830 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875733 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/52436de9-f8c9-44ae-b773-c086491e71ad-socket-dir\") pod \"aws-ebs-csi-driver-node-6vmcw\" (UID: \"52436de9-f8c9-44ae-b773-c086491e71ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw" Apr 20 19:25:19.875830 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875757 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/52436de9-f8c9-44ae-b773-c086491e71ad-registration-dir\") pod \"aws-ebs-csi-driver-node-6vmcw\" (UID: \"52436de9-f8c9-44ae-b773-c086491e71ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw" Apr 20 19:25:19.875830 ip-10-0-132-159 
kubenswrapper[2570]: I0420 19:25:19.875780 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/414b2f51-6741-47fc-9528-f08b5a635fba-cni-binary-copy\") pod \"multus-additional-cni-plugins-p5rrh\" (UID: \"414b2f51-6741-47fc-9528-f08b5a635fba\") " pod="openshift-multus/multus-additional-cni-plugins-p5rrh" Apr 20 19:25:19.875830 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875790 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-8qcr9\"" Apr 20 19:25:19.875830 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875803 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-system-cni-dir\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.875830 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875826 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-etc-openvswitch\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.876414 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875850 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/52436de9-f8c9-44ae-b773-c086491e71ad-sys-fs\") pod \"aws-ebs-csi-driver-node-6vmcw\" (UID: \"52436de9-f8c9-44ae-b773-c086491e71ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw" Apr 20 19:25:19.876414 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875901 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-multus-socket-dir-parent\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.876414 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875924 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 20 19:25:19.876414 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875932 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0481b01c-d63f-4503-aacf-fcdb030d79e9-ovnkube-script-lib\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.876414 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875960 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7q7t2\" (UniqueName: \"kubernetes.io/projected/52436de9-f8c9-44ae-b773-c086491e71ad-kube-api-access-7q7t2\") pod \"aws-ebs-csi-driver-node-6vmcw\" (UID: \"52436de9-f8c9-44ae-b773-c086491e71ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw" Apr 20 19:25:19.876414 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.875986 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/414b2f51-6741-47fc-9528-f08b5a635fba-system-cni-dir\") pod \"multus-additional-cni-plugins-p5rrh\" (UID: \"414b2f51-6741-47fc-9528-f08b5a635fba\") " pod="openshift-multus/multus-additional-cni-plugins-p5rrh" Apr 20 19:25:19.876414 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.876009 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-hostroot\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.876414 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.876033 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5a9f2bb3-2690-4fd5-a45a-562b7167cd70-host-slash\") pod \"iptables-alerter-v9jk7\" (UID: \"5a9f2bb3-2690-4fd5-a45a-562b7167cd70\") " pod="openshift-network-operator/iptables-alerter-v9jk7" Apr 20 19:25:19.876414 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.876060 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-systemd-units\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.876414 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.876084 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-run-systemd\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.876414 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.876107 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d1de0681-7e7f-40b8-aef2-7bbbcf5c25e7-serviceca\") pod \"node-ca-d2857\" (UID: \"d1de0681-7e7f-40b8-aef2-7bbbcf5c25e7\") " pod="openshift-image-registry/node-ca-d2857" Apr 20 19:25:19.876414 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.876131 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/52436de9-f8c9-44ae-b773-c086491e71ad-device-dir\") pod \"aws-ebs-csi-driver-node-6vmcw\" (UID: \"52436de9-f8c9-44ae-b773-c086491e71ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw" Apr 20 19:25:19.876414 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.876153 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/414b2f51-6741-47fc-9528-f08b5a635fba-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p5rrh\" (UID: \"414b2f51-6741-47fc-9528-f08b5a635fba\") " pod="openshift-multus/multus-additional-cni-plugins-p5rrh" Apr 20 19:25:19.876414 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.876179 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-multus-conf-dir\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.876414 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.876200 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a587ed5a-1c69-439a-a673-9e3e5479ec27-cni-binary-copy\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.876414 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.876221 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-run\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.876414 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.876243 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-var-lib-openvswitch\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.877213 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.876260 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-etc-modprobe-d\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.877213 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.876277 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-etc-sysconfig\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.877213 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.876295 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-etc-kubernetes\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.877213 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.876315 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-sys\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.877213 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.876334 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-host-run-netns\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.877213 ip-10-0-132-159 
kubenswrapper[2570]: I0420 19:25:19.876358 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a587ed5a-1c69-439a-a673-9e3e5479ec27-multus-daemon-config\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.877213 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.876657 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-host-run-multus-certs\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.877213 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.876698 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-run-ovn\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.877213 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.876724 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdk9x\" (UniqueName: \"kubernetes.io/projected/d1de0681-7e7f-40b8-aef2-7bbbcf5c25e7-kube-api-access-jdk9x\") pod \"node-ca-d2857\" (UID: \"d1de0681-7e7f-40b8-aef2-7bbbcf5c25e7\") " pod="openshift-image-registry/node-ca-d2857" Apr 20 19:25:19.877213 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.876762 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-host-run-netns\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.877213 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.876785 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-host-run-k8s-cni-cncf-io\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.877213 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.876787 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-xrthd\"" Apr 20 19:25:19.877213 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.876837 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 19:25:19.877213 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.876843 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 20 19:25:19.877213 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877041 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-etc-sysctl-d\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.877213 
ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877083 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-etc-sysctl-conf\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.877213 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877115 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fac78af9-e092-4651-832b-71685148f129-etc-tuned\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.877213 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877151 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.878065 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877238 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0481b01c-d63f-4503-aacf-fcdb030d79e9-ovnkube-config\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.878065 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877271 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcsk2\" (UniqueName: \"kubernetes.io/projected/0481b01c-d63f-4503-aacf-fcdb030d79e9-kube-api-access-lcsk2\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.878065 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877300 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-os-release\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.878065 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877325 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx8f6\" (UniqueName: \"kubernetes.io/projected/045e6638-b662-4970-88a9-8a02de6ca547-kube-api-access-cx8f6\") pod \"network-check-target-bnw57\" (UID: \"045e6638-b662-4970-88a9-8a02de6ca547\") " pod="openshift-network-diagnostics/network-check-target-bnw57" Apr 20 19:25:19.878065 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877403 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7b91853c-4ad1-4d29-b26c-3742343d8630-konnectivity-ca\") pod \"konnectivity-agent-s2wnw\" (UID: \"7b91853c-4ad1-4d29-b26c-3742343d8630\") " pod="kube-system/konnectivity-agent-s2wnw" Apr 20 19:25:19.878065 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877433 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-etc-systemd\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.878065 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877458 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-node-log\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.878065 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877489 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d1de0681-7e7f-40b8-aef2-7bbbcf5c25e7-host\") pod \"node-ca-d2857\" (UID: \"d1de0681-7e7f-40b8-aef2-7bbbcf5c25e7\") " pod="openshift-image-registry/node-ca-d2857" Apr 20 19:25:19.878065 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877510 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52436de9-f8c9-44ae-b773-c086491e71ad-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6vmcw\" (UID: \"52436de9-f8c9-44ae-b773-c086491e71ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw" Apr 20 19:25:19.878065 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877538 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-multus-cni-dir\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.878065 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877552 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-var-lib-kubelet\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.878065 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877585 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/414b2f51-6741-47fc-9528-f08b5a635fba-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p5rrh\" (UID: \"414b2f51-6741-47fc-9528-f08b5a635fba\") " pod="openshift-multus/multus-additional-cni-plugins-p5rrh" Apr 20 19:25:19.878065 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877600 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7wmb\" (UniqueName: \"kubernetes.io/projected/414b2f51-6741-47fc-9528-f08b5a635fba-kube-api-access-w7wmb\") pod \"multus-additional-cni-plugins-p5rrh\" (UID: \"414b2f51-6741-47fc-9528-f08b5a635fba\") " pod="openshift-multus/multus-additional-cni-plugins-p5rrh" Apr 20 19:25:19.878065 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877624 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-host-var-lib-cni-bin\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.878065 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877647 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-etc-kubernetes\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.878065 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877670 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs\") pod \"network-metrics-daemon-cvbdc\" (UID: \"220904c2-fd82-44e5-9904-33aeca86dcee\") " pod="openshift-multus/network-metrics-daemon-cvbdc" Apr 20 19:25:19.878622 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877718 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/52436de9-f8c9-44ae-b773-c086491e71ad-etc-selinux\") pod \"aws-ebs-csi-driver-node-6vmcw\" (UID: \"52436de9-f8c9-44ae-b773-c086491e71ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw" Apr 20 19:25:19.878622 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877751 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/414b2f51-6741-47fc-9528-f08b5a635fba-cnibin\") pod \"multus-additional-cni-plugins-p5rrh\" (UID: \"414b2f51-6741-47fc-9528-f08b5a635fba\") " pod="openshift-multus/multus-additional-cni-plugins-p5rrh" Apr 20 19:25:19.878622 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877776 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/414b2f51-6741-47fc-9528-f08b5a635fba-os-release\") pod \"multus-additional-cni-plugins-p5rrh\" (UID: \"414b2f51-6741-47fc-9528-f08b5a635fba\") " pod="openshift-multus/multus-additional-cni-plugins-p5rrh" Apr 20 19:25:19.878622 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877813 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-cnibin\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.878622 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877850 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-host-var-lib-kubelet\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.878622 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877875 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/414b2f51-6741-47fc-9528-f08b5a635fba-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p5rrh\" (UID: \"414b2f51-6741-47fc-9528-f08b5a635fba\") " 
pod="openshift-multus/multus-additional-cni-plugins-p5rrh" Apr 20 19:25:19.878622 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877900 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-host\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.878622 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877934 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fac78af9-e092-4651-832b-71685148f129-tmp\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.878622 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.877959 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-host-run-ovn-kubernetes\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.878622 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.878003 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0481b01c-d63f-4503-aacf-fcdb030d79e9-ovn-node-metrics-cert\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.906948 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.906914 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 19:20:18 +0000 UTC" deadline="2027-09-19 06:24:11.555184175 +0000 UTC" Apr 20 19:25:19.906948 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.906946 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12394h58m51.648241703s" Apr 20 19:25:19.962275 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.962221 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal" event={"ID":"906f6cca8711ebeed3b778a79317b11c","Type":"ContainerStarted","Data":"6b2235e9c268e297b963b1019a0c46f754d3373d8affee6172a31160a199a499"} Apr 20 19:25:19.963316 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.963289 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-159.ec2.internal" event={"ID":"baec555e5e2b442b2cad3d99698ce3db","Type":"ContainerStarted","Data":"28e8067cc1890a724431460e54fcfd907d1b0bf61c53d2e3119c96f382c406df"} Apr 20 19:25:19.978407 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.978379 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fac78af9-e092-4651-832b-71685148f129-tmp\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.978526 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.978412 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-host-run-ovn-kubernetes\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.978526 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.978429 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0481b01c-d63f-4503-aacf-fcdb030d79e9-ovn-node-metrics-cert\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.978526 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.978445 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5qp9k\" (UniqueName: \"kubernetes.io/projected/220904c2-fd82-44e5-9904-33aeca86dcee-kube-api-access-5qp9k\") pod \"network-metrics-daemon-cvbdc\" (UID: \"220904c2-fd82-44e5-9904-33aeca86dcee\") " pod="openshift-multus/network-metrics-daemon-cvbdc" Apr 20 19:25:19.978526 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.978467 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-host-kubelet\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.978526 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.978508 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-host-run-ovn-kubernetes\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.978798 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.978490 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-host-slash\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.978798 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.978585 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-host-cni-netd\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.978798 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.978597 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-host-kubelet\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.978798 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.978617 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-host-slash\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.978798 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.978609 2570 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bmhkk\" (UniqueName: \"kubernetes.io/projected/a587ed5a-1c69-439a-a673-9e3e5479ec27-kube-api-access-bmhkk\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.978798 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.978655 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-host-cni-netd\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.978798 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.978663 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5a9f2bb3-2690-4fd5-a45a-562b7167cd70-iptables-alerter-script\") pod \"iptables-alerter-v9jk7\" (UID: \"5a9f2bb3-2690-4fd5-a45a-562b7167cd70\") " pod="openshift-network-operator/iptables-alerter-v9jk7" Apr 20 19:25:19.978798 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.978690 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zq2bx\" (UniqueName: \"kubernetes.io/projected/5a9f2bb3-2690-4fd5-a45a-562b7167cd70-kube-api-access-zq2bx\") pod \"iptables-alerter-v9jk7\" (UID: \"5a9f2bb3-2690-4fd5-a45a-562b7167cd70\") " pod="openshift-network-operator/iptables-alerter-v9jk7" Apr 20 19:25:19.978798 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.978713 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-run-openvswitch\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.978798 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.978736 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-host-var-lib-cni-multus\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.978798 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.978762 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7b91853c-4ad1-4d29-b26c-3742343d8630-agent-certs\") pod \"konnectivity-agent-s2wnw\" (UID: \"7b91853c-4ad1-4d29-b26c-3742343d8630\") " pod="kube-system/konnectivity-agent-s2wnw" Apr 20 19:25:19.978798 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.978787 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-lib-modules\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.979322 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.978837 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 20 19:25:19.979322 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.978865 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-host-var-lib-cni-multus\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.979322 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.978895 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-run-openvswitch\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.979322 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.978909 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tdn8l\" (UniqueName: \"kubernetes.io/projected/fac78af9-e092-4651-832b-71685148f129-kube-api-access-tdn8l\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.979322 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.978930 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-lib-modules\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.979322 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.978936 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-log-socket\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.979322 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.978958 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-host-cni-bin\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.979322 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.978972 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0481b01c-d63f-4503-aacf-fcdb030d79e9-env-overrides\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.979322 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.978993 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/52436de9-f8c9-44ae-b773-c086491e71ad-socket-dir\") pod \"aws-ebs-csi-driver-node-6vmcw\" (UID: \"52436de9-f8c9-44ae-b773-c086491e71ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw" Apr 20 19:25:19.979322 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979014 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/52436de9-f8c9-44ae-b773-c086491e71ad-registration-dir\") pod \"aws-ebs-csi-driver-node-6vmcw\" (UID: \"52436de9-f8c9-44ae-b773-c086491e71ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw" Apr 20 19:25:19.979322 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979029 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/414b2f51-6741-47fc-9528-f08b5a635fba-cni-binary-copy\") pod \"multus-additional-cni-plugins-p5rrh\" (UID: \"414b2f51-6741-47fc-9528-f08b5a635fba\") " pod="openshift-multus/multus-additional-cni-plugins-p5rrh" Apr 20 19:25:19.979322 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979045 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-system-cni-dir\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.979322 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979060 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-etc-openvswitch\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.979322 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979080 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/52436de9-f8c9-44ae-b773-c086491e71ad-sys-fs\") pod \"aws-ebs-csi-driver-node-6vmcw\" (UID: \"52436de9-f8c9-44ae-b773-c086491e71ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw" Apr 20 19:25:19.979322 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979100 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-multus-socket-dir-parent\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.979322 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979121 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0481b01c-d63f-4503-aacf-fcdb030d79e9-ovnkube-script-lib\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.979322 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979147 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7q7t2\" (UniqueName: \"kubernetes.io/projected/52436de9-f8c9-44ae-b773-c086491e71ad-kube-api-access-7q7t2\") pod \"aws-ebs-csi-driver-node-6vmcw\" (UID: \"52436de9-f8c9-44ae-b773-c086491e71ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw" Apr 20 19:25:19.980028 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979165 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/52436de9-f8c9-44ae-b773-c086491e71ad-registration-dir\") pod \"aws-ebs-csi-driver-node-6vmcw\" (UID: \"52436de9-f8c9-44ae-b773-c086491e71ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw" 
Apr 20 19:25:19.980028 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979183 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/414b2f51-6741-47fc-9528-f08b5a635fba-system-cni-dir\") pod \"multus-additional-cni-plugins-p5rrh\" (UID: \"414b2f51-6741-47fc-9528-f08b5a635fba\") " pod="openshift-multus/multus-additional-cni-plugins-p5rrh" Apr 20 19:25:19.980028 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979181 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-etc-openvswitch\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.980028 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979212 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-log-socket\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.980028 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979246 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/5a9f2bb3-2690-4fd5-a45a-562b7167cd70-iptables-alerter-script\") pod \"iptables-alerter-v9jk7\" (UID: \"5a9f2bb3-2690-4fd5-a45a-562b7167cd70\") " pod="openshift-network-operator/iptables-alerter-v9jk7" Apr 20 19:25:19.980028 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979265 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-hostroot\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.980028 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979299 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5a9f2bb3-2690-4fd5-a45a-562b7167cd70-host-slash\") pod \"iptables-alerter-v9jk7\" (UID: \"5a9f2bb3-2690-4fd5-a45a-562b7167cd70\") " pod="openshift-network-operator/iptables-alerter-v9jk7" Apr 20 19:25:19.980028 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979315 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/52436de9-f8c9-44ae-b773-c086491e71ad-sys-fs\") pod \"aws-ebs-csi-driver-node-6vmcw\" (UID: \"52436de9-f8c9-44ae-b773-c086491e71ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw" Apr 20 19:25:19.980028 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979326 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-systemd-units\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.980028 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979349 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/414b2f51-6741-47fc-9528-f08b5a635fba-system-cni-dir\") pod \"multus-additional-cni-plugins-p5rrh\" (UID: 
\"414b2f51-6741-47fc-9528-f08b5a635fba\") " pod="openshift-multus/multus-additional-cni-plugins-p5rrh" Apr 20 19:25:19.980028 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979354 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-hostroot\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.980028 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979351 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-run-systemd\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.980028 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979381 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-host-cni-bin\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.980028 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979390 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5a9f2bb3-2690-4fd5-a45a-562b7167cd70-host-slash\") pod \"iptables-alerter-v9jk7\" (UID: \"5a9f2bb3-2690-4fd5-a45a-562b7167cd70\") " pod="openshift-network-operator/iptables-alerter-v9jk7" Apr 20 19:25:19.980028 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979501 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-system-cni-dir\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.980028 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979530 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d1de0681-7e7f-40b8-aef2-7bbbcf5c25e7-serviceca\") pod \"node-ca-d2857\" (UID: \"d1de0681-7e7f-40b8-aef2-7bbbcf5c25e7\") " pod="openshift-image-registry/node-ca-d2857" Apr 20 19:25:19.980028 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979554 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-multus-socket-dir-parent\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.980028 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979579 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/52436de9-f8c9-44ae-b773-c086491e71ad-device-dir\") pod \"aws-ebs-csi-driver-node-6vmcw\" (UID: \"52436de9-f8c9-44ae-b773-c086491e71ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw" Apr 20 19:25:19.980919 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979648 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/52436de9-f8c9-44ae-b773-c086491e71ad-device-dir\") pod \"aws-ebs-csi-driver-node-6vmcw\" (UID: 
\"52436de9-f8c9-44ae-b773-c086491e71ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw" Apr 20 19:25:19.980919 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979649 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/414b2f51-6741-47fc-9528-f08b5a635fba-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p5rrh\" (UID: \"414b2f51-6741-47fc-9528-f08b5a635fba\") " pod="openshift-multus/multus-additional-cni-plugins-p5rrh" Apr 20 19:25:19.980919 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979675 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-run-systemd\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.980919 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979686 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-multus-conf-dir\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.980919 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979708 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/414b2f51-6741-47fc-9528-f08b5a635fba-cni-binary-copy\") pod \"multus-additional-cni-plugins-p5rrh\" (UID: \"414b2f51-6741-47fc-9528-f08b5a635fba\") " pod="openshift-multus/multus-additional-cni-plugins-p5rrh" Apr 20 19:25:19.980919 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979713 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a587ed5a-1c69-439a-a673-9e3e5479ec27-cni-binary-copy\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.980919 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979768 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-multus-conf-dir\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.980919 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979808 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-run\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.980919 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979834 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-var-lib-openvswitch\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.980919 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979859 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-etc-modprobe-d\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.980919 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979882 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-etc-sysconfig\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.980919 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979926 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-etc-kubernetes\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.980919 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979929 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/0481b01c-d63f-4503-aacf-fcdb030d79e9-env-overrides\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.980919 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979950 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-sys\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.980919 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979968 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/0481b01c-d63f-4503-aacf-fcdb030d79e9-ovnkube-script-lib\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.980919 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979973 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-var-lib-openvswitch\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.980919 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.979974 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-host-run-netns\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.980919 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980012 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-run\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.981751 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980021 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" 
(UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-etc-kubernetes\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.981751 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980028 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a587ed5a-1c69-439a-a673-9e3e5479ec27-multus-daemon-config\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.981751 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980051 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-host-run-netns\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.981751 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980057 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-host-run-multus-certs\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.981751 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980074 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d1de0681-7e7f-40b8-aef2-7bbbcf5c25e7-serviceca\") pod \"node-ca-d2857\" (UID: \"d1de0681-7e7f-40b8-aef2-7bbbcf5c25e7\") " pod="openshift-image-registry/node-ca-d2857" Apr 20 19:25:19.981751 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980095 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-etc-modprobe-d\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.981751 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980129 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-sys\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.981751 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980119 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-etc-sysconfig\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.981751 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980157 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-systemd-units\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.981751 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980166 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-run-ovn\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.981751 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980178 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-host-run-multus-certs\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.981751 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980195 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jdk9x\" (UniqueName: \"kubernetes.io/projected/d1de0681-7e7f-40b8-aef2-7bbbcf5c25e7-kube-api-access-jdk9x\") pod \"node-ca-d2857\" (UID: \"d1de0681-7e7f-40b8-aef2-7bbbcf5c25e7\") " pod="openshift-image-registry/node-ca-d2857" Apr 20 19:25:19.981751 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980222 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-host-run-netns\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.981751 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980255 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/52436de9-f8c9-44ae-b773-c086491e71ad-socket-dir\") pod \"aws-ebs-csi-driver-node-6vmcw\" (UID: \"52436de9-f8c9-44ae-b773-c086491e71ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw" Apr 20 19:25:19.981751 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980296 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-host-run-k8s-cni-cncf-io\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.981751 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980327 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-etc-sysctl-d\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.981751 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980298 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-run-ovn\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.981751 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980342 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-host-run-netns\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.982593 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980382 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: 
\"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-host-run-k8s-cni-cncf-io\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.982593 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980418 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-etc-sysctl-conf\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.982593 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980435 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-etc-sysctl-d\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.982593 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980457 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fac78af9-e092-4651-832b-71685148f129-etc-tuned\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.982593 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980507 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/414b2f51-6741-47fc-9528-f08b5a635fba-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-p5rrh\" (UID: \"414b2f51-6741-47fc-9528-f08b5a635fba\") " pod="openshift-multus/multus-additional-cni-plugins-p5rrh" Apr 20 19:25:19.982593 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980526 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.982593 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980573 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-etc-sysctl-conf\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.982593 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980586 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0481b01c-d63f-4503-aacf-fcdb030d79e9-ovnkube-config\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.982593 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980626 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.982593 
ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980662 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lcsk2\" (UniqueName: \"kubernetes.io/projected/0481b01c-d63f-4503-aacf-fcdb030d79e9-kube-api-access-lcsk2\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:19.982593 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980692 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-os-release\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.982593 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980720 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cx8f6\" (UniqueName: \"kubernetes.io/projected/045e6638-b662-4970-88a9-8a02de6ca547-kube-api-access-cx8f6\") pod \"network-check-target-bnw57\" (UID: \"045e6638-b662-4970-88a9-8a02de6ca547\") " pod="openshift-network-diagnostics/network-check-target-bnw57" Apr 20 19:25:19.982593 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980745 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7b91853c-4ad1-4d29-b26c-3742343d8630-konnectivity-ca\") pod \"konnectivity-agent-s2wnw\" (UID: \"7b91853c-4ad1-4d29-b26c-3742343d8630\") " pod="kube-system/konnectivity-agent-s2wnw" Apr 20 19:25:19.982593 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.980792 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-etc-systemd\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" Apr 20 19:25:19.982593 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.981274 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/7b91853c-4ad1-4d29-b26c-3742343d8630-konnectivity-ca\") pod \"konnectivity-agent-s2wnw\" (UID: \"7b91853c-4ad1-4d29-b26c-3742343d8630\") " pod="kube-system/konnectivity-agent-s2wnw" Apr 20 19:25:19.982593 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.981345 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-os-release\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.982593 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.981367 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a587ed5a-1c69-439a-a673-9e3e5479ec27-cni-binary-copy\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7" Apr 20 19:25:19.983317 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.981377 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-node-log\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 
19:25:19.983317 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.981395 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a587ed5a-1c69-439a-a673-9e3e5479ec27-multus-daemon-config\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7"
Apr 20 19:25:19.983317 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.981413 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d1de0681-7e7f-40b8-aef2-7bbbcf5c25e7-host\") pod \"node-ca-d2857\" (UID: \"d1de0681-7e7f-40b8-aef2-7bbbcf5c25e7\") " pod="openshift-image-registry/node-ca-d2857"
Apr 20 19:25:19.983317 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.981424 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/0481b01c-d63f-4503-aacf-fcdb030d79e9-node-log\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8"
Apr 20 19:25:19.983317 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.981450 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-etc-systemd\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5"
Apr 20 19:25:19.983317 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.981491 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52436de9-f8c9-44ae-b773-c086491e71ad-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6vmcw\" (UID: \"52436de9-f8c9-44ae-b773-c086491e71ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw"
Apr 20 19:25:19.983317 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.981539 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d1de0681-7e7f-40b8-aef2-7bbbcf5c25e7-host\") pod \"node-ca-d2857\" (UID: \"d1de0681-7e7f-40b8-aef2-7bbbcf5c25e7\") " pod="openshift-image-registry/node-ca-d2857"
Apr 20 19:25:19.983317 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.981542 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/52436de9-f8c9-44ae-b773-c086491e71ad-kubelet-dir\") pod \"aws-ebs-csi-driver-node-6vmcw\" (UID: \"52436de9-f8c9-44ae-b773-c086491e71ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw"
Apr 20 19:25:19.983317 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.981540 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/0481b01c-d63f-4503-aacf-fcdb030d79e9-ovnkube-config\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8"
Apr 20 19:25:19.983317 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.981582 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-multus-cni-dir\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7"
Apr 20 19:25:19.983317 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.981625 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-var-lib-kubelet\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5"
Apr 20 19:25:19.983317 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.981668 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/414b2f51-6741-47fc-9528-f08b5a635fba-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p5rrh\" (UID: \"414b2f51-6741-47fc-9528-f08b5a635fba\") " pod="openshift-multus/multus-additional-cni-plugins-p5rrh"
Apr 20 19:25:19.983317 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.981674 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-multus-cni-dir\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7"
Apr 20 19:25:19.983317 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.981710 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7wmb\" (UniqueName: \"kubernetes.io/projected/414b2f51-6741-47fc-9528-f08b5a635fba-kube-api-access-w7wmb\") pod \"multus-additional-cni-plugins-p5rrh\" (UID: \"414b2f51-6741-47fc-9528-f08b5a635fba\") " pod="openshift-multus/multus-additional-cni-plugins-p5rrh"
Apr 20 19:25:19.983317 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.981716 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-var-lib-kubelet\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5"
Apr 20 19:25:19.983317 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.981737 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-host-var-lib-cni-bin\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7"
Apr 20 19:25:19.983317 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.981763 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-etc-kubernetes\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7"
Apr 20 19:25:19.983317 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.981807 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs\") pod \"network-metrics-daemon-cvbdc\" (UID: \"220904c2-fd82-44e5-9904-33aeca86dcee\") " pod="openshift-multus/network-metrics-daemon-cvbdc"
Apr 20 19:25:19.984038 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.981830 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-host-var-lib-cni-bin\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7"
Apr 20 19:25:19.984038 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.981833 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/52436de9-f8c9-44ae-b773-c086491e71ad-etc-selinux\") pod \"aws-ebs-csi-driver-node-6vmcw\" (UID: \"52436de9-f8c9-44ae-b773-c086491e71ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw"
Apr 20 19:25:19.984038 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.981878 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/52436de9-f8c9-44ae-b773-c086491e71ad-etc-selinux\") pod \"aws-ebs-csi-driver-node-6vmcw\" (UID: \"52436de9-f8c9-44ae-b773-c086491e71ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw"
Apr 20 19:25:19.984038 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.981921 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/414b2f51-6741-47fc-9528-f08b5a635fba-cnibin\") pod \"multus-additional-cni-plugins-p5rrh\" (UID: \"414b2f51-6741-47fc-9528-f08b5a635fba\") " pod="openshift-multus/multus-additional-cni-plugins-p5rrh"
Apr 20 19:25:19.984038 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.981947 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/414b2f51-6741-47fc-9528-f08b5a635fba-os-release\") pod \"multus-additional-cni-plugins-p5rrh\" (UID: \"414b2f51-6741-47fc-9528-f08b5a635fba\") " pod="openshift-multus/multus-additional-cni-plugins-p5rrh"
Apr 20 19:25:19.984038 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.981979 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-cnibin\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7"
Apr 20 19:25:19.984038 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.982004 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-host-var-lib-kubelet\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7"
Apr 20 19:25:19.984038 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.982066 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/414b2f51-6741-47fc-9528-f08b5a635fba-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p5rrh\" (UID: \"414b2f51-6741-47fc-9528-f08b5a635fba\") " pod="openshift-multus/multus-additional-cni-plugins-p5rrh"
Apr 20 19:25:19.984038 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.982093 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-host\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5"
Apr 20 19:25:19.984038 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.982168 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fac78af9-e092-4651-832b-71685148f129-host\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5"
Apr 20 19:25:19.984038 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.982175 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/414b2f51-6741-47fc-9528-f08b5a635fba-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-p5rrh\" (UID: \"414b2f51-6741-47fc-9528-f08b5a635fba\") " pod="openshift-multus/multus-additional-cni-plugins-p5rrh"
Apr 20 19:25:19.984038 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.982180 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/414b2f51-6741-47fc-9528-f08b5a635fba-cnibin\") pod \"multus-additional-cni-plugins-p5rrh\" (UID: \"414b2f51-6741-47fc-9528-f08b5a635fba\") " pod="openshift-multus/multus-additional-cni-plugins-p5rrh"
Apr 20 19:25:19.984038 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.982214 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/414b2f51-6741-47fc-9528-f08b5a635fba-os-release\") pod \"multus-additional-cni-plugins-p5rrh\" (UID: \"414b2f51-6741-47fc-9528-f08b5a635fba\") " pod="openshift-multus/multus-additional-cni-plugins-p5rrh"
Apr 20 19:25:19.984038 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.982250 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-etc-kubernetes\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7"
Apr 20 19:25:19.984038 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.982248 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-host-var-lib-kubelet\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7"
Apr 20 19:25:19.984038 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:19.982394 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:25:19.984038 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.982454 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a587ed5a-1c69-439a-a673-9e3e5479ec27-cnibin\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7"
Apr 20 19:25:19.984613 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:19.982482 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs podName:220904c2-fd82-44e5-9904-33aeca86dcee nodeName:}" failed. No retries permitted until 2026-04-20 19:25:20.482465793 +0000 UTC m=+3.174474714 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs") pod "network-metrics-daemon-cvbdc" (UID: "220904c2-fd82-44e5-9904-33aeca86dcee") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:25:19.984613 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.982626 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/414b2f51-6741-47fc-9528-f08b5a635fba-tuning-conf-dir\") pod \"multus-additional-cni-plugins-p5rrh\" (UID: \"414b2f51-6741-47fc-9528-f08b5a635fba\") " pod="openshift-multus/multus-additional-cni-plugins-p5rrh"
Apr 20 19:25:19.984613 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.982916 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/0481b01c-d63f-4503-aacf-fcdb030d79e9-ovn-node-metrics-cert\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8"
Apr 20 19:25:19.984613 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.983055 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/7b91853c-4ad1-4d29-b26c-3742343d8630-agent-certs\") pod \"konnectivity-agent-s2wnw\" (UID: \"7b91853c-4ad1-4d29-b26c-3742343d8630\") " pod="kube-system/konnectivity-agent-s2wnw"
Apr 20 19:25:19.984613 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.983430 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fac78af9-e092-4651-832b-71685148f129-tmp\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5"
Apr 20 19:25:19.984613 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.983670 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/fac78af9-e092-4651-832b-71685148f129-etc-tuned\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5"
Apr 20 19:25:19.989856 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.989832 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmhkk\" (UniqueName: \"kubernetes.io/projected/a587ed5a-1c69-439a-a673-9e3e5479ec27-kube-api-access-bmhkk\") pod \"multus-jfcc7\" (UID: \"a587ed5a-1c69-439a-a673-9e3e5479ec27\") " pod="openshift-multus/multus-jfcc7"
Apr 20 19:25:19.989971 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:19.989957 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:25:19.990032 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:19.989981 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:25:19.990032 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:19.989996 2570 projected.go:194] Error preparing data for projected volume kube-api-access-cx8f6 for pod openshift-network-diagnostics/network-check-target-bnw57: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:25:19.990129 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:19.990059 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/045e6638-b662-4970-88a9-8a02de6ca547-kube-api-access-cx8f6 podName:045e6638-b662-4970-88a9-8a02de6ca547 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:20.490041516 +0000 UTC m=+3.182050456 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "kube-api-access-cx8f6" (UniqueName: "kubernetes.io/projected/045e6638-b662-4970-88a9-8a02de6ca547-kube-api-access-cx8f6") pod "network-check-target-bnw57" (UID: "045e6638-b662-4970-88a9-8a02de6ca547") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:25:19.991873 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.991820 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7wmb\" (UniqueName: \"kubernetes.io/projected/414b2f51-6741-47fc-9528-f08b5a635fba-kube-api-access-w7wmb\") pod \"multus-additional-cni-plugins-p5rrh\" (UID: \"414b2f51-6741-47fc-9528-f08b5a635fba\") " pod="openshift-multus/multus-additional-cni-plugins-p5rrh"
Apr 20 19:25:19.992987 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.992963 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdk9x\" (UniqueName: \"kubernetes.io/projected/d1de0681-7e7f-40b8-aef2-7bbbcf5c25e7-kube-api-access-jdk9x\") pod \"node-ca-d2857\" (UID: \"d1de0681-7e7f-40b8-aef2-7bbbcf5c25e7\") " pod="openshift-image-registry/node-ca-d2857"
Apr 20 19:25:19.993327 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.993309 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zq2bx\" (UniqueName: \"kubernetes.io/projected/5a9f2bb3-2690-4fd5-a45a-562b7167cd70-kube-api-access-zq2bx\") pod \"iptables-alerter-v9jk7\" (UID: \"5a9f2bb3-2690-4fd5-a45a-562b7167cd70\") " pod="openshift-network-operator/iptables-alerter-v9jk7"
Apr 20 19:25:19.993451 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.993420 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcsk2\" (UniqueName: \"kubernetes.io/projected/0481b01c-d63f-4503-aacf-fcdb030d79e9-kube-api-access-lcsk2\") pod \"ovnkube-node-9rxk8\" (UID: \"0481b01c-d63f-4503-aacf-fcdb030d79e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8"
Apr 20 19:25:19.993451 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.993388 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5qp9k\" (UniqueName: \"kubernetes.io/projected/220904c2-fd82-44e5-9904-33aeca86dcee-kube-api-access-5qp9k\") pod \"network-metrics-daemon-cvbdc\" (UID: \"220904c2-fd82-44e5-9904-33aeca86dcee\") " pod="openshift-multus/network-metrics-daemon-cvbdc"
Apr 20 19:25:19.993618 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.993310 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7q7t2\" (UniqueName: \"kubernetes.io/projected/52436de9-f8c9-44ae-b773-c086491e71ad-kube-api-access-7q7t2\") pod \"aws-ebs-csi-driver-node-6vmcw\" (UID: \"52436de9-f8c9-44ae-b773-c086491e71ad\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw"
Apr 20 19:25:19.994523 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.994506 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tdn8l\" (UniqueName: \"kubernetes.io/projected/fac78af9-e092-4651-832b-71685148f129-kube-api-access-tdn8l\") pod \"tuned-tnmw5\" (UID: \"fac78af9-e092-4651-832b-71685148f129\") " pod="openshift-cluster-node-tuning-operator/tuned-tnmw5"
Apr 20 19:25:20.000006 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:19.999988 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 19:25:20.170431 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:20.170359 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-v9jk7"
Apr 20 19:25:20.178176 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:20.178144 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-d2857"
Apr 20 19:25:20.185872 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:20.185852 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-s2wnw"
Apr 20 19:25:20.191474 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:20.191454 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw"
Apr 20 19:25:20.197079 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:20.197055 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-tnmw5"
Apr 20 19:25:20.203654 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:20.203630 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8"
Apr 20 19:25:20.209185 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:20.209168 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-p5rrh"
Apr 20 19:25:20.213706 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:20.213678 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-jfcc7"
Apr 20 19:25:20.485662 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:20.485571 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs\") pod \"network-metrics-daemon-cvbdc\" (UID: \"220904c2-fd82-44e5-9904-33aeca86dcee\") " pod="openshift-multus/network-metrics-daemon-cvbdc"
Apr 20 19:25:20.485818 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:20.485717 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:25:20.485818 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:20.485793 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs podName:220904c2-fd82-44e5-9904-33aeca86dcee nodeName:}" failed. No retries permitted until 2026-04-20 19:25:21.485771723 +0000 UTC m=+4.177780642 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs") pod "network-metrics-daemon-cvbdc" (UID: "220904c2-fd82-44e5-9904-33aeca86dcee") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:25:20.586410 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:20.586375 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cx8f6\" (UniqueName: \"kubernetes.io/projected/045e6638-b662-4970-88a9-8a02de6ca547-kube-api-access-cx8f6\") pod \"network-check-target-bnw57\" (UID: \"045e6638-b662-4970-88a9-8a02de6ca547\") " pod="openshift-network-diagnostics/network-check-target-bnw57"
Apr 20 19:25:20.586603 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:20.586576 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:25:20.586603 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:20.586601 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:25:20.586690 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:20.586612 2570 projected.go:194] Error preparing data for projected volume kube-api-access-cx8f6 for pod openshift-network-diagnostics/network-check-target-bnw57: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:25:20.586690 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:20.586678 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/045e6638-b662-4970-88a9-8a02de6ca547-kube-api-access-cx8f6 podName:045e6638-b662-4970-88a9-8a02de6ca547 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:21.586658526 +0000 UTC m=+4.278667462 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "kube-api-access-cx8f6" (UniqueName: "kubernetes.io/projected/045e6638-b662-4970-88a9-8a02de6ca547-kube-api-access-cx8f6") pod "network-check-target-bnw57" (UID: "045e6638-b662-4970-88a9-8a02de6ca547") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:25:20.694215 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:20.694184 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b91853c_4ad1_4d29_b26c_3742343d8630.slice/crio-80e53647ed3c033df5174033093051617b04d7c13e772cf9d7f903b0adb2fd84 WatchSource:0}: Error finding container 80e53647ed3c033df5174033093051617b04d7c13e772cf9d7f903b0adb2fd84: Status 404 returned error can't find the container with id 80e53647ed3c033df5174033093051617b04d7c13e772cf9d7f903b0adb2fd84
Apr 20 19:25:20.695397 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:20.695357 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a9f2bb3_2690_4fd5_a45a_562b7167cd70.slice/crio-761ddc94530c75e5f13a45f6bb249d75709eceb0a16f100b07f1003593f8baeb WatchSource:0}: Error finding container 761ddc94530c75e5f13a45f6bb249d75709eceb0a16f100b07f1003593f8baeb: Status 404 returned error can't find the container with id 761ddc94530c75e5f13a45f6bb249d75709eceb0a16f100b07f1003593f8baeb
Apr 20 19:25:20.696012 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:20.695874 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod414b2f51_6741_47fc_9528_f08b5a635fba.slice/crio-cddab2a39775c3eea45bb196948f60d2906b114a107a01ce9129c42b0b684f56 WatchSource:0}: Error finding container cddab2a39775c3eea45bb196948f60d2906b114a107a01ce9129c42b0b684f56: Status 404 returned error can't find the container with id cddab2a39775c3eea45bb196948f60d2906b114a107a01ce9129c42b0b684f56
Apr 20 19:25:20.698384 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:20.697896 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52436de9_f8c9_44ae_b773_c086491e71ad.slice/crio-2bc4a82efcaea24b6a7b409f62ea1eecbfef531511e9a3ba07a1d8d18bf2a3e2 WatchSource:0}: Error finding container 2bc4a82efcaea24b6a7b409f62ea1eecbfef531511e9a3ba07a1d8d18bf2a3e2: Status 404 returned error can't find the container with id 2bc4a82efcaea24b6a7b409f62ea1eecbfef531511e9a3ba07a1d8d18bf2a3e2
Apr 20 19:25:20.700033 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:20.700007 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1de0681_7e7f_40b8_aef2_7bbbcf5c25e7.slice/crio-fad2e6ffeb58a31277cf2862bc0a4b1bfcf349f883335fc50306fc388da24f01 WatchSource:0}: Error finding container fad2e6ffeb58a31277cf2862bc0a4b1bfcf349f883335fc50306fc388da24f01: Status 404 returned error can't find the container with id fad2e6ffeb58a31277cf2862bc0a4b1bfcf349f883335fc50306fc388da24f01
Apr 20 19:25:20.701001 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:20.700976 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda587ed5a_1c69_439a_a673_9e3e5479ec27.slice/crio-99bdcef973e2b8115b04c1df781f40c0d77671f3d5ef152f9131bc44277260e3 WatchSource:0}: Error finding container 99bdcef973e2b8115b04c1df781f40c0d77671f3d5ef152f9131bc44277260e3: Status 404 returned error can't find the container with id 99bdcef973e2b8115b04c1df781f40c0d77671f3d5ef152f9131bc44277260e3
Apr 20 19:25:20.701970 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:20.701946 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfac78af9_e092_4651_832b_71685148f129.slice/crio-a99d73501409bac089f23fe02b6263a88337531180efa04f2209917d1827ccb0 WatchSource:0}: Error finding container a99d73501409bac089f23fe02b6263a88337531180efa04f2209917d1827ccb0: Status 404 returned error can't find the container with id a99d73501409bac089f23fe02b6263a88337531180efa04f2209917d1827ccb0
Apr 20 19:25:20.704051 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:20.704035 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0481b01c_d63f_4503_aacf_fcdb030d79e9.slice/crio-27263a7eb09c05af6fde381490e333ec8fcb9576c514dfa96a7c2e44d4cca577 WatchSource:0}: Error finding container 27263a7eb09c05af6fde381490e333ec8fcb9576c514dfa96a7c2e44d4cca577: Status 404 returned error can't find the container with id 27263a7eb09c05af6fde381490e333ec8fcb9576c514dfa96a7c2e44d4cca577
Apr 20 19:25:20.907520 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:20.907334 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 19:20:18 +0000 UTC" deadline="2027-12-06 09:18:12.897452083 +0000 UTC"
Apr 20 19:25:20.907520 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:20.907516 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14269h52m51.989939889s"
Apr 20 19:25:20.958766 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:20.958735 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cvbdc"
Apr 20 19:25:20.958907 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:20.958844 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cvbdc" podUID="220904c2-fd82-44e5-9904-33aeca86dcee"
Apr 20 19:25:20.965938 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:20.965905 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-v9jk7" event={"ID":"5a9f2bb3-2690-4fd5-a45a-562b7167cd70","Type":"ContainerStarted","Data":"761ddc94530c75e5f13a45f6bb249d75709eceb0a16f100b07f1003593f8baeb"}
Apr 20 19:25:20.967441 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:20.967415 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-159.ec2.internal" event={"ID":"baec555e5e2b442b2cad3d99698ce3db","Type":"ContainerStarted","Data":"099bb446dd202b0e456fd69b48a929039b35b3cd0e9d46f708b909df8de11911"}
Apr 20 19:25:20.968525 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:20.968500 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" event={"ID":"0481b01c-d63f-4503-aacf-fcdb030d79e9","Type":"ContainerStarted","Data":"27263a7eb09c05af6fde381490e333ec8fcb9576c514dfa96a7c2e44d4cca577"}
Apr 20 19:25:20.969485 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:20.969466 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jfcc7" event={"ID":"a587ed5a-1c69-439a-a673-9e3e5479ec27","Type":"ContainerStarted","Data":"99bdcef973e2b8115b04c1df781f40c0d77671f3d5ef152f9131bc44277260e3"}
Apr 20 19:25:20.970357 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:20.970331 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p5rrh" event={"ID":"414b2f51-6741-47fc-9528-f08b5a635fba","Type":"ContainerStarted","Data":"cddab2a39775c3eea45bb196948f60d2906b114a107a01ce9129c42b0b684f56"}
Apr 20 19:25:20.971270 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:20.971246 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-s2wnw" event={"ID":"7b91853c-4ad1-4d29-b26c-3742343d8630","Type":"ContainerStarted","Data":"80e53647ed3c033df5174033093051617b04d7c13e772cf9d7f903b0adb2fd84"}
Apr 20 19:25:20.972803 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:20.972775 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" event={"ID":"fac78af9-e092-4651-832b-71685148f129","Type":"ContainerStarted","Data":"a99d73501409bac089f23fe02b6263a88337531180efa04f2209917d1827ccb0"}
Apr 20 19:25:20.973657 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:20.973630 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d2857" event={"ID":"d1de0681-7e7f-40b8-aef2-7bbbcf5c25e7","Type":"ContainerStarted","Data":"fad2e6ffeb58a31277cf2862bc0a4b1bfcf349f883335fc50306fc388da24f01"}
Apr 20 19:25:20.974421 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:20.974401 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw" event={"ID":"52436de9-f8c9-44ae-b773-c086491e71ad","Type":"ContainerStarted","Data":"2bc4a82efcaea24b6a7b409f62ea1eecbfef531511e9a3ba07a1d8d18bf2a3e2"}
Apr 20 19:25:20.982673 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:20.982638 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-159.ec2.internal" podStartSLOduration=1.9826282480000001 podStartE2EDuration="1.982628248s" podCreationTimestamp="2026-04-20 19:25:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:25:20.982207097 +0000 UTC m=+3.674216033" watchObservedRunningTime="2026-04-20 19:25:20.982628248 +0000 UTC m=+3.674637186"
Apr 20 19:25:21.493528 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:21.493501 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs\") pod \"network-metrics-daemon-cvbdc\" (UID: \"220904c2-fd82-44e5-9904-33aeca86dcee\") " pod="openshift-multus/network-metrics-daemon-cvbdc"
Apr 20 19:25:21.493677 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:21.493653 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:25:21.493739 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:21.493710 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs podName:220904c2-fd82-44e5-9904-33aeca86dcee nodeName:}" failed. No retries permitted until 2026-04-20 19:25:23.493691655 +0000 UTC m=+6.185700574 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs") pod "network-metrics-daemon-cvbdc" (UID: "220904c2-fd82-44e5-9904-33aeca86dcee") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:25:21.594058 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:21.594017 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cx8f6\" (UniqueName: \"kubernetes.io/projected/045e6638-b662-4970-88a9-8a02de6ca547-kube-api-access-cx8f6\") pod \"network-check-target-bnw57\" (UID: \"045e6638-b662-4970-88a9-8a02de6ca547\") " pod="openshift-network-diagnostics/network-check-target-bnw57"
Apr 20 19:25:21.594247 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:21.594206 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:25:21.594247 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:21.594224 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:25:21.594247 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:21.594235 2570 projected.go:194] Error preparing data for projected volume kube-api-access-cx8f6 for pod openshift-network-diagnostics/network-check-target-bnw57: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:25:21.594431 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:21.594291 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/045e6638-b662-4970-88a9-8a02de6ca547-kube-api-access-cx8f6 podName:045e6638-b662-4970-88a9-8a02de6ca547 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:23.594274789 +0000 UTC m=+6.286283706 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "kube-api-access-cx8f6" (UniqueName: "kubernetes.io/projected/045e6638-b662-4970-88a9-8a02de6ca547-kube-api-access-cx8f6") pod "network-check-target-bnw57" (UID: "045e6638-b662-4970-88a9-8a02de6ca547") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:25:21.969267 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:21.966627 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnw57"
Apr 20 19:25:21.969267 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:21.966796 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnw57" podUID="045e6638-b662-4970-88a9-8a02de6ca547"
Apr 20 19:25:21.988955 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:21.988919 2570 generic.go:358] "Generic (PLEG): container finished" podID="906f6cca8711ebeed3b778a79317b11c" containerID="1e48bde93ccad4398f4366dc4d141c83be28c0e432493f464df58a69e468d225" exitCode=0
Apr 20 19:25:21.989837 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:21.989813 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal" event={"ID":"906f6cca8711ebeed3b778a79317b11c","Type":"ContainerDied","Data":"1e48bde93ccad4398f4366dc4d141c83be28c0e432493f464df58a69e468d225"}
Apr 20 19:25:22.958674 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:22.958639 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cvbdc"
Apr 20 19:25:22.958867 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:22.958784 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cvbdc" podUID="220904c2-fd82-44e5-9904-33aeca86dcee"
Apr 20 19:25:23.003857 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:23.003812 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal" event={"ID":"906f6cca8711ebeed3b778a79317b11c","Type":"ContainerStarted","Data":"555b076f6ae6c1839bc806c229fa02b8e06d2964dc6d2b2a72fb97dd3960f6d2"}
Apr 20 19:25:23.099378 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:23.099323 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-159.ec2.internal" podStartSLOduration=4.099300887 podStartE2EDuration="4.099300887s" podCreationTimestamp="2026-04-20 19:25:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:25:23.021743078 +0000 UTC m=+5.713752019" watchObservedRunningTime="2026-04-20 19:25:23.099300887 +0000 UTC m=+5.791309828"
Apr 20 19:25:23.100387 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:23.099878 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-2b57t"]
Apr 20 19:25:23.103462 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:23.103078 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2b57t"
Apr 20 19:25:23.109080 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:23.109058 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 20 19:25:23.109324 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:23.109308 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 20 19:25:23.109597 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:23.109581 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-lfkbv\""
Apr 20 19:25:23.207876 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:23.207836 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gtjlp\" (UniqueName: \"kubernetes.io/projected/1f14825d-bc77-4c61-9be4-8a25e8c7134b-kube-api-access-gtjlp\") pod \"node-resolver-2b57t\" (UID: \"1f14825d-bc77-4c61-9be4-8a25e8c7134b\") " pod="openshift-dns/node-resolver-2b57t"
Apr 20 19:25:23.208064 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:23.207891 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1f14825d-bc77-4c61-9be4-8a25e8c7134b-tmp-dir\") pod \"node-resolver-2b57t\" (UID: \"1f14825d-bc77-4c61-9be4-8a25e8c7134b\") " pod="openshift-dns/node-resolver-2b57t"
Apr 20 19:25:23.208064 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:23.207923 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1f14825d-bc77-4c61-9be4-8a25e8c7134b-hosts-file\") pod \"node-resolver-2b57t\" (UID: \"1f14825d-bc77-4c61-9be4-8a25e8c7134b\") " pod="openshift-dns/node-resolver-2b57t"
Apr 20 19:25:23.309275 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:23.308682 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gtjlp\" (UniqueName: \"kubernetes.io/projected/1f14825d-bc77-4c61-9be4-8a25e8c7134b-kube-api-access-gtjlp\") pod \"node-resolver-2b57t\" (UID: \"1f14825d-bc77-4c61-9be4-8a25e8c7134b\") " pod="openshift-dns/node-resolver-2b57t"
Apr 20 19:25:23.309275 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:23.308729 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1f14825d-bc77-4c61-9be4-8a25e8c7134b-tmp-dir\") pod \"node-resolver-2b57t\" (UID: \"1f14825d-bc77-4c61-9be4-8a25e8c7134b\") " pod="openshift-dns/node-resolver-2b57t"
Apr 20 19:25:23.309275 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:23.308758 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1f14825d-bc77-4c61-9be4-8a25e8c7134b-hosts-file\") pod \"node-resolver-2b57t\" (UID: \"1f14825d-bc77-4c61-9be4-8a25e8c7134b\") " pod="openshift-dns/node-resolver-2b57t"
Apr 20 19:25:23.309275 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:23.308898 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/1f14825d-bc77-4c61-9be4-8a25e8c7134b-hosts-file\") pod \"node-resolver-2b57t\" (UID: \"1f14825d-bc77-4c61-9be4-8a25e8c7134b\") " pod="openshift-dns/node-resolver-2b57t"
Apr 20 19:25:23.309611 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:23.309490 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1f14825d-bc77-4c61-9be4-8a25e8c7134b-tmp-dir\") pod \"node-resolver-2b57t\" (UID: \"1f14825d-bc77-4c61-9be4-8a25e8c7134b\") " pod="openshift-dns/node-resolver-2b57t"
Apr 20 19:25:23.338684 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:23.338643 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gtjlp\" (UniqueName: \"kubernetes.io/projected/1f14825d-bc77-4c61-9be4-8a25e8c7134b-kube-api-access-gtjlp\") pod \"node-resolver-2b57t\" (UID: \"1f14825d-bc77-4c61-9be4-8a25e8c7134b\") " pod="openshift-dns/node-resolver-2b57t"
Apr 20 19:25:23.417658 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:23.417623 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-2b57t"
Apr 20 19:25:23.510482 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:23.510446 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs\") pod \"network-metrics-daemon-cvbdc\" (UID: \"220904c2-fd82-44e5-9904-33aeca86dcee\") " pod="openshift-multus/network-metrics-daemon-cvbdc"
Apr 20 19:25:23.510678 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:23.510617 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:25:23.510748 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:23.510691 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs podName:220904c2-fd82-44e5-9904-33aeca86dcee nodeName:}" failed. No retries permitted until 2026-04-20 19:25:27.510671812 +0000 UTC m=+10.202680752 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs") pod "network-metrics-daemon-cvbdc" (UID: "220904c2-fd82-44e5-9904-33aeca86dcee") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:25:23.611920 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:23.611781 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cx8f6\" (UniqueName: \"kubernetes.io/projected/045e6638-b662-4970-88a9-8a02de6ca547-kube-api-access-cx8f6\") pod \"network-check-target-bnw57\" (UID: \"045e6638-b662-4970-88a9-8a02de6ca547\") " pod="openshift-network-diagnostics/network-check-target-bnw57"
Apr 20 19:25:23.612073 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:23.611964 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:25:23.612073 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:23.611984 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:25:23.612073 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:23.611996 2570 projected.go:194] Error preparing data for projected volume kube-api-access-cx8f6 for pod openshift-network-diagnostics/network-check-target-bnw57: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:25:23.612073 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:23.612060 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/045e6638-b662-4970-88a9-8a02de6ca547-kube-api-access-cx8f6 podName:045e6638-b662-4970-88a9-8a02de6ca547 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:27.612038474 +0000 UTC m=+10.304047481 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "kube-api-access-cx8f6" (UniqueName: "kubernetes.io/projected/045e6638-b662-4970-88a9-8a02de6ca547-kube-api-access-cx8f6") pod "network-check-target-bnw57" (UID: "045e6638-b662-4970-88a9-8a02de6ca547") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:25:23.962700 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:23.962148 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnw57"
Apr 20 19:25:23.962700 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:23.962269 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnw57" podUID="045e6638-b662-4970-88a9-8a02de6ca547"
Apr 20 19:25:24.958442 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:24.958400 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cvbdc"
Apr 20 19:25:24.958910 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:24.958548 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cvbdc" podUID="220904c2-fd82-44e5-9904-33aeca86dcee"
Apr 20 19:25:25.959936 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:25.959046 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnw57"
Apr 20 19:25:25.959936 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:25.959174 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnw57" podUID="045e6638-b662-4970-88a9-8a02de6ca547"
Apr 20 19:25:26.785386 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:26.784998 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-9d99g"]
Apr 20 19:25:26.788232 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:26.788185 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9d99g"
Apr 20 19:25:26.788383 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:26.788268 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9d99g" podUID="e48bd3bb-a360-42e2-bee7-064799310567"
Apr 20 19:25:26.839413 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:26.839225 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e48bd3bb-a360-42e2-bee7-064799310567-original-pull-secret\") pod \"global-pull-secret-syncer-9d99g\" (UID: \"e48bd3bb-a360-42e2-bee7-064799310567\") " pod="kube-system/global-pull-secret-syncer-9d99g"
Apr 20 19:25:26.839413 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:26.839269 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e48bd3bb-a360-42e2-bee7-064799310567-kubelet-config\") pod \"global-pull-secret-syncer-9d99g\" (UID: \"e48bd3bb-a360-42e2-bee7-064799310567\") " pod="kube-system/global-pull-secret-syncer-9d99g"
Apr 20 19:25:26.839413 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:26.839324 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e48bd3bb-a360-42e2-bee7-064799310567-dbus\") pod \"global-pull-secret-syncer-9d99g\" (UID: \"e48bd3bb-a360-42e2-bee7-064799310567\") " pod="kube-system/global-pull-secret-syncer-9d99g"
Apr 20 19:25:26.940526 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:26.940487 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e48bd3bb-a360-42e2-bee7-064799310567-dbus\") pod \"global-pull-secret-syncer-9d99g\" (UID: \"e48bd3bb-a360-42e2-bee7-064799310567\") " pod="kube-system/global-pull-secret-syncer-9d99g"
Apr 20 19:25:26.940702 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:26.940598 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e48bd3bb-a360-42e2-bee7-064799310567-original-pull-secret\") pod \"global-pull-secret-syncer-9d99g\" (UID: \"e48bd3bb-a360-42e2-bee7-064799310567\") " pod="kube-system/global-pull-secret-syncer-9d99g"
Apr 20 19:25:26.940702 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:26.940631 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e48bd3bb-a360-42e2-bee7-064799310567-kubelet-config\") pod \"global-pull-secret-syncer-9d99g\" (UID: \"e48bd3bb-a360-42e2-bee7-064799310567\") " pod="kube-system/global-pull-secret-syncer-9d99g"
Apr 20 19:25:26.940810 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:26.940767 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/e48bd3bb-a360-42e2-bee7-064799310567-kubelet-config\") pod \"global-pull-secret-syncer-9d99g\" (UID: \"e48bd3bb-a360-42e2-bee7-064799310567\") " pod="kube-system/global-pull-secret-syncer-9d99g"
Apr 20 19:25:26.940928 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:26.940911 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/e48bd3bb-a360-42e2-bee7-064799310567-dbus\") pod \"global-pull-secret-syncer-9d99g\" (UID: \"e48bd3bb-a360-42e2-bee7-064799310567\") " pod="kube-system/global-pull-secret-syncer-9d99g"
Apr 20 19:25:26.941015 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:26.940972 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 19:25:26.941071 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:26.941038 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e48bd3bb-a360-42e2-bee7-064799310567-original-pull-secret podName:e48bd3bb-a360-42e2-bee7-064799310567 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:27.441020132 +0000 UTC m=+10.133029054 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e48bd3bb-a360-42e2-bee7-064799310567-original-pull-secret") pod "global-pull-secret-syncer-9d99g" (UID: "e48bd3bb-a360-42e2-bee7-064799310567") : object "kube-system"/"original-pull-secret" not registered
Apr 20 19:25:26.958578 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:26.958114 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cvbdc"
Apr 20 19:25:26.958578 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:26.958229 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cvbdc" podUID="220904c2-fd82-44e5-9904-33aeca86dcee"
Apr 20 19:25:27.445633 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:27.445591 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e48bd3bb-a360-42e2-bee7-064799310567-original-pull-secret\") pod \"global-pull-secret-syncer-9d99g\" (UID: \"e48bd3bb-a360-42e2-bee7-064799310567\") " pod="kube-system/global-pull-secret-syncer-9d99g"
Apr 20 19:25:27.446197 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:27.445815 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 19:25:27.446197 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:27.445891 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e48bd3bb-a360-42e2-bee7-064799310567-original-pull-secret podName:e48bd3bb-a360-42e2-bee7-064799310567 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:28.445871639 +0000 UTC m=+11.137880563 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e48bd3bb-a360-42e2-bee7-064799310567-original-pull-secret") pod "global-pull-secret-syncer-9d99g" (UID: "e48bd3bb-a360-42e2-bee7-064799310567") : object "kube-system"/"original-pull-secret" not registered
Apr 20 19:25:27.546751 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:27.546615 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs\") pod \"network-metrics-daemon-cvbdc\" (UID: \"220904c2-fd82-44e5-9904-33aeca86dcee\") " pod="openshift-multus/network-metrics-daemon-cvbdc"
Apr 20 19:25:27.546922 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:27.546787 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:25:27.546922 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:27.546861 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs podName:220904c2-fd82-44e5-9904-33aeca86dcee nodeName:}" failed. No retries permitted until 2026-04-20 19:25:35.546841414 +0000 UTC m=+18.238850334 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs") pod "network-metrics-daemon-cvbdc" (UID: "220904c2-fd82-44e5-9904-33aeca86dcee") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 19:25:27.647708 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:27.647652 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cx8f6\" (UniqueName: \"kubernetes.io/projected/045e6638-b662-4970-88a9-8a02de6ca547-kube-api-access-cx8f6\") pod \"network-check-target-bnw57\" (UID: \"045e6638-b662-4970-88a9-8a02de6ca547\") " pod="openshift-network-diagnostics/network-check-target-bnw57"
Apr 20 19:25:27.647954 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:27.647835 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 19:25:27.647954 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:27.647859 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 19:25:27.647954 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:27.647872 2570 projected.go:194] Error preparing data for projected volume kube-api-access-cx8f6 for pod openshift-network-diagnostics/network-check-target-bnw57: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:25:27.647954 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:27.647930 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/045e6638-b662-4970-88a9-8a02de6ca547-kube-api-access-cx8f6 podName:045e6638-b662-4970-88a9-8a02de6ca547 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:35.647913246 +0000 UTC m=+18.339922164 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "kube-api-access-cx8f6" (UniqueName: "kubernetes.io/projected/045e6638-b662-4970-88a9-8a02de6ca547-kube-api-access-cx8f6") pod "network-check-target-bnw57" (UID: "045e6638-b662-4970-88a9-8a02de6ca547") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 19:25:27.965967 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:27.965613 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnw57"
Apr 20 19:25:27.965967 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:27.965625 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9d99g"
Apr 20 19:25:27.965967 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:27.965732 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnw57" podUID="045e6638-b662-4970-88a9-8a02de6ca547"
Apr 20 19:25:27.965967 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:27.965814 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9d99g" podUID="e48bd3bb-a360-42e2-bee7-064799310567"
Apr 20 19:25:28.455084 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:28.455045 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e48bd3bb-a360-42e2-bee7-064799310567-original-pull-secret\") pod \"global-pull-secret-syncer-9d99g\" (UID: \"e48bd3bb-a360-42e2-bee7-064799310567\") " pod="kube-system/global-pull-secret-syncer-9d99g"
Apr 20 19:25:28.455587 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:28.455192 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 19:25:28.455587 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:28.455296 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e48bd3bb-a360-42e2-bee7-064799310567-original-pull-secret podName:e48bd3bb-a360-42e2-bee7-064799310567 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:30.455255195 +0000 UTC m=+13.147264114 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e48bd3bb-a360-42e2-bee7-064799310567-original-pull-secret") pod "global-pull-secret-syncer-9d99g" (UID: "e48bd3bb-a360-42e2-bee7-064799310567") : object "kube-system"/"original-pull-secret" not registered
Apr 20 19:25:28.958505 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:28.958381 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cvbdc"
Apr 20 19:25:28.958701 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:28.958523 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cvbdc" podUID="220904c2-fd82-44e5-9904-33aeca86dcee"
Apr 20 19:25:29.958703 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:29.958662 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnw57"
Apr 20 19:25:29.959155 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:29.958790 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnw57" podUID="045e6638-b662-4970-88a9-8a02de6ca547"
Apr 20 19:25:29.959155 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:29.958859 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9d99g"
Apr 20 19:25:29.959155 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:29.958977 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9d99g" podUID="e48bd3bb-a360-42e2-bee7-064799310567"
Apr 20 19:25:30.468671 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:30.468633 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e48bd3bb-a360-42e2-bee7-064799310567-original-pull-secret\") pod \"global-pull-secret-syncer-9d99g\" (UID: \"e48bd3bb-a360-42e2-bee7-064799310567\") " pod="kube-system/global-pull-secret-syncer-9d99g"
Apr 20 19:25:30.468832 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:30.468799 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 19:25:30.468905 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:30.468865 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e48bd3bb-a360-42e2-bee7-064799310567-original-pull-secret podName:e48bd3bb-a360-42e2-bee7-064799310567 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:34.468850041 +0000 UTC m=+17.160858957 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e48bd3bb-a360-42e2-bee7-064799310567-original-pull-secret") pod "global-pull-secret-syncer-9d99g" (UID: "e48bd3bb-a360-42e2-bee7-064799310567") : object "kube-system"/"original-pull-secret" not registered
Apr 20 19:25:30.959040 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:30.958951 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cvbdc"
Apr 20 19:25:30.959476 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:30.959087 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cvbdc" podUID="220904c2-fd82-44e5-9904-33aeca86dcee"
Apr 20 19:25:31.958138 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:31.958107 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnw57"
Apr 20 19:25:31.958138 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:31.958117 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9d99g"
Apr 20 19:25:31.958382 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:31.958214 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnw57" podUID="045e6638-b662-4970-88a9-8a02de6ca547"
Apr 20 19:25:31.958437 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:31.958387 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9d99g" podUID="e48bd3bb-a360-42e2-bee7-064799310567"
Apr 20 19:25:32.958073 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:32.958030 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cvbdc"
Apr 20 19:25:32.958466 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:32.958159 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cvbdc" podUID="220904c2-fd82-44e5-9904-33aeca86dcee"
Apr 20 19:25:33.958405 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:33.958360 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9d99g"
Apr 20 19:25:33.958871 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:33.958381 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnw57"
Apr 20 19:25:33.958871 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:33.958496 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="kube-system/global-pull-secret-syncer-9d99g" podUID="e48bd3bb-a360-42e2-bee7-064799310567" Apr 20 19:25:33.958871 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:33.958574 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnw57" podUID="045e6638-b662-4970-88a9-8a02de6ca547" Apr 20 19:25:34.496761 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:34.496718 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e48bd3bb-a360-42e2-bee7-064799310567-original-pull-secret\") pod \"global-pull-secret-syncer-9d99g\" (UID: \"e48bd3bb-a360-42e2-bee7-064799310567\") " pod="kube-system/global-pull-secret-syncer-9d99g" Apr 20 19:25:34.496959 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:34.496868 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 19:25:34.496959 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:34.496930 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e48bd3bb-a360-42e2-bee7-064799310567-original-pull-secret podName:e48bd3bb-a360-42e2-bee7-064799310567 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:42.496914796 +0000 UTC m=+25.188923716 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e48bd3bb-a360-42e2-bee7-064799310567-original-pull-secret") pod "global-pull-secret-syncer-9d99g" (UID: "e48bd3bb-a360-42e2-bee7-064799310567") : object "kube-system"/"original-pull-secret" not registered Apr 20 19:25:34.958705 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:34.958667 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cvbdc" Apr 20 19:25:34.959139 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:34.958797 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cvbdc" podUID="220904c2-fd82-44e5-9904-33aeca86dcee" Apr 20 19:25:35.604514 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:35.604469 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs\") pod \"network-metrics-daemon-cvbdc\" (UID: \"220904c2-fd82-44e5-9904-33aeca86dcee\") " pod="openshift-multus/network-metrics-daemon-cvbdc" Apr 20 19:25:35.604705 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:35.604632 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:25:35.604705 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:35.604693 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs podName:220904c2-fd82-44e5-9904-33aeca86dcee nodeName:}" failed. 
No retries permitted until 2026-04-20 19:25:51.604674982 +0000 UTC m=+34.296683904 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs") pod "network-metrics-daemon-cvbdc" (UID: "220904c2-fd82-44e5-9904-33aeca86dcee") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:25:35.705268 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:35.705234 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cx8f6\" (UniqueName: \"kubernetes.io/projected/045e6638-b662-4970-88a9-8a02de6ca547-kube-api-access-cx8f6\") pod \"network-check-target-bnw57\" (UID: \"045e6638-b662-4970-88a9-8a02de6ca547\") " pod="openshift-network-diagnostics/network-check-target-bnw57" Apr 20 19:25:35.705425 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:35.705406 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:25:35.705470 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:35.705429 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:25:35.705470 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:35.705442 2570 projected.go:194] Error preparing data for projected volume kube-api-access-cx8f6 for pod openshift-network-diagnostics/network-check-target-bnw57: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:25:35.705572 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:35.705497 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/045e6638-b662-4970-88a9-8a02de6ca547-kube-api-access-cx8f6 podName:045e6638-b662-4970-88a9-8a02de6ca547 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:51.705481998 +0000 UTC m=+34.397490914 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cx8f6" (UniqueName: "kubernetes.io/projected/045e6638-b662-4970-88a9-8a02de6ca547-kube-api-access-cx8f6") pod "network-check-target-bnw57" (UID: "045e6638-b662-4970-88a9-8a02de6ca547") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:25:35.958068 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:35.958031 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9d99g" Apr 20 19:25:35.958218 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:35.958146 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9d99g" podUID="e48bd3bb-a360-42e2-bee7-064799310567" Apr 20 19:25:35.958298 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:35.958230 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnw57" Apr 20 19:25:35.958390 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:35.958356 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnw57" podUID="045e6638-b662-4970-88a9-8a02de6ca547" Apr 20 19:25:36.958147 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:36.958111 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cvbdc" Apr 20 19:25:36.958514 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:36.958257 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cvbdc" podUID="220904c2-fd82-44e5-9904-33aeca86dcee" Apr 20 19:25:37.148131 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:37.148098 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f14825d_bc77_4c61_9be4_8a25e8c7134b.slice/crio-0b1cc10d457e2f4e8176f42ef9d5b877dce402ffbebca2b313b3677ea2a2df19 WatchSource:0}: Error finding container 0b1cc10d457e2f4e8176f42ef9d5b877dce402ffbebca2b313b3677ea2a2df19: Status 404 returned error can't find the container with id 0b1cc10d457e2f4e8176f42ef9d5b877dce402ffbebca2b313b3677ea2a2df19 Apr 20 19:25:37.960133 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:37.959862 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9d99g" Apr 20 19:25:37.960952 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:37.959925 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnw57" Apr 20 19:25:37.960952 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:37.960176 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9d99g" podUID="e48bd3bb-a360-42e2-bee7-064799310567" Apr 20 19:25:37.960952 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:37.960223 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bnw57" podUID="045e6638-b662-4970-88a9-8a02de6ca547" Apr 20 19:25:38.031951 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:38.031753 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jfcc7" event={"ID":"a587ed5a-1c69-439a-a673-9e3e5479ec27","Type":"ContainerStarted","Data":"295559f3c697588eeab7805a79e37c4fdb642860b44e8e357fd1bb65d2178276"} Apr 20 19:25:38.033337 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:38.033308 2570 generic.go:358] "Generic (PLEG): container finished" podID="414b2f51-6741-47fc-9528-f08b5a635fba" containerID="fdea862613c7056407721c94bcd73aa6dff6bd78f72b12ffd606dbed2ee9965a" exitCode=0 Apr 20 19:25:38.033461 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:38.033397 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p5rrh" event={"ID":"414b2f51-6741-47fc-9528-f08b5a635fba","Type":"ContainerDied","Data":"fdea862613c7056407721c94bcd73aa6dff6bd78f72b12ffd606dbed2ee9965a"} Apr 20 19:25:38.035065 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:38.034957 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-s2wnw" event={"ID":"7b91853c-4ad1-4d29-b26c-3742343d8630","Type":"ContainerStarted","Data":"5492adfe587835bb06763193e9323688df9559237455636919e3c2db5d55cc0a"} Apr 20 19:25:38.036575 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:38.036539 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" event={"ID":"fac78af9-e092-4651-832b-71685148f129","Type":"ContainerStarted","Data":"524d910f6749fb30a159ee08cca0a452271f476efc251f153d6fab8af0aeb077"} Apr 20 19:25:38.038405 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:38.038364 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d2857" event={"ID":"d1de0681-7e7f-40b8-aef2-7bbbcf5c25e7","Type":"ContainerStarted","Data":"b40ffab0c51db92796b9284123cf67727e2eab9d0797ee7e5281692b37d6aac7"} Apr 20 19:25:38.039811 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:38.039778 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw" event={"ID":"52436de9-f8c9-44ae-b773-c086491e71ad","Type":"ContainerStarted","Data":"20f9a5388843710b4e892a4c43ad0a92dd5eda33682114a3f4099d57fe1382ce"} Apr 20 19:25:38.041855 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:38.041823 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2b57t" event={"ID":"1f14825d-bc77-4c61-9be4-8a25e8c7134b","Type":"ContainerStarted","Data":"64f29bc584b8c740d0b1c9844c698ecb992f4d71f36010663170a1c80c6ded78"} Apr 20 19:25:38.041953 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:38.041860 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-2b57t" event={"ID":"1f14825d-bc77-4c61-9be4-8a25e8c7134b","Type":"ContainerStarted","Data":"0b1cc10d457e2f4e8176f42ef9d5b877dce402ffbebca2b313b3677ea2a2df19"} Apr 20 19:25:38.044103 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:38.044061 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9rxk8_0481b01c-d63f-4503-aacf-fcdb030d79e9/ovn-acl-logging/0.log" Apr 20 19:25:38.044441 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:38.044420 2570 generic.go:358] "Generic (PLEG): container finished" podID="0481b01c-d63f-4503-aacf-fcdb030d79e9" 
containerID="a212d46bce2340bd19ea658912ad858c7f63084b15d12dbbb869f1d0a4cc5f9e" exitCode=1 Apr 20 19:25:38.044517 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:38.044454 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" event={"ID":"0481b01c-d63f-4503-aacf-fcdb030d79e9","Type":"ContainerDied","Data":"a212d46bce2340bd19ea658912ad858c7f63084b15d12dbbb869f1d0a4cc5f9e"} Apr 20 19:25:38.044517 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:38.044478 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" event={"ID":"0481b01c-d63f-4503-aacf-fcdb030d79e9","Type":"ContainerStarted","Data":"07eeb6c7ed8410bb5da7b56d50b1d4409ae5673d0ea0b4ba25d37edf4ee63a77"} Apr 20 19:25:38.063810 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:38.063773 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-2b57t" podStartSLOduration=15.063761363 podStartE2EDuration="15.063761363s" podCreationTimestamp="2026-04-20 19:25:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:25:38.063540954 +0000 UTC m=+20.755549893" watchObservedRunningTime="2026-04-20 19:25:38.063761363 +0000 UTC m=+20.755770302" Apr 20 19:25:38.063977 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:38.063957 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jfcc7" podStartSLOduration=3.550138846 podStartE2EDuration="20.063952867s" podCreationTimestamp="2026-04-20 19:25:18 +0000 UTC" firstStartedPulling="2026-04-20 19:25:20.703180789 +0000 UTC m=+3.395189716" lastFinishedPulling="2026-04-20 19:25:37.216994819 +0000 UTC m=+19.909003737" observedRunningTime="2026-04-20 19:25:38.049549626 +0000 UTC m=+20.741558567" watchObservedRunningTime="2026-04-20 19:25:38.063952867 +0000 UTC m=+20.755961806" Apr 20 19:25:38.082664 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:38.081322 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-s2wnw" podStartSLOduration=3.597813582 podStartE2EDuration="20.081304591s" podCreationTimestamp="2026-04-20 19:25:18 +0000 UTC" firstStartedPulling="2026-04-20 19:25:20.696186381 +0000 UTC m=+3.388195315" lastFinishedPulling="2026-04-20 19:25:37.179677389 +0000 UTC m=+19.871686324" observedRunningTime="2026-04-20 19:25:38.080987899 +0000 UTC m=+20.772996833" watchObservedRunningTime="2026-04-20 19:25:38.081304591 +0000 UTC m=+20.773313522" Apr 20 19:25:38.094405 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:38.094353 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-d2857" podStartSLOduration=8.001188921 podStartE2EDuration="20.09433986s" podCreationTimestamp="2026-04-20 19:25:18 +0000 UTC" firstStartedPulling="2026-04-20 19:25:20.702159726 +0000 UTC m=+3.394168643" lastFinishedPulling="2026-04-20 19:25:32.795310662 +0000 UTC m=+15.487319582" observedRunningTime="2026-04-20 19:25:38.093770591 +0000 UTC m=+20.785779540" watchObservedRunningTime="2026-04-20 19:25:38.09433986 +0000 UTC m=+20.786348799" Apr 20 19:25:38.158065 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:38.158022 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-tnmw5" podStartSLOduration=3.6804710370000002 podStartE2EDuration="20.158007743s" 
podCreationTimestamp="2026-04-20 19:25:18 +0000 UTC" firstStartedPulling="2026-04-20 19:25:20.704168246 +0000 UTC m=+3.396177164" lastFinishedPulling="2026-04-20 19:25:37.181704936 +0000 UTC m=+19.873713870" observedRunningTime="2026-04-20 19:25:38.157434596 +0000 UTC m=+20.849443535" watchObservedRunningTime="2026-04-20 19:25:38.158007743 +0000 UTC m=+20.850016682" Apr 20 19:25:38.884337 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:38.884313 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 19:25:38.921722 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:38.921625 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T19:25:38.884333578Z","UUID":"a5e02850-3508-4463-9a76-9257875b95f7","Handler":null,"Name":"","Endpoint":""} Apr 20 19:25:38.925091 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:38.925065 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 19:25:38.925091 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:38.925098 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 19:25:38.958724 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:38.958694 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cvbdc" Apr 20 19:25:38.958875 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:38.958807 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cvbdc" podUID="220904c2-fd82-44e5-9904-33aeca86dcee" Apr 20 19:25:39.049455 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:39.049372 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9rxk8_0481b01c-d63f-4503-aacf-fcdb030d79e9/ovn-acl-logging/0.log" Apr 20 19:25:39.049876 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:39.049778 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" event={"ID":"0481b01c-d63f-4503-aacf-fcdb030d79e9","Type":"ContainerStarted","Data":"1a150d220593d40276d070ab028eaa52f4800de122aedaa08628f8387f991ce4"} Apr 20 19:25:39.049876 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:39.049814 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" event={"ID":"0481b01c-d63f-4503-aacf-fcdb030d79e9","Type":"ContainerStarted","Data":"293efa63747482d9debf7bfa7b929dbc0787a308d712369f7f9ac3e41eb32e54"} Apr 20 19:25:39.049876 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:39.049829 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" event={"ID":"0481b01c-d63f-4503-aacf-fcdb030d79e9","Type":"ContainerStarted","Data":"0cb8fac24cf3b5f6dc6f7a75e0b7e401971d15cc6b5ba9de204893b16cc8af95"} Apr 20 19:25:39.049876 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:39.049841 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" event={"ID":"0481b01c-d63f-4503-aacf-fcdb030d79e9","Type":"ContainerStarted","Data":"48bd1eec005291bb946aac0b62d40624420a7b21382a7f642a0908e7b6958d7d"} Apr 20 19:25:39.051480 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:39.051457 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw" event={"ID":"52436de9-f8c9-44ae-b773-c086491e71ad","Type":"ContainerStarted","Data":"5027b35842a8a246036b67b9a736b2652379416733f76d1ed1c65c14029f9c4c"} Apr 20 19:25:39.052957 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:39.052882 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-v9jk7" event={"ID":"5a9f2bb3-2690-4fd5-a45a-562b7167cd70","Type":"ContainerStarted","Data":"4816773d22ae056127bb3822c806362e11d9fdcd9d25eb9f64d72616f7eb608b"} Apr 20 19:25:39.066543 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:39.066498 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-v9jk7" podStartSLOduration=4.584406125 podStartE2EDuration="21.066484431s" podCreationTimestamp="2026-04-20 19:25:18 +0000 UTC" firstStartedPulling="2026-04-20 19:25:20.697048278 +0000 UTC m=+3.389057196" lastFinishedPulling="2026-04-20 19:25:37.179126569 +0000 UTC m=+19.871135502" observedRunningTime="2026-04-20 19:25:39.066473939 +0000 UTC m=+21.758482877" watchObservedRunningTime="2026-04-20 19:25:39.066484431 +0000 UTC m=+21.758493367" Apr 20 19:25:39.958606 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:39.958344 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9d99g" Apr 20 19:25:39.958778 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:39.958465 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnw57" Apr 20 19:25:39.958778 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:39.958737 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9d99g" podUID="e48bd3bb-a360-42e2-bee7-064799310567" Apr 20 19:25:39.958885 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:39.958799 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnw57" podUID="045e6638-b662-4970-88a9-8a02de6ca547" Apr 20 19:25:40.057049 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:40.056970 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw" event={"ID":"52436de9-f8c9-44ae-b773-c086491e71ad","Type":"ContainerStarted","Data":"12799c3cdb0e9b252ea4fe8c7e644e43f85f419da7998d3f9a56710c99a2c1ea"} Apr 20 19:25:40.087936 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:40.087872 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-6vmcw" podStartSLOduration=2.909321709 podStartE2EDuration="22.087854231s" podCreationTimestamp="2026-04-20 19:25:18 +0000 UTC" firstStartedPulling="2026-04-20 19:25:20.699973698 +0000 UTC m=+3.391982619" lastFinishedPulling="2026-04-20 19:25:39.87850621 +0000 UTC m=+22.570515141" observedRunningTime="2026-04-20 19:25:40.087553236 +0000 UTC m=+22.779562200" watchObservedRunningTime="2026-04-20 19:25:40.087854231 +0000 UTC m=+22.779863170" Apr 20 19:25:40.958419 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:40.958384 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cvbdc" Apr 20 19:25:40.958666 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:40.958535 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cvbdc" podUID="220904c2-fd82-44e5-9904-33aeca86dcee" Apr 20 19:25:41.062207 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:41.062176 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9rxk8_0481b01c-d63f-4503-aacf-fcdb030d79e9/ovn-acl-logging/0.log" Apr 20 19:25:41.062810 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:41.062574 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" event={"ID":"0481b01c-d63f-4503-aacf-fcdb030d79e9","Type":"ContainerStarted","Data":"9934395bccf3e8b171e00f19c58a2fe91a9cbc87549d4236843857cd05f29d5d"} Apr 20 19:25:41.580216 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:41.580180 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-s2wnw" Apr 20 19:25:41.580928 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:41.580909 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-s2wnw" Apr 20 19:25:41.958743 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:41.958708 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnw57" Apr 20 19:25:41.958927 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:41.958754 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9d99g" Apr 20 19:25:41.958927 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:41.958838 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnw57" podUID="045e6638-b662-4970-88a9-8a02de6ca547" Apr 20 19:25:41.959052 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:41.958982 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9d99g" podUID="e48bd3bb-a360-42e2-bee7-064799310567" Apr 20 19:25:42.558627 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:42.558593 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e48bd3bb-a360-42e2-bee7-064799310567-original-pull-secret\") pod \"global-pull-secret-syncer-9d99g\" (UID: \"e48bd3bb-a360-42e2-bee7-064799310567\") " pod="kube-system/global-pull-secret-syncer-9d99g" Apr 20 19:25:42.559089 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:42.558724 2570 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 19:25:42.559089 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:42.558803 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e48bd3bb-a360-42e2-bee7-064799310567-original-pull-secret podName:e48bd3bb-a360-42e2-bee7-064799310567 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:58.558782866 +0000 UTC m=+41.250791807 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/e48bd3bb-a360-42e2-bee7-064799310567-original-pull-secret") pod "global-pull-secret-syncer-9d99g" (UID: "e48bd3bb-a360-42e2-bee7-064799310567") : object "kube-system"/"original-pull-secret" not registered Apr 20 19:25:42.958897 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:42.958553 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cvbdc" Apr 20 19:25:42.959204 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:42.958958 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cvbdc" podUID="220904c2-fd82-44e5-9904-33aeca86dcee" Apr 20 19:25:43.043612 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:43.043581 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-s2wnw" Apr 20 19:25:43.044281 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:43.044263 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-s2wnw" Apr 20 19:25:43.069714 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:43.068988 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9rxk8_0481b01c-d63f-4503-aacf-fcdb030d79e9/ovn-acl-logging/0.log" Apr 20 19:25:43.070199 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:43.070167 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" event={"ID":"0481b01c-d63f-4503-aacf-fcdb030d79e9","Type":"ContainerStarted","Data":"5283cf2a092789373ddafc11d4a0da84ac7c4610f0eac126c188bc34ae944c3b"} Apr 20 19:25:43.070549 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:43.070523 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:43.070782 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:43.070762 2570 scope.go:117] "RemoveContainer" containerID="a212d46bce2340bd19ea658912ad858c7f63084b15d12dbbb869f1d0a4cc5f9e" Apr 20 19:25:43.071922 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:43.071897 2570 generic.go:358] "Generic (PLEG): container finished" podID="414b2f51-6741-47fc-9528-f08b5a635fba" containerID="e33f1d40f272ff00ae2b7738eea762564deb4a327dfffd696c6dedee1748d554" exitCode=0 Apr 20 19:25:43.072034 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:43.071963 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p5rrh" event={"ID":"414b2f51-6741-47fc-9528-f08b5a635fba","Type":"ContainerDied","Data":"e33f1d40f272ff00ae2b7738eea762564deb4a327dfffd696c6dedee1748d554"} Apr 20 19:25:43.087332 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:43.087251 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:43.958214 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:43.958186 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9d99g" Apr 20 19:25:43.958752 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:43.958287 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9d99g" podUID="e48bd3bb-a360-42e2-bee7-064799310567" Apr 20 19:25:43.958752 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:43.958362 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnw57" Apr 20 19:25:43.958752 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:43.958445 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnw57" podUID="045e6638-b662-4970-88a9-8a02de6ca547" Apr 20 19:25:44.076840 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:44.076815 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9rxk8_0481b01c-d63f-4503-aacf-fcdb030d79e9/ovn-acl-logging/0.log" Apr 20 19:25:44.077190 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:44.077170 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" event={"ID":"0481b01c-d63f-4503-aacf-fcdb030d79e9","Type":"ContainerStarted","Data":"aa138a858f28b47b227ff1046f068e124fe31328c4f7047168dffa146058d8a9"} Apr 20 19:25:44.077277 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:44.077261 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 20 19:25:44.077585 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:44.077569 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:44.092266 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:44.092239 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:44.126222 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:44.126173 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" podStartSLOduration=9.21512417 podStartE2EDuration="26.126158578s" podCreationTimestamp="2026-04-20 19:25:18 +0000 UTC" firstStartedPulling="2026-04-20 19:25:20.706753233 +0000 UTC m=+3.398762150" lastFinishedPulling="2026-04-20 19:25:37.617787627 +0000 UTC m=+20.309796558" observedRunningTime="2026-04-20 19:25:44.12613979 +0000 UTC m=+26.818148740" watchObservedRunningTime="2026-04-20 19:25:44.126158578 +0000 UTC m=+26.818167514" Apr 20 19:25:44.958686 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:44.958658 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-cvbdc" Apr 20 19:25:44.959037 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:44.958764 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cvbdc" podUID="220904c2-fd82-44e5-9904-33aeca86dcee" Apr 20 19:25:45.080537 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:45.080501 2570 generic.go:358] "Generic (PLEG): container finished" podID="414b2f51-6741-47fc-9528-f08b5a635fba" containerID="3c2a193e42186a7345ebd77968f9ed27f0ada45947986dc7444052e7ab11cedd" exitCode=0 Apr 20 19:25:45.080709 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:45.080594 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p5rrh" event={"ID":"414b2f51-6741-47fc-9528-f08b5a635fba","Type":"ContainerDied","Data":"3c2a193e42186a7345ebd77968f9ed27f0ada45947986dc7444052e7ab11cedd"} Apr 20 19:25:45.080836 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:45.080822 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 20 19:25:45.142380 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:45.142344 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cvbdc"] Apr 20 19:25:45.142542 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:45.142469 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cvbdc" Apr 20 19:25:45.142627 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:45.142606 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cvbdc" podUID="220904c2-fd82-44e5-9904-33aeca86dcee" Apr 20 19:25:45.152700 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:45.152669 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bnw57"] Apr 20 19:25:45.152836 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:45.152774 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnw57" Apr 20 19:25:45.152871 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:45.152850 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnw57" podUID="045e6638-b662-4970-88a9-8a02de6ca547" Apr 20 19:25:45.162410 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:45.162383 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9d99g"] Apr 20 19:25:45.162509 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:45.162492 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9d99g" Apr 20 19:25:45.162601 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:45.162585 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9d99g" podUID="e48bd3bb-a360-42e2-bee7-064799310567" Apr 20 19:25:46.094792 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:46.094657 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 20 19:25:46.958646 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:46.958429 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9d99g" Apr 20 19:25:46.958808 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:46.958485 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnw57" Apr 20 19:25:46.958808 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:46.958722 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9d99g" podUID="e48bd3bb-a360-42e2-bee7-064799310567" Apr 20 19:25:46.958808 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:46.958491 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cvbdc" Apr 20 19:25:46.958964 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:46.958793 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnw57" podUID="045e6638-b662-4970-88a9-8a02de6ca547" Apr 20 19:25:46.958964 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:46.958914 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-cvbdc" podUID="220904c2-fd82-44e5-9904-33aeca86dcee" Apr 20 19:25:47.085979 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:47.085943 2570 generic.go:358] "Generic (PLEG): container finished" podID="414b2f51-6741-47fc-9528-f08b5a635fba" containerID="03e46ff83567a17be0a10eac1e56264e710762590ce676528a9a67772d89474a" exitCode=0 Apr 20 19:25:47.086117 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:47.085992 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p5rrh" event={"ID":"414b2f51-6741-47fc-9528-f08b5a635fba","Type":"ContainerDied","Data":"03e46ff83567a17be0a10eac1e56264e710762590ce676528a9a67772d89474a"} Apr 20 19:25:47.549552 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:47.549517 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:25:47.550318 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:47.550285 2570 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 20 19:25:47.566159 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:47.566112 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" podUID="0481b01c-d63f-4503-aacf-fcdb030d79e9" containerName="ovnkube-controller" probeResult="failure" output="" Apr 20 19:25:47.575461 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:47.575430 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" podUID="0481b01c-d63f-4503-aacf-fcdb030d79e9" containerName="ovnkube-controller" probeResult="failure" output="" Apr 20 19:25:48.958849 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:48.958811 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cvbdc" Apr 20 19:25:48.959489 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:48.958811 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9d99g" Apr 20 19:25:48.959489 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:48.958959 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cvbdc" podUID="220904c2-fd82-44e5-9904-33aeca86dcee" Apr 20 19:25:48.959489 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:48.958811 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnw57" Apr 20 19:25:48.959489 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:48.959014 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-9d99g" podUID="e48bd3bb-a360-42e2-bee7-064799310567" Apr 20 19:25:48.959489 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:48.959080 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-bnw57" podUID="045e6638-b662-4970-88a9-8a02de6ca547" Apr 20 19:25:50.958390 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:50.958354 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9d99g" Apr 20 19:25:50.958390 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:50.958395 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cvbdc" Apr 20 19:25:50.959008 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:50.958354 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnw57" Apr 20 19:25:50.959008 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:50.958489 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9d99g" podUID="e48bd3bb-a360-42e2-bee7-064799310567" Apr 20 19:25:50.959008 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:50.958586 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-cvbdc" podUID="220904c2-fd82-44e5-9904-33aeca86dcee" Apr 20 19:25:50.959008 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:50.958684 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-bnw57" podUID="045e6638-b662-4970-88a9-8a02de6ca547" Apr 20 19:25:51.625731 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.625672 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs\") pod \"network-metrics-daemon-cvbdc\" (UID: \"220904c2-fd82-44e5-9904-33aeca86dcee\") " pod="openshift-multus/network-metrics-daemon-cvbdc" Apr 20 19:25:51.625944 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:51.625868 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:25:51.626012 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:51.625962 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs podName:220904c2-fd82-44e5-9904-33aeca86dcee nodeName:}" failed. No retries permitted until 2026-04-20 19:26:23.62593985 +0000 UTC m=+66.317948767 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs") pod "network-metrics-daemon-cvbdc" (UID: "220904c2-fd82-44e5-9904-33aeca86dcee") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 19:25:51.645795 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.645766 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-159.ec2.internal" event="NodeReady" Apr 20 19:25:51.645945 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.645918 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 20 19:25:51.693629 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.693595 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-54bd76d48b-p874f"] Apr 20 19:25:51.716599 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.716555 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-v8fl7"] Apr 20 19:25:51.716751 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.716674 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:51.719800 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.719778 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 20 19:25:51.719800 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.719793 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 20 19:25:51.719961 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.719893 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-8vchn\"" Apr 20 19:25:51.719999 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.719967 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 20 19:25:51.726131 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.726047 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cx8f6\" (UniqueName: \"kubernetes.io/projected/045e6638-b662-4970-88a9-8a02de6ca547-kube-api-access-cx8f6\") pod \"network-check-target-bnw57\" (UID: \"045e6638-b662-4970-88a9-8a02de6ca547\") " pod="openshift-network-diagnostics/network-check-target-bnw57" Apr 20 19:25:51.726518 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:51.726178 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 19:25:51.726518 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:51.726198 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 19:25:51.726518 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:51.726212 2570 projected.go:194] Error preparing data for projected volume kube-api-access-cx8f6 for pod openshift-network-diagnostics/network-check-target-bnw57: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:25:51.726518 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:51.726265 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/045e6638-b662-4970-88a9-8a02de6ca547-kube-api-access-cx8f6 podName:045e6638-b662-4970-88a9-8a02de6ca547 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:23.726248378 +0000 UTC m=+66.418257296 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cx8f6" (UniqueName: "kubernetes.io/projected/045e6638-b662-4970-88a9-8a02de6ca547-kube-api-access-cx8f6") pod "network-check-target-bnw57" (UID: "045e6638-b662-4970-88a9-8a02de6ca547") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 19:25:51.730619 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.730593 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-v8fl7"] Apr 20 19:25:51.730799 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.730627 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-tf6mk"] Apr 20 19:25:51.731764 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.731505 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 20 19:25:51.751769 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.751742 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-54bd76d48b-p874f"] Apr 20 19:25:51.751769 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.751773 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-s4kbs"] Apr 20 19:25:51.751958 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.751879 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-v8fl7" Apr 20 19:25:51.752046 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.752019 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tf6mk" Apr 20 19:25:51.754997 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.754979 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 20 19:25:51.755158 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.754996 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 20 19:25:51.755238 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.755159 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-hptb4\"" Apr 20 19:25:51.755485 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.755469 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5vnpt\"" Apr 20 19:25:51.757629 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.757608 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 20 19:25:51.760873 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.760765 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 20 19:25:51.767280 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.767259 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tf6mk"] Apr 20 19:25:51.767280 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.767292 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-s4kbs"] Apr 20 19:25:51.767418 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.767382 2570 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-s4kbs" Apr 20 19:25:51.775577 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.775540 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 20 19:25:51.775661 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.775546 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 20 19:25:51.775965 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.775950 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 20 19:25:51.778402 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.778385 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sbf4z\"" Apr 20 19:25:51.826682 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.826636 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-v8fl7\" (UID: \"667e99fd-c507-4e05-a425-bda15ee82168\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-v8fl7" Apr 20 19:25:51.826682 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.826681 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-installation-pull-secrets\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:51.826915 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.826771 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-certificates\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:51.826915 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.826806 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24fgm\" (UniqueName: \"kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-kube-api-access-24fgm\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:51.826915 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.826852 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-trusted-ca\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:51.826915 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.826882 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-ca-trust-extracted\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:51.826915 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.826909 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-config-volume\") pod \"dns-default-tf6mk\" (UID: \"1c8d3c86-b40b-483d-8c21-32ed7bd9f45e\") " pod="openshift-dns/dns-default-tf6mk" Apr 20 19:25:51.827121 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.826972 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-image-registry-private-configuration\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:51.827121 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.827004 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:51.827121 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.827030 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-bound-sa-token\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:51.827121 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.827057 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-tmp-dir\") pod \"dns-default-tf6mk\" (UID: \"1c8d3c86-b40b-483d-8c21-32ed7bd9f45e\") " pod="openshift-dns/dns-default-tf6mk" Apr 20 19:25:51.827121 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.827088 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/667e99fd-c507-4e05-a425-bda15ee82168-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-v8fl7\" (UID: \"667e99fd-c507-4e05-a425-bda15ee82168\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-v8fl7" Apr 20 19:25:51.827121 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.827114 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct2hs\" (UniqueName: \"kubernetes.io/projected/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-kube-api-access-ct2hs\") pod \"dns-default-tf6mk\" (UID: \"1c8d3c86-b40b-483d-8c21-32ed7bd9f45e\") " pod="openshift-dns/dns-default-tf6mk" Apr 20 19:25:51.827347 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.827142 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-metrics-tls\") pod \"dns-default-tf6mk\" (UID: \"1c8d3c86-b40b-483d-8c21-32ed7bd9f45e\") " pod="openshift-dns/dns-default-tf6mk" Apr 20 19:25:51.928500 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.928409 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-v8fl7\" (UID: \"667e99fd-c507-4e05-a425-bda15ee82168\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-v8fl7" Apr 20 19:25:51.928500 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.928455 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-installation-pull-secrets\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:51.928500 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.928490 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6d6mx\" (UniqueName: \"kubernetes.io/projected/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-kube-api-access-6d6mx\") pod \"ingress-canary-s4kbs\" (UID: \"cc2bb5fd-130e-4be9-a03f-eb9ac877ee23\") " pod="openshift-ingress-canary/ingress-canary-s4kbs" Apr 20 19:25:51.928785 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.928527 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-certificates\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:51.928785 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.928554 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-24fgm\" (UniqueName: \"kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-kube-api-access-24fgm\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:51.928785 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:51.928593 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 19:25:51.928785 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.928649 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-trusted-ca\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:51.928785 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:51.928670 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert podName:667e99fd-c507-4e05-a425-bda15ee82168 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:52.428650726 +0000 UTC m=+35.120659665 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-v8fl7" (UID: "667e99fd-c507-4e05-a425-bda15ee82168") : secret "networking-console-plugin-cert" not found Apr 20 19:25:51.928785 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.928702 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-ca-trust-extracted\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:51.928785 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.928732 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-config-volume\") pod \"dns-default-tf6mk\" (UID: \"1c8d3c86-b40b-483d-8c21-32ed7bd9f45e\") " pod="openshift-dns/dns-default-tf6mk" Apr 20 19:25:51.928785 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.928786 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-image-registry-private-configuration\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:51.929091 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.928815 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:51.929091 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.928838 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-bound-sa-token\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:51.929091 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.928859 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-tmp-dir\") pod \"dns-default-tf6mk\" (UID: \"1c8d3c86-b40b-483d-8c21-32ed7bd9f45e\") " pod="openshift-dns/dns-default-tf6mk" Apr 20 19:25:51.929091 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.928880 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/667e99fd-c507-4e05-a425-bda15ee82168-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-v8fl7\" (UID: \"667e99fd-c507-4e05-a425-bda15ee82168\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-v8fl7" Apr 20 19:25:51.929091 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.928900 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ct2hs\" (UniqueName: 
\"kubernetes.io/projected/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-kube-api-access-ct2hs\") pod \"dns-default-tf6mk\" (UID: \"1c8d3c86-b40b-483d-8c21-32ed7bd9f45e\") " pod="openshift-dns/dns-default-tf6mk" Apr 20 19:25:51.929091 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.928918 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-cert\") pod \"ingress-canary-s4kbs\" (UID: \"cc2bb5fd-130e-4be9-a03f-eb9ac877ee23\") " pod="openshift-ingress-canary/ingress-canary-s4kbs" Apr 20 19:25:51.929091 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.928944 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-metrics-tls\") pod \"dns-default-tf6mk\" (UID: \"1c8d3c86-b40b-483d-8c21-32ed7bd9f45e\") " pod="openshift-dns/dns-default-tf6mk" Apr 20 19:25:51.929091 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:51.929056 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:25:51.929440 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:51.929100 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-metrics-tls podName:1c8d3c86-b40b-483d-8c21-32ed7bd9f45e nodeName:}" failed. No retries permitted until 2026-04-20 19:25:52.429084638 +0000 UTC m=+35.121093555 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-metrics-tls") pod "dns-default-tf6mk" (UID: "1c8d3c86-b40b-483d-8c21-32ed7bd9f45e") : secret "dns-default-metrics-tls" not found Apr 20 19:25:51.929440 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.929310 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-tmp-dir\") pod \"dns-default-tf6mk\" (UID: \"1c8d3c86-b40b-483d-8c21-32ed7bd9f45e\") " pod="openshift-dns/dns-default-tf6mk" Apr 20 19:25:51.929552 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:51.929503 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 19:25:51.929552 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:51.929522 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54bd76d48b-p874f: secret "image-registry-tls" not found Apr 20 19:25:51.929687 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.929583 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-certificates\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:51.929687 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:51.929602 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls podName:f8fe2540-8adc-4eff-9a2b-d4fb05979bbd nodeName:}" failed. No retries permitted until 2026-04-20 19:25:52.429584112 +0000 UTC m=+35.121593036 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls") pod "image-registry-54bd76d48b-p874f" (UID: "f8fe2540-8adc-4eff-9a2b-d4fb05979bbd") : secret "image-registry-tls" not found Apr 20 19:25:51.929687 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.929588 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-config-volume\") pod \"dns-default-tf6mk\" (UID: \"1c8d3c86-b40b-483d-8c21-32ed7bd9f45e\") " pod="openshift-dns/dns-default-tf6mk" Apr 20 19:25:51.929839 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.929811 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-ca-trust-extracted\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:51.929883 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.929841 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-trusted-ca\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:51.929949 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.929875 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/667e99fd-c507-4e05-a425-bda15ee82168-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-v8fl7\" (UID: \"667e99fd-c507-4e05-a425-bda15ee82168\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-v8fl7" Apr 20 19:25:51.934529 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.934506 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-image-registry-private-configuration\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:51.934679 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.934508 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-installation-pull-secrets\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:51.940911 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.940887 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-24fgm\" (UniqueName: \"kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-kube-api-access-24fgm\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:51.944574 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.944534 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct2hs\" (UniqueName: \"kubernetes.io/projected/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-kube-api-access-ct2hs\") pod 
\"dns-default-tf6mk\" (UID: \"1c8d3c86-b40b-483d-8c21-32ed7bd9f45e\") " pod="openshift-dns/dns-default-tf6mk" Apr 20 19:25:51.948009 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:51.947975 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-bound-sa-token\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:52.029363 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:52.029322 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-cert\") pod \"ingress-canary-s4kbs\" (UID: \"cc2bb5fd-130e-4be9-a03f-eb9ac877ee23\") " pod="openshift-ingress-canary/ingress-canary-s4kbs" Apr 20 19:25:52.030001 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:52.029428 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6d6mx\" (UniqueName: \"kubernetes.io/projected/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-kube-api-access-6d6mx\") pod \"ingress-canary-s4kbs\" (UID: \"cc2bb5fd-130e-4be9-a03f-eb9ac877ee23\") " pod="openshift-ingress-canary/ingress-canary-s4kbs" Apr 20 19:25:52.030001 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:52.029484 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:25:52.030001 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:52.029578 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-cert podName:cc2bb5fd-130e-4be9-a03f-eb9ac877ee23 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:52.529537419 +0000 UTC m=+35.221546342 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-cert") pod "ingress-canary-s4kbs" (UID: "cc2bb5fd-130e-4be9-a03f-eb9ac877ee23") : secret "canary-serving-cert" not found Apr 20 19:25:52.043313 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:52.043281 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6d6mx\" (UniqueName: \"kubernetes.io/projected/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-kube-api-access-6d6mx\") pod \"ingress-canary-s4kbs\" (UID: \"cc2bb5fd-130e-4be9-a03f-eb9ac877ee23\") " pod="openshift-ingress-canary/ingress-canary-s4kbs" Apr 20 19:25:52.433638 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:52.433596 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-v8fl7\" (UID: \"667e99fd-c507-4e05-a425-bda15ee82168\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-v8fl7" Apr 20 19:25:52.433848 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:52.433694 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:52.433848 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:52.433729 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-metrics-tls\") pod \"dns-default-tf6mk\" (UID: \"1c8d3c86-b40b-483d-8c21-32ed7bd9f45e\") " pod="openshift-dns/dns-default-tf6mk" Apr 20 19:25:52.433848 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:52.433748 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 19:25:52.433848 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:52.433828 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert podName:667e99fd-c507-4e05-a425-bda15ee82168 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:53.433806751 +0000 UTC m=+36.125815711 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-v8fl7" (UID: "667e99fd-c507-4e05-a425-bda15ee82168") : secret "networking-console-plugin-cert" not found Apr 20 19:25:52.433848 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:52.433834 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:25:52.433848 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:52.433841 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 19:25:52.434109 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:52.433861 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54bd76d48b-p874f: secret "image-registry-tls" not found Apr 20 19:25:52.434109 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:52.433884 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-metrics-tls podName:1c8d3c86-b40b-483d-8c21-32ed7bd9f45e nodeName:}" failed. No retries permitted until 2026-04-20 19:25:53.433868719 +0000 UTC m=+36.125877657 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-metrics-tls") pod "dns-default-tf6mk" (UID: "1c8d3c86-b40b-483d-8c21-32ed7bd9f45e") : secret "dns-default-metrics-tls" not found Apr 20 19:25:52.434109 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:52.433912 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls podName:f8fe2540-8adc-4eff-9a2b-d4fb05979bbd nodeName:}" failed. No retries permitted until 2026-04-20 19:25:53.433889899 +0000 UTC m=+36.125898830 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls") pod "image-registry-54bd76d48b-p874f" (UID: "f8fe2540-8adc-4eff-9a2b-d4fb05979bbd") : secret "image-registry-tls" not found Apr 20 19:25:52.534316 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:52.534275 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-cert\") pod \"ingress-canary-s4kbs\" (UID: \"cc2bb5fd-130e-4be9-a03f-eb9ac877ee23\") " pod="openshift-ingress-canary/ingress-canary-s4kbs" Apr 20 19:25:52.534502 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:52.534441 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:25:52.534590 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:52.534516 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-cert podName:cc2bb5fd-130e-4be9-a03f-eb9ac877ee23 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:53.534496444 +0000 UTC m=+36.226505381 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-cert") pod "ingress-canary-s4kbs" (UID: "cc2bb5fd-130e-4be9-a03f-eb9ac877ee23") : secret "canary-serving-cert" not found Apr 20 19:25:52.958478 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:52.958320 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cvbdc" Apr 20 19:25:52.958648 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:52.958320 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnw57" Apr 20 19:25:52.958706 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:52.958320 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9d99g" Apr 20 19:25:52.962907 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:52.962883 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 19:25:52.963075 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:52.962909 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 19:25:52.963075 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:52.962909 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 19:25:52.963075 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:52.962909 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-v8fpn\"" Apr 20 19:25:52.963075 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:52.962964 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-89kqx\"" Apr 20 19:25:52.963075 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:52.963025 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 19:25:53.099576 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:53.099508 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p5rrh" event={"ID":"414b2f51-6741-47fc-9528-f08b5a635fba","Type":"ContainerStarted","Data":"b9a024f16e0aa56151ad76e0ae9e37a64025ae3d053f21a14c73f83af00473dc"} Apr 20 19:25:53.442147 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:53.442055 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-metrics-tls\") pod \"dns-default-tf6mk\" (UID: \"1c8d3c86-b40b-483d-8c21-32ed7bd9f45e\") " pod="openshift-dns/dns-default-tf6mk" Apr 20 19:25:53.442147 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:53.442125 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-v8fl7\" (UID: \"667e99fd-c507-4e05-a425-bda15ee82168\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-v8fl7" Apr 20 19:25:53.442384 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:53.442203 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: 
secret "dns-default-metrics-tls" not found Apr 20 19:25:53.442384 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:53.442259 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-metrics-tls podName:1c8d3c86-b40b-483d-8c21-32ed7bd9f45e nodeName:}" failed. No retries permitted until 2026-04-20 19:25:55.44224501 +0000 UTC m=+38.134253927 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-metrics-tls") pod "dns-default-tf6mk" (UID: "1c8d3c86-b40b-483d-8c21-32ed7bd9f45e") : secret "dns-default-metrics-tls" not found Apr 20 19:25:53.442384 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:53.442288 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:53.442384 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:53.442289 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 19:25:53.442384 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:53.442371 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert podName:667e99fd-c507-4e05-a425-bda15ee82168 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:55.442358815 +0000 UTC m=+38.134367748 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-v8fl7" (UID: "667e99fd-c507-4e05-a425-bda15ee82168") : secret "networking-console-plugin-cert" not found Apr 20 19:25:53.442677 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:53.442388 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 19:25:53.442677 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:53.442410 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54bd76d48b-p874f: secret "image-registry-tls" not found Apr 20 19:25:53.442677 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:53.442475 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls podName:f8fe2540-8adc-4eff-9a2b-d4fb05979bbd nodeName:}" failed. No retries permitted until 2026-04-20 19:25:55.442460399 +0000 UTC m=+38.134469315 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls") pod "image-registry-54bd76d48b-p874f" (UID: "f8fe2540-8adc-4eff-9a2b-d4fb05979bbd") : secret "image-registry-tls" not found Apr 20 19:25:53.542607 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:53.542549 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-cert\") pod \"ingress-canary-s4kbs\" (UID: \"cc2bb5fd-130e-4be9-a03f-eb9ac877ee23\") " pod="openshift-ingress-canary/ingress-canary-s4kbs" Apr 20 19:25:53.542798 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:53.542696 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:25:53.542798 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:53.542760 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-cert podName:cc2bb5fd-130e-4be9-a03f-eb9ac877ee23 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:55.542745493 +0000 UTC m=+38.234754411 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-cert") pod "ingress-canary-s4kbs" (UID: "cc2bb5fd-130e-4be9-a03f-eb9ac877ee23") : secret "canary-serving-cert" not found Apr 20 19:25:54.103622 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:54.103592 2570 generic.go:358] "Generic (PLEG): container finished" podID="414b2f51-6741-47fc-9528-f08b5a635fba" containerID="b9a024f16e0aa56151ad76e0ae9e37a64025ae3d053f21a14c73f83af00473dc" exitCode=0 Apr 20 19:25:54.104163 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:54.103648 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p5rrh" event={"ID":"414b2f51-6741-47fc-9528-f08b5a635fba","Type":"ContainerDied","Data":"b9a024f16e0aa56151ad76e0ae9e37a64025ae3d053f21a14c73f83af00473dc"} Apr 20 19:25:55.108809 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:55.108777 2570 generic.go:358] "Generic (PLEG): container finished" podID="414b2f51-6741-47fc-9528-f08b5a635fba" containerID="794dc1d3748aaf46a48f5440142d466cf391e556bab5b63cf11bee874d5189fe" exitCode=0 Apr 20 19:25:55.109161 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:55.108832 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p5rrh" event={"ID":"414b2f51-6741-47fc-9528-f08b5a635fba","Type":"ContainerDied","Data":"794dc1d3748aaf46a48f5440142d466cf391e556bab5b63cf11bee874d5189fe"} Apr 20 19:25:55.458625 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:55.458597 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:55.458789 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:55.458644 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-metrics-tls\") pod \"dns-default-tf6mk\" (UID: \"1c8d3c86-b40b-483d-8c21-32ed7bd9f45e\") " pod="openshift-dns/dns-default-tf6mk" Apr 20 
19:25:55.458789 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:55.458680 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-v8fl7\" (UID: \"667e99fd-c507-4e05-a425-bda15ee82168\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-v8fl7" Apr 20 19:25:55.458789 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:55.458734 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 19:25:55.458789 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:55.458754 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54bd76d48b-p874f: secret "image-registry-tls" not found Apr 20 19:25:55.458962 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:55.458799 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls podName:f8fe2540-8adc-4eff-9a2b-d4fb05979bbd nodeName:}" failed. No retries permitted until 2026-04-20 19:25:59.458785793 +0000 UTC m=+42.150794709 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls") pod "image-registry-54bd76d48b-p874f" (UID: "f8fe2540-8adc-4eff-9a2b-d4fb05979bbd") : secret "image-registry-tls" not found Apr 20 19:25:55.458962 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:55.458815 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:25:55.458962 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:55.458753 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 19:25:55.458962 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:55.458869 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert podName:667e99fd-c507-4e05-a425-bda15ee82168 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:59.458855832 +0000 UTC m=+42.150864748 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-v8fl7" (UID: "667e99fd-c507-4e05-a425-bda15ee82168") : secret "networking-console-plugin-cert" not found Apr 20 19:25:55.458962 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:55.458882 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-metrics-tls podName:1c8d3c86-b40b-483d-8c21-32ed7bd9f45e nodeName:}" failed. No retries permitted until 2026-04-20 19:25:59.458876426 +0000 UTC m=+42.150885343 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-metrics-tls") pod "dns-default-tf6mk" (UID: "1c8d3c86-b40b-483d-8c21-32ed7bd9f45e") : secret "dns-default-metrics-tls" not found Apr 20 19:25:55.559645 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:55.559610 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-cert\") pod \"ingress-canary-s4kbs\" (UID: \"cc2bb5fd-130e-4be9-a03f-eb9ac877ee23\") " pod="openshift-ingress-canary/ingress-canary-s4kbs" Apr 20 19:25:55.559778 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:55.559755 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:25:55.559845 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:55.559834 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-cert podName:cc2bb5fd-130e-4be9-a03f-eb9ac877ee23 nodeName:}" failed. No retries permitted until 2026-04-20 19:25:59.559820796 +0000 UTC m=+42.251829714 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-cert") pod "ingress-canary-s4kbs" (UID: "cc2bb5fd-130e-4be9-a03f-eb9ac877ee23") : secret "canary-serving-cert" not found Apr 20 19:25:56.113403 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:56.113368 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-p5rrh" event={"ID":"414b2f51-6741-47fc-9528-f08b5a635fba","Type":"ContainerStarted","Data":"faacc8d450cf6045b6c37dc7eb01cd398e71f4edadda1432250525c95b2f47d3"} Apr 20 19:25:56.139356 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:56.139304 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-p5rrh" podStartSLOduration=5.978029368 podStartE2EDuration="38.139290647s" podCreationTimestamp="2026-04-20 19:25:18 +0000 UTC" firstStartedPulling="2026-04-20 19:25:20.698574743 +0000 UTC m=+3.390583674" lastFinishedPulling="2026-04-20 19:25:52.859836031 +0000 UTC m=+35.551844953" observedRunningTime="2026-04-20 19:25:56.138225524 +0000 UTC m=+38.830234463" watchObservedRunningTime="2026-04-20 19:25:56.139290647 +0000 UTC m=+38.831299587" Apr 20 19:25:57.635371 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.635336 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd7bb95bf-cfnj9"] Apr 20 19:25:57.638138 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.638117 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd7bb95bf-cfnj9" Apr 20 19:25:57.638418 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.638295 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4d8dfcd-p5dsh"] Apr 20 19:25:57.640467 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.640445 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 20 19:25:57.640552 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.640479 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 20 19:25:57.640993 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.640980 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4d8dfcd-p5dsh" Apr 20 19:25:57.641438 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.641422 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-d5sm2\"" Apr 20 19:25:57.641978 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.641787 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 20 19:25:57.641978 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.641873 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 20 19:25:57.642635 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.642617 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t"] Apr 20 19:25:57.642921 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.642905 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 20 19:25:57.645591 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.645361 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" Apr 20 19:25:57.647905 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.647888 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 20 19:25:57.647971 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.647928 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 20 19:25:57.647971 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.647944 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 20 19:25:57.647971 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.647893 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 20 19:25:57.652615 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.652582 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd7bb95bf-cfnj9"] Apr 20 19:25:57.653397 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.653379 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4d8dfcd-p5dsh"] Apr 20 19:25:57.655507 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.655491 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t"] Apr 20 19:25:57.775771 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.775716 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptpxb\" (UniqueName: \"kubernetes.io/projected/f1d91da9-119a-49fe-968a-bcf1a2a01e5e-kube-api-access-ptpxb\") pod \"cluster-proxy-proxy-agent-867d5dddf7-bkp7t\" (UID: \"f1d91da9-119a-49fe-968a-bcf1a2a01e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" Apr 20 19:25:57.775771 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.775770 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f5c6d6e2-e6b9-4777-981d-56a8a97d3208-tmp\") pod \"klusterlet-addon-workmgr-5b4d8dfcd-p5dsh\" (UID: \"f5c6d6e2-e6b9-4777-981d-56a8a97d3208\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4d8dfcd-p5dsh" Apr 20 19:25:57.775771 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.775790 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kzn4\" (UniqueName: \"kubernetes.io/projected/25969649-2fe5-49c7-afb9-3559488fc423-kube-api-access-5kzn4\") pod \"managed-serviceaccount-addon-agent-5cd7bb95bf-cfnj9\" (UID: \"25969649-2fe5-49c7-afb9-3559488fc423\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd7bb95bf-cfnj9" Apr 20 19:25:57.776091 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.775868 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f1d91da9-119a-49fe-968a-bcf1a2a01e5e-ca\") pod \"cluster-proxy-proxy-agent-867d5dddf7-bkp7t\" 
(UID: \"f1d91da9-119a-49fe-968a-bcf1a2a01e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" Apr 20 19:25:57.776091 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.775903 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f1d91da9-119a-49fe-968a-bcf1a2a01e5e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-867d5dddf7-bkp7t\" (UID: \"f1d91da9-119a-49fe-968a-bcf1a2a01e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" Apr 20 19:25:57.776091 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.775962 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/25969649-2fe5-49c7-afb9-3559488fc423-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5cd7bb95bf-cfnj9\" (UID: \"25969649-2fe5-49c7-afb9-3559488fc423\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd7bb95bf-cfnj9" Apr 20 19:25:57.776091 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.776066 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f1d91da9-119a-49fe-968a-bcf1a2a01e5e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-867d5dddf7-bkp7t\" (UID: \"f1d91da9-119a-49fe-968a-bcf1a2a01e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" Apr 20 19:25:57.776282 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.776107 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f1d91da9-119a-49fe-968a-bcf1a2a01e5e-hub\") pod \"cluster-proxy-proxy-agent-867d5dddf7-bkp7t\" (UID: \"f1d91da9-119a-49fe-968a-bcf1a2a01e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" Apr 20 19:25:57.776282 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.776133 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/f5c6d6e2-e6b9-4777-981d-56a8a97d3208-klusterlet-config\") pod \"klusterlet-addon-workmgr-5b4d8dfcd-p5dsh\" (UID: \"f5c6d6e2-e6b9-4777-981d-56a8a97d3208\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4d8dfcd-p5dsh" Apr 20 19:25:57.776282 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.776206 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldr2r\" (UniqueName: \"kubernetes.io/projected/f5c6d6e2-e6b9-4777-981d-56a8a97d3208-kube-api-access-ldr2r\") pod \"klusterlet-addon-workmgr-5b4d8dfcd-p5dsh\" (UID: \"f5c6d6e2-e6b9-4777-981d-56a8a97d3208\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4d8dfcd-p5dsh" Apr 20 19:25:57.776282 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.776239 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f1d91da9-119a-49fe-968a-bcf1a2a01e5e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-867d5dddf7-bkp7t\" (UID: \"f1d91da9-119a-49fe-968a-bcf1a2a01e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" Apr 20 19:25:57.876972 ip-10-0-132-159 
kubenswrapper[2570]: I0420 19:25:57.876939 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldr2r\" (UniqueName: \"kubernetes.io/projected/f5c6d6e2-e6b9-4777-981d-56a8a97d3208-kube-api-access-ldr2r\") pod \"klusterlet-addon-workmgr-5b4d8dfcd-p5dsh\" (UID: \"f5c6d6e2-e6b9-4777-981d-56a8a97d3208\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4d8dfcd-p5dsh" Apr 20 19:25:57.877118 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.876977 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f1d91da9-119a-49fe-968a-bcf1a2a01e5e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-867d5dddf7-bkp7t\" (UID: \"f1d91da9-119a-49fe-968a-bcf1a2a01e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" Apr 20 19:25:57.877118 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.877000 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ptpxb\" (UniqueName: \"kubernetes.io/projected/f1d91da9-119a-49fe-968a-bcf1a2a01e5e-kube-api-access-ptpxb\") pod \"cluster-proxy-proxy-agent-867d5dddf7-bkp7t\" (UID: \"f1d91da9-119a-49fe-968a-bcf1a2a01e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" Apr 20 19:25:57.877118 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.877019 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f5c6d6e2-e6b9-4777-981d-56a8a97d3208-tmp\") pod \"klusterlet-addon-workmgr-5b4d8dfcd-p5dsh\" (UID: \"f5c6d6e2-e6b9-4777-981d-56a8a97d3208\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4d8dfcd-p5dsh" Apr 20 19:25:57.877118 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.877048 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5kzn4\" (UniqueName: \"kubernetes.io/projected/25969649-2fe5-49c7-afb9-3559488fc423-kube-api-access-5kzn4\") pod \"managed-serviceaccount-addon-agent-5cd7bb95bf-cfnj9\" (UID: \"25969649-2fe5-49c7-afb9-3559488fc423\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd7bb95bf-cfnj9" Apr 20 19:25:57.877118 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.877085 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f1d91da9-119a-49fe-968a-bcf1a2a01e5e-ca\") pod \"cluster-proxy-proxy-agent-867d5dddf7-bkp7t\" (UID: \"f1d91da9-119a-49fe-968a-bcf1a2a01e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" Apr 20 19:25:57.877118 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.877109 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f1d91da9-119a-49fe-968a-bcf1a2a01e5e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-867d5dddf7-bkp7t\" (UID: \"f1d91da9-119a-49fe-968a-bcf1a2a01e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" Apr 20 19:25:57.877401 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.877139 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/25969649-2fe5-49c7-afb9-3559488fc423-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5cd7bb95bf-cfnj9\" (UID: 
\"25969649-2fe5-49c7-afb9-3559488fc423\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd7bb95bf-cfnj9" Apr 20 19:25:57.877457 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.877421 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f1d91da9-119a-49fe-968a-bcf1a2a01e5e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-867d5dddf7-bkp7t\" (UID: \"f1d91da9-119a-49fe-968a-bcf1a2a01e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" Apr 20 19:25:57.877506 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.877464 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f5c6d6e2-e6b9-4777-981d-56a8a97d3208-tmp\") pod \"klusterlet-addon-workmgr-5b4d8dfcd-p5dsh\" (UID: \"f5c6d6e2-e6b9-4777-981d-56a8a97d3208\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4d8dfcd-p5dsh" Apr 20 19:25:57.877506 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.877473 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f1d91da9-119a-49fe-968a-bcf1a2a01e5e-hub\") pod \"cluster-proxy-proxy-agent-867d5dddf7-bkp7t\" (UID: \"f1d91da9-119a-49fe-968a-bcf1a2a01e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" Apr 20 19:25:57.877506 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.877500 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/f5c6d6e2-e6b9-4777-981d-56a8a97d3208-klusterlet-config\") pod \"klusterlet-addon-workmgr-5b4d8dfcd-p5dsh\" (UID: \"f5c6d6e2-e6b9-4777-981d-56a8a97d3208\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4d8dfcd-p5dsh" Apr 20 19:25:57.877829 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.877797 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/f1d91da9-119a-49fe-968a-bcf1a2a01e5e-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-867d5dddf7-bkp7t\" (UID: \"f1d91da9-119a-49fe-968a-bcf1a2a01e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" Apr 20 19:25:57.881431 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.881401 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/f1d91da9-119a-49fe-968a-bcf1a2a01e5e-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-867d5dddf7-bkp7t\" (UID: \"f1d91da9-119a-49fe-968a-bcf1a2a01e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" Apr 20 19:25:57.881520 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.881460 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/f1d91da9-119a-49fe-968a-bcf1a2a01e5e-hub\") pod \"cluster-proxy-proxy-agent-867d5dddf7-bkp7t\" (UID: \"f1d91da9-119a-49fe-968a-bcf1a2a01e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" Apr 20 19:25:57.881520 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.881477 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: 
\"kubernetes.io/secret/f1d91da9-119a-49fe-968a-bcf1a2a01e5e-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-867d5dddf7-bkp7t\" (UID: \"f1d91da9-119a-49fe-968a-bcf1a2a01e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" Apr 20 19:25:57.881520 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.881483 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/f5c6d6e2-e6b9-4777-981d-56a8a97d3208-klusterlet-config\") pod \"klusterlet-addon-workmgr-5b4d8dfcd-p5dsh\" (UID: \"f5c6d6e2-e6b9-4777-981d-56a8a97d3208\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4d8dfcd-p5dsh" Apr 20 19:25:57.881520 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.881507 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/f1d91da9-119a-49fe-968a-bcf1a2a01e5e-ca\") pod \"cluster-proxy-proxy-agent-867d5dddf7-bkp7t\" (UID: \"f1d91da9-119a-49fe-968a-bcf1a2a01e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" Apr 20 19:25:57.881688 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.881645 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/25969649-2fe5-49c7-afb9-3559488fc423-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5cd7bb95bf-cfnj9\" (UID: \"25969649-2fe5-49c7-afb9-3559488fc423\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd7bb95bf-cfnj9" Apr 20 19:25:57.885175 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.885149 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldr2r\" (UniqueName: \"kubernetes.io/projected/f5c6d6e2-e6b9-4777-981d-56a8a97d3208-kube-api-access-ldr2r\") pod \"klusterlet-addon-workmgr-5b4d8dfcd-p5dsh\" (UID: \"f5c6d6e2-e6b9-4777-981d-56a8a97d3208\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4d8dfcd-p5dsh" Apr 20 19:25:57.885493 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.885444 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ptpxb\" (UniqueName: \"kubernetes.io/projected/f1d91da9-119a-49fe-968a-bcf1a2a01e5e-kube-api-access-ptpxb\") pod \"cluster-proxy-proxy-agent-867d5dddf7-bkp7t\" (UID: \"f1d91da9-119a-49fe-968a-bcf1a2a01e5e\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" Apr 20 19:25:57.885552 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.885495 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kzn4\" (UniqueName: \"kubernetes.io/projected/25969649-2fe5-49c7-afb9-3559488fc423-kube-api-access-5kzn4\") pod \"managed-serviceaccount-addon-agent-5cd7bb95bf-cfnj9\" (UID: \"25969649-2fe5-49c7-afb9-3559488fc423\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd7bb95bf-cfnj9" Apr 20 19:25:57.966865 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.966835 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd7bb95bf-cfnj9" Apr 20 19:25:57.972511 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.972487 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4d8dfcd-p5dsh" Apr 20 19:25:57.977173 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:57.977149 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" Apr 20 19:25:58.134868 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:58.134841 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t"] Apr 20 19:25:58.137956 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:58.137929 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf1d91da9_119a_49fe_968a_bcf1a2a01e5e.slice/crio-e548783bf180ce4718b1ee852779c57ba8ac8077eb5f4d9a630900e5bdfb4543 WatchSource:0}: Error finding container e548783bf180ce4718b1ee852779c57ba8ac8077eb5f4d9a630900e5bdfb4543: Status 404 returned error can't find the container with id e548783bf180ce4718b1ee852779c57ba8ac8077eb5f4d9a630900e5bdfb4543 Apr 20 19:25:58.353155 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:58.353123 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd7bb95bf-cfnj9"] Apr 20 19:25:58.355402 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:58.355378 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4d8dfcd-p5dsh"] Apr 20 19:25:58.356799 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:58.356749 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod25969649_2fe5_49c7_afb9_3559488fc423.slice/crio-357bf713423ef686b55dc1c845af52b0d36366e59d2302ecbe032705f00eee5f WatchSource:0}: Error finding container 357bf713423ef686b55dc1c845af52b0d36366e59d2302ecbe032705f00eee5f: Status 404 returned error can't find the container with id 357bf713423ef686b55dc1c845af52b0d36366e59d2302ecbe032705f00eee5f Apr 20 19:25:58.359470 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:58.359445 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5c6d6e2_e6b9_4777_981d_56a8a97d3208.slice/crio-aec7d6db9ef56727269909e0ba165f4e81bc7b14ac8b093f5573c3eae91a7401 WatchSource:0}: Error finding container aec7d6db9ef56727269909e0ba165f4e81bc7b14ac8b093f5573c3eae91a7401: Status 404 returned error can't find the container with id aec7d6db9ef56727269909e0ba165f4e81bc7b14ac8b093f5573c3eae91a7401 Apr 20 19:25:58.583957 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:58.583923 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e48bd3bb-a360-42e2-bee7-064799310567-original-pull-secret\") pod \"global-pull-secret-syncer-9d99g\" (UID: \"e48bd3bb-a360-42e2-bee7-064799310567\") " pod="kube-system/global-pull-secret-syncer-9d99g" Apr 20 19:25:58.587088 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:58.587052 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/e48bd3bb-a360-42e2-bee7-064799310567-original-pull-secret\") pod \"global-pull-secret-syncer-9d99g\" (UID: \"e48bd3bb-a360-42e2-bee7-064799310567\") " pod="kube-system/global-pull-secret-syncer-9d99g" Apr 20 
19:25:58.689923 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:58.689878 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9d99g" Apr 20 19:25:58.820418 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:58.820388 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9d99g"] Apr 20 19:25:58.824060 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:25:58.824027 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode48bd3bb_a360_42e2_bee7_064799310567.slice/crio-75633a2840a34459aa3bbc0f06c701924e3c0b3a4043712094e89a799a0eb0bc WatchSource:0}: Error finding container 75633a2840a34459aa3bbc0f06c701924e3c0b3a4043712094e89a799a0eb0bc: Status 404 returned error can't find the container with id 75633a2840a34459aa3bbc0f06c701924e3c0b3a4043712094e89a799a0eb0bc Apr 20 19:25:59.122650 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:59.122608 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9d99g" event={"ID":"e48bd3bb-a360-42e2-bee7-064799310567","Type":"ContainerStarted","Data":"75633a2840a34459aa3bbc0f06c701924e3c0b3a4043712094e89a799a0eb0bc"} Apr 20 19:25:59.124056 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:59.124027 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" event={"ID":"f1d91da9-119a-49fe-968a-bcf1a2a01e5e","Type":"ContainerStarted","Data":"e548783bf180ce4718b1ee852779c57ba8ac8077eb5f4d9a630900e5bdfb4543"} Apr 20 19:25:59.125436 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:59.125409 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4d8dfcd-p5dsh" event={"ID":"f5c6d6e2-e6b9-4777-981d-56a8a97d3208","Type":"ContainerStarted","Data":"aec7d6db9ef56727269909e0ba165f4e81bc7b14ac8b093f5573c3eae91a7401"} Apr 20 19:25:59.127187 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:59.127162 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd7bb95bf-cfnj9" event={"ID":"25969649-2fe5-49c7-afb9-3559488fc423","Type":"ContainerStarted","Data":"357bf713423ef686b55dc1c845af52b0d36366e59d2302ecbe032705f00eee5f"} Apr 20 19:25:59.492619 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:59.492576 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:25:59.492813 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:59.492650 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-metrics-tls\") pod \"dns-default-tf6mk\" (UID: \"1c8d3c86-b40b-483d-8c21-32ed7bd9f45e\") " pod="openshift-dns/dns-default-tf6mk" Apr 20 19:25:59.492813 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:59.492681 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert\") pod 
\"networking-console-plugin-cb95c66f6-v8fl7\" (UID: \"667e99fd-c507-4e05-a425-bda15ee82168\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-v8fl7" Apr 20 19:25:59.492934 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:59.492856 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 19:25:59.492934 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:59.492920 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert podName:667e99fd-c507-4e05-a425-bda15ee82168 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:07.492900571 +0000 UTC m=+50.184909495 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-v8fl7" (UID: "667e99fd-c507-4e05-a425-bda15ee82168") : secret "networking-console-plugin-cert" not found Apr 20 19:25:59.493544 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:59.493353 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 19:25:59.493544 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:59.493374 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54bd76d48b-p874f: secret "image-registry-tls" not found Apr 20 19:25:59.493544 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:59.493420 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls podName:f8fe2540-8adc-4eff-9a2b-d4fb05979bbd nodeName:}" failed. No retries permitted until 2026-04-20 19:26:07.493405455 +0000 UTC m=+50.185414386 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls") pod "image-registry-54bd76d48b-p874f" (UID: "f8fe2540-8adc-4eff-9a2b-d4fb05979bbd") : secret "image-registry-tls" not found Apr 20 19:25:59.493544 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:59.493484 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:25:59.493544 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:59.493516 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-metrics-tls podName:1c8d3c86-b40b-483d-8c21-32ed7bd9f45e nodeName:}" failed. No retries permitted until 2026-04-20 19:26:07.493504161 +0000 UTC m=+50.185513084 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-metrics-tls") pod "dns-default-tf6mk" (UID: "1c8d3c86-b40b-483d-8c21-32ed7bd9f45e") : secret "dns-default-metrics-tls" not found Apr 20 19:25:59.594959 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:25:59.594196 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-cert\") pod \"ingress-canary-s4kbs\" (UID: \"cc2bb5fd-130e-4be9-a03f-eb9ac877ee23\") " pod="openshift-ingress-canary/ingress-canary-s4kbs" Apr 20 19:25:59.594959 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:59.594457 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:25:59.594959 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:25:59.594573 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-cert podName:cc2bb5fd-130e-4be9-a03f-eb9ac877ee23 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:07.594538563 +0000 UTC m=+50.286547495 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-cert") pod "ingress-canary-s4kbs" (UID: "cc2bb5fd-130e-4be9-a03f-eb9ac877ee23") : secret "canary-serving-cert" not found Apr 20 19:26:05.141912 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:05.141865 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4d8dfcd-p5dsh" event={"ID":"f5c6d6e2-e6b9-4777-981d-56a8a97d3208","Type":"ContainerStarted","Data":"c48316f4d78ee39a7af28004b86543f85b9997f0378af7a58f3e716025d2ae98"} Apr 20 19:26:05.142630 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:05.142603 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4d8dfcd-p5dsh" Apr 20 19:26:05.143368 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:05.143339 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd7bb95bf-cfnj9" event={"ID":"25969649-2fe5-49c7-afb9-3559488fc423","Type":"ContainerStarted","Data":"6f6325ae6e6334d8f73a0020e4c5947feed107a0eb7f0c07f9874e03bb5a92e3"} Apr 20 19:26:05.143935 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:05.143916 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4d8dfcd-p5dsh" Apr 20 19:26:05.144775 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:05.144754 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9d99g" event={"ID":"e48bd3bb-a360-42e2-bee7-064799310567","Type":"ContainerStarted","Data":"ee030a2ec6b01a01301c2b63e7067181140fe859dacd401e53c8d9727be48913"} Apr 20 19:26:05.145854 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:05.145836 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" event={"ID":"f1d91da9-119a-49fe-968a-bcf1a2a01e5e","Type":"ContainerStarted","Data":"2b0d19f9b1ce2b5e7cca0ccfafca7c4dd0defccaee1c1b1526aa4a9f4d08d746"} Apr 20 19:26:05.168316 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:05.168265 2570 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4d8dfcd-p5dsh" podStartSLOduration=1.8150107640000002 podStartE2EDuration="8.168252382s" podCreationTimestamp="2026-04-20 19:25:57 +0000 UTC" firstStartedPulling="2026-04-20 19:25:58.361007255 +0000 UTC m=+41.053016172" lastFinishedPulling="2026-04-20 19:26:04.714248861 +0000 UTC m=+47.406257790" observedRunningTime="2026-04-20 19:26:05.167636129 +0000 UTC m=+47.859645067" watchObservedRunningTime="2026-04-20 19:26:05.168252382 +0000 UTC m=+47.860261321" Apr 20 19:26:05.192803 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:05.192754 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-9d99g" podStartSLOduration=33.31170188 podStartE2EDuration="39.19273973s" podCreationTimestamp="2026-04-20 19:25:26 +0000 UTC" firstStartedPulling="2026-04-20 19:25:58.826121 +0000 UTC m=+41.518129924" lastFinishedPulling="2026-04-20 19:26:04.70715884 +0000 UTC m=+47.399167774" observedRunningTime="2026-04-20 19:26:05.191663071 +0000 UTC m=+47.883672010" watchObservedRunningTime="2026-04-20 19:26:05.19273973 +0000 UTC m=+47.884748670" Apr 20 19:26:05.221525 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:05.221476 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd7bb95bf-cfnj9" podStartSLOduration=1.888579332 podStartE2EDuration="8.221460952s" podCreationTimestamp="2026-04-20 19:25:57 +0000 UTC" firstStartedPulling="2026-04-20 19:25:58.358627306 +0000 UTC m=+41.050636228" lastFinishedPulling="2026-04-20 19:26:04.691508928 +0000 UTC m=+47.383517848" observedRunningTime="2026-04-20 19:26:05.217840646 +0000 UTC m=+47.909849584" watchObservedRunningTime="2026-04-20 19:26:05.221460952 +0000 UTC m=+47.913469890" Apr 20 19:26:07.570346 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:07.570250 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:26:07.570346 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:07.570303 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-metrics-tls\") pod \"dns-default-tf6mk\" (UID: \"1c8d3c86-b40b-483d-8c21-32ed7bd9f45e\") " pod="openshift-dns/dns-default-tf6mk" Apr 20 19:26:07.570346 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:07.570324 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-v8fl7\" (UID: \"667e99fd-c507-4e05-a425-bda15ee82168\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-v8fl7" Apr 20 19:26:07.570862 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:07.570362 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 19:26:07.570862 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:07.570390 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-54bd76d48b-p874f: secret "image-registry-tls" not found Apr 20 19:26:07.570862 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:07.570412 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 19:26:07.570862 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:07.570455 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls podName:f8fe2540-8adc-4eff-9a2b-d4fb05979bbd nodeName:}" failed. No retries permitted until 2026-04-20 19:26:23.570438313 +0000 UTC m=+66.262447233 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls") pod "image-registry-54bd76d48b-p874f" (UID: "f8fe2540-8adc-4eff-9a2b-d4fb05979bbd") : secret "image-registry-tls" not found Apr 20 19:26:07.570862 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:07.570458 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:26:07.570862 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:07.570471 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert podName:667e99fd-c507-4e05-a425-bda15ee82168 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:23.570464137 +0000 UTC m=+66.262473054 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-v8fl7" (UID: "667e99fd-c507-4e05-a425-bda15ee82168") : secret "networking-console-plugin-cert" not found Apr 20 19:26:07.570862 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:07.570518 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-metrics-tls podName:1c8d3c86-b40b-483d-8c21-32ed7bd9f45e nodeName:}" failed. No retries permitted until 2026-04-20 19:26:23.570498385 +0000 UTC m=+66.262507313 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-metrics-tls") pod "dns-default-tf6mk" (UID: "1c8d3c86-b40b-483d-8c21-32ed7bd9f45e") : secret "dns-default-metrics-tls" not found Apr 20 19:26:07.671510 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:07.671474 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-cert\") pod \"ingress-canary-s4kbs\" (UID: \"cc2bb5fd-130e-4be9-a03f-eb9ac877ee23\") " pod="openshift-ingress-canary/ingress-canary-s4kbs" Apr 20 19:26:07.671698 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:07.671601 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:26:07.671698 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:07.671651 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-cert podName:cc2bb5fd-130e-4be9-a03f-eb9ac877ee23 nodeName:}" failed. 
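The "Observed pod startup duration" entries from pod_startup_latency_tracker above expose a simple relationship: podStartSLOduration is podStartE2EDuration minus the image-pull window (lastFinishedPulling minus firstStartedPulling). For klusterlet-addon-workmgr: 8.168s minus (19:26:04.714 - 19:25:58.361, about 6.353s) gives about 1.815s, matching the logged value; the same holds for global-pull-secret-syncer (39.193s - 5.881s is about 33.312s). A quick back-of-the-envelope check (not kubelet's actual code; small rounding drift is expected because the log mixes wall-clock and monotonic m= readings):

package main

import (
	"fmt"
	"time"
)

func main() {
	const layout = "2006-01-02 15:04:05.000000000 -0700 MST"
	parse := func(s string) time.Time {
		t, err := time.Parse(layout, s)
		if err != nil {
			panic(err)
		}
		return t
	}

	// Timestamps copied from the klusterlet-addon-workmgr entry above.
	created := parse("2026-04-20 19:25:57.000000000 +0000 UTC")
	firstPull := parse("2026-04-20 19:25:58.361007255 +0000 UTC")
	lastPull := parse("2026-04-20 19:26:04.714248861 +0000 UTC")
	running := parse("2026-04-20 19:26:05.167636129 +0000 UTC")

	e2e := running.Sub(created)          // ~= podStartE2EDuration
	slo := e2e - lastPull.Sub(firstPull) // image-pull window excluded

	fmt.Println(e2e, slo) // roughly 8.168s and 1.815s, as logged
}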
No retries permitted until 2026-04-20 19:26:23.671638404 +0000 UTC m=+66.363647321 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-cert") pod "ingress-canary-s4kbs" (UID: "cc2bb5fd-130e-4be9-a03f-eb9ac877ee23") : secret "canary-serving-cert" not found Apr 20 19:26:08.156875 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:08.156837 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" event={"ID":"f1d91da9-119a-49fe-968a-bcf1a2a01e5e","Type":"ContainerStarted","Data":"15b6fde7adf9f9bf71255967bf56fcff501ad9376676247c0126da65661c1a1a"} Apr 20 19:26:08.156875 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:08.156875 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" event={"ID":"f1d91da9-119a-49fe-968a-bcf1a2a01e5e","Type":"ContainerStarted","Data":"82c92f3c7b8bea49614dafbbdddda81938686abc2656f57d7f9de3ef22f6cbea"} Apr 20 19:26:08.186445 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:08.186398 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" podStartSLOduration=2.069774281 podStartE2EDuration="11.186384787s" podCreationTimestamp="2026-04-20 19:25:57 +0000 UTC" firstStartedPulling="2026-04-20 19:25:58.139797518 +0000 UTC m=+40.831806435" lastFinishedPulling="2026-04-20 19:26:07.256408013 +0000 UTC m=+49.948416941" observedRunningTime="2026-04-20 19:26:08.184728486 +0000 UTC m=+50.876737425" watchObservedRunningTime="2026-04-20 19:26:08.186384787 +0000 UTC m=+50.878393726" Apr 20 19:26:17.576006 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:17.575979 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-9rxk8" Apr 20 19:26:23.594347 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:23.594305 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:26:23.594787 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:23.594357 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-metrics-tls\") pod \"dns-default-tf6mk\" (UID: \"1c8d3c86-b40b-483d-8c21-32ed7bd9f45e\") " pod="openshift-dns/dns-default-tf6mk" Apr 20 19:26:23.594787 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:23.594379 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-v8fl7\" (UID: \"667e99fd-c507-4e05-a425-bda15ee82168\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-v8fl7" Apr 20 19:26:23.594787 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:23.594477 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:26:23.594787 ip-10-0-132-159 kubenswrapper[2570]: E0420 
19:26:23.594509 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 19:26:23.594787 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:23.594483 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 19:26:23.594787 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:23.594549 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54bd76d48b-p874f: secret "image-registry-tls" not found Apr 20 19:26:23.594787 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:23.594549 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-metrics-tls podName:1c8d3c86-b40b-483d-8c21-32ed7bd9f45e nodeName:}" failed. No retries permitted until 2026-04-20 19:26:55.594534972 +0000 UTC m=+98.286543888 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-metrics-tls") pod "dns-default-tf6mk" (UID: "1c8d3c86-b40b-483d-8c21-32ed7bd9f45e") : secret "dns-default-metrics-tls" not found Apr 20 19:26:23.594787 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:23.594607 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert podName:667e99fd-c507-4e05-a425-bda15ee82168 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:55.594594812 +0000 UTC m=+98.286603728 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-v8fl7" (UID: "667e99fd-c507-4e05-a425-bda15ee82168") : secret "networking-console-plugin-cert" not found Apr 20 19:26:23.594787 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:23.594622 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls podName:f8fe2540-8adc-4eff-9a2b-d4fb05979bbd nodeName:}" failed. No retries permitted until 2026-04-20 19:26:55.594616021 +0000 UTC m=+98.286624938 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls") pod "image-registry-54bd76d48b-p874f" (UID: "f8fe2540-8adc-4eff-9a2b-d4fb05979bbd") : secret "image-registry-tls" not found Apr 20 19:26:23.695674 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:23.695633 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-cert\") pod \"ingress-canary-s4kbs\" (UID: \"cc2bb5fd-130e-4be9-a03f-eb9ac877ee23\") " pod="openshift-ingress-canary/ingress-canary-s4kbs" Apr 20 19:26:23.695809 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:23.695687 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs\") pod \"network-metrics-daemon-cvbdc\" (UID: \"220904c2-fd82-44e5-9904-33aeca86dcee\") " pod="openshift-multus/network-metrics-daemon-cvbdc" Apr 20 19:26:23.695809 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:23.695778 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:26:23.695887 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:23.695853 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-cert podName:cc2bb5fd-130e-4be9-a03f-eb9ac877ee23 nodeName:}" failed. No retries permitted until 2026-04-20 19:26:55.695834613 +0000 UTC m=+98.387843530 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-cert") pod "ingress-canary-s4kbs" (UID: "cc2bb5fd-130e-4be9-a03f-eb9ac877ee23") : secret "canary-serving-cert" not found Apr 20 19:26:23.702353 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:23.702334 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 20 19:26:23.706781 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:23.706767 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 19:26:23.706846 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:23.706823 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs podName:220904c2-fd82-44e5-9904-33aeca86dcee nodeName:}" failed. No retries permitted until 2026-04-20 19:27:27.706806197 +0000 UTC m=+130.398815113 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs") pod "network-metrics-daemon-cvbdc" (UID: "220904c2-fd82-44e5-9904-33aeca86dcee") : secret "metrics-daemon-secret" not found Apr 20 19:26:23.797017 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:23.796977 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cx8f6\" (UniqueName: \"kubernetes.io/projected/045e6638-b662-4970-88a9-8a02de6ca547-kube-api-access-cx8f6\") pod \"network-check-target-bnw57\" (UID: \"045e6638-b662-4970-88a9-8a02de6ca547\") " pod="openshift-network-diagnostics/network-check-target-bnw57" Apr 20 19:26:23.799948 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:23.799932 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 20 19:26:23.809774 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:23.809757 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 20 19:26:23.819617 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:23.819596 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx8f6\" (UniqueName: \"kubernetes.io/projected/045e6638-b662-4970-88a9-8a02de6ca547-kube-api-access-cx8f6\") pod \"network-check-target-bnw57\" (UID: \"045e6638-b662-4970-88a9-8a02de6ca547\") " pod="openshift-network-diagnostics/network-check-target-bnw57" Apr 20 19:26:23.887076 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:23.887005 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-89kqx\"" Apr 20 19:26:23.894632 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:23.894611 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-bnw57" Apr 20 19:26:24.026736 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:24.026704 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-bnw57"] Apr 20 19:26:24.028404 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:26:24.028379 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod045e6638_b662_4970_88a9_8a02de6ca547.slice/crio-fcdc235357629b497131892da445d0b4774b2bf0d1d6a53a051bd2f22c1e1f0d WatchSource:0}: Error finding container fcdc235357629b497131892da445d0b4774b2bf0d1d6a53a051bd2f22c1e1f0d: Status 404 returned error can't find the container with id fcdc235357629b497131892da445d0b4774b2bf0d1d6a53a051bd2f22c1e1f0d Apr 20 19:26:24.196643 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:24.196610 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bnw57" event={"ID":"045e6638-b662-4970-88a9-8a02de6ca547","Type":"ContainerStarted","Data":"fcdc235357629b497131892da445d0b4774b2bf0d1d6a53a051bd2f22c1e1f0d"} Apr 20 19:26:27.206257 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:27.206215 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-bnw57" event={"ID":"045e6638-b662-4970-88a9-8a02de6ca547","Type":"ContainerStarted","Data":"5516700a42751197b6572662a8a42dff4d9299d763c06c4b882b829c054a420b"} Apr 20 19:26:27.206721 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:27.206352 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-bnw57" Apr 20 19:26:27.224210 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:27.224161 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-bnw57" podStartSLOduration=66.686143266 podStartE2EDuration="1m9.224145206s" podCreationTimestamp="2026-04-20 19:25:18 +0000 UTC" firstStartedPulling="2026-04-20 19:26:24.030503878 +0000 UTC m=+66.722512794" lastFinishedPulling="2026-04-20 19:26:26.568505813 +0000 UTC m=+69.260514734" observedRunningTime="2026-04-20 19:26:27.223411305 +0000 UTC m=+69.915420244" watchObservedRunningTime="2026-04-20 19:26:27.224145206 +0000 UTC m=+69.916154146" Apr 20 19:26:55.656725 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:55.656669 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-metrics-tls\") pod \"dns-default-tf6mk\" (UID: \"1c8d3c86-b40b-483d-8c21-32ed7bd9f45e\") " pod="openshift-dns/dns-default-tf6mk" Apr 20 19:26:55.656725 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:55.656723 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-v8fl7\" (UID: \"667e99fd-c507-4e05-a425-bda15ee82168\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-v8fl7" Apr 20 19:26:55.657191 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:55.656798 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:26:55.657191 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:55.656827 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 19:26:55.657191 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:55.656896 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-metrics-tls podName:1c8d3c86-b40b-483d-8c21-32ed7bd9f45e nodeName:}" failed. No retries permitted until 2026-04-20 19:27:59.656879962 +0000 UTC m=+162.348888880 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-metrics-tls") pod "dns-default-tf6mk" (UID: "1c8d3c86-b40b-483d-8c21-32ed7bd9f45e") : secret "dns-default-metrics-tls" not found Apr 20 19:26:55.657191 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:55.656898 2570 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 19:26:55.657191 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:55.656912 2570 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-54bd76d48b-p874f: secret "image-registry-tls" not found Apr 20 19:26:55.657191 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:55.656897 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 20 19:26:55.657191 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:55.656959 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls podName:f8fe2540-8adc-4eff-9a2b-d4fb05979bbd nodeName:}" failed. No retries permitted until 2026-04-20 19:27:59.656947832 +0000 UTC m=+162.348956749 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls") pod "image-registry-54bd76d48b-p874f" (UID: "f8fe2540-8adc-4eff-9a2b-d4fb05979bbd") : secret "image-registry-tls" not found Apr 20 19:26:55.657191 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:55.657027 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert podName:667e99fd-c507-4e05-a425-bda15ee82168 nodeName:}" failed. No retries permitted until 2026-04-20 19:27:59.657012207 +0000 UTC m=+162.349021127 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-v8fl7" (UID: "667e99fd-c507-4e05-a425-bda15ee82168") : secret "networking-console-plugin-cert" not found Apr 20 19:26:55.757866 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:55.757821 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-cert\") pod \"ingress-canary-s4kbs\" (UID: \"cc2bb5fd-130e-4be9-a03f-eb9ac877ee23\") " pod="openshift-ingress-canary/ingress-canary-s4kbs" Apr 20 19:26:55.757979 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:55.757948 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 19:26:55.758022 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:26:55.758011 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-cert podName:cc2bb5fd-130e-4be9-a03f-eb9ac877ee23 nodeName:}" failed. No retries permitted until 2026-04-20 19:27:59.7579944 +0000 UTC m=+162.450003322 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-cert") pod "ingress-canary-s4kbs" (UID: "cc2bb5fd-130e-4be9-a03f-eb9ac877ee23") : secret "canary-serving-cert" not found Apr 20 19:26:58.211316 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:26:58.211285 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-bnw57" Apr 20 19:27:27.795555 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:27.795498 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs\") pod \"network-metrics-daemon-cvbdc\" (UID: \"220904c2-fd82-44e5-9904-33aeca86dcee\") " pod="openshift-multus/network-metrics-daemon-cvbdc" Apr 20 19:27:27.796078 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:27:27.795662 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 19:27:27.796078 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:27:27.795736 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs podName:220904c2-fd82-44e5-9904-33aeca86dcee nodeName:}" failed. No retries permitted until 2026-04-20 19:29:29.795720674 +0000 UTC m=+252.487729591 (durationBeforeRetry 2m2s). 
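By this point the failing mounts have walked through waits of 8s, 16s, 32s, 1m4s, and now 2m2s: each consecutive failure of the same volume operation doubles durationBeforeRetry, and the wait appears to be capped at 2m2s (metrics-certs stays there). A minimal sketch of that doubling, assuming a plain exponential backoff with the 500ms starting point and the roughly two-minute cap seen in this journal (the exact constants inside kubelet may differ):

package main

import (
	"fmt"
	"time"
)

func main() {
	delay := 500 * time.Millisecond           // first retry seen in this log
	maxDelay := 2*time.Minute + 2*time.Second // apparent cap (2m2s)
	for i := 0; i < 10; i++ {
		fmt.Println(delay) // 500ms 1s 2s 4s 8s 16s 32s 1m4s 2m2s 2m2s
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}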
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs") pod "network-metrics-daemon-cvbdc" (UID: "220904c2-fd82-44e5-9904-33aeca86dcee") : secret "metrics-daemon-secret" not found
Apr 20 19:27:39.260448 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:39.260420 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2b57t_1f14825d-bc77-4c61-9be4-8a25e8c7134b/dns-node-resolver/0.log"
Apr 20 19:27:40.260100 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:40.260072 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-d2857_d1de0681-7e7f-40b8-aef2-7bbbcf5c25e7/node-ca/0.log"
Apr 20 19:27:50.890156 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:50.890120 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-4zbff"]
Apr 20 19:27:50.893108 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:50.893089 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4zbff"
Apr 20 19:27:50.895896 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:50.895872 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 20 19:27:50.896237 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:50.896218 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 20 19:27:50.896309 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:50.896226 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 20 19:27:50.896383 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:50.896364 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 20 19:27:50.897114 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:50.897098 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4kb52\""
Apr 20 19:27:50.907960 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:50.907940 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4zbff"]
Apr 20 19:27:50.979544 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:50.979506 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b94a5636-dfff-4c1d-8bb7-149955529401-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4zbff\" (UID: \"b94a5636-dfff-4c1d-8bb7-149955529401\") " pod="openshift-insights/insights-runtime-extractor-4zbff"
Apr 20 19:27:50.979544 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:50.979543 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b94a5636-dfff-4c1d-8bb7-149955529401-crio-socket\") pod \"insights-runtime-extractor-4zbff\" (UID: \"b94a5636-dfff-4c1d-8bb7-149955529401\") " pod="openshift-insights/insights-runtime-extractor-4zbff"
Apr 20 19:27:50.979757 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:50.979674 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b94a5636-dfff-4c1d-8bb7-149955529401-data-volume\") pod \"insights-runtime-extractor-4zbff\" (UID: \"b94a5636-dfff-4c1d-8bb7-149955529401\") " pod="openshift-insights/insights-runtime-extractor-4zbff"
Apr 20 19:27:50.979757 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:50.979709 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b94a5636-dfff-4c1d-8bb7-149955529401-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4zbff\" (UID: \"b94a5636-dfff-4c1d-8bb7-149955529401\") " pod="openshift-insights/insights-runtime-extractor-4zbff"
Apr 20 19:27:50.979837 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:50.979769 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpnx2\" (UniqueName: \"kubernetes.io/projected/b94a5636-dfff-4c1d-8bb7-149955529401-kube-api-access-xpnx2\") pod \"insights-runtime-extractor-4zbff\" (UID: \"b94a5636-dfff-4c1d-8bb7-149955529401\") " pod="openshift-insights/insights-runtime-extractor-4zbff"
Apr 20 19:27:51.080709 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:51.080673 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpnx2\" (UniqueName: \"kubernetes.io/projected/b94a5636-dfff-4c1d-8bb7-149955529401-kube-api-access-xpnx2\") pod \"insights-runtime-extractor-4zbff\" (UID: \"b94a5636-dfff-4c1d-8bb7-149955529401\") " pod="openshift-insights/insights-runtime-extractor-4zbff"
Apr 20 19:27:51.080886 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:51.080719 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b94a5636-dfff-4c1d-8bb7-149955529401-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4zbff\" (UID: \"b94a5636-dfff-4c1d-8bb7-149955529401\") " pod="openshift-insights/insights-runtime-extractor-4zbff"
Apr 20 19:27:51.080886 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:51.080739 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b94a5636-dfff-4c1d-8bb7-149955529401-crio-socket\") pod \"insights-runtime-extractor-4zbff\" (UID: \"b94a5636-dfff-4c1d-8bb7-149955529401\") " pod="openshift-insights/insights-runtime-extractor-4zbff"
Apr 20 19:27:51.080886 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:51.080794 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b94a5636-dfff-4c1d-8bb7-149955529401-data-volume\") pod \"insights-runtime-extractor-4zbff\" (UID: \"b94a5636-dfff-4c1d-8bb7-149955529401\") " pod="openshift-insights/insights-runtime-extractor-4zbff"
Apr 20 19:27:51.080886 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:51.080817 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b94a5636-dfff-4c1d-8bb7-149955529401-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4zbff\" (UID: \"b94a5636-dfff-4c1d-8bb7-149955529401\") " pod="openshift-insights/insights-runtime-extractor-4zbff"
Apr 20 19:27:51.080886 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:27:51.080837 2570 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 20 19:27:51.080886 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:51.080869 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/b94a5636-dfff-4c1d-8bb7-149955529401-crio-socket\") pod \"insights-runtime-extractor-4zbff\" (UID: \"b94a5636-dfff-4c1d-8bb7-149955529401\") " pod="openshift-insights/insights-runtime-extractor-4zbff"
Apr 20 19:27:51.081140 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:27:51.080911 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b94a5636-dfff-4c1d-8bb7-149955529401-insights-runtime-extractor-tls podName:b94a5636-dfff-4c1d-8bb7-149955529401 nodeName:}" failed. No retries permitted until 2026-04-20 19:27:51.580894355 +0000 UTC m=+154.272903275 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/b94a5636-dfff-4c1d-8bb7-149955529401-insights-runtime-extractor-tls") pod "insights-runtime-extractor-4zbff" (UID: "b94a5636-dfff-4c1d-8bb7-149955529401") : secret "insights-runtime-extractor-tls" not found
Apr 20 19:27:51.081275 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:51.081259 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/b94a5636-dfff-4c1d-8bb7-149955529401-data-volume\") pod \"insights-runtime-extractor-4zbff\" (UID: \"b94a5636-dfff-4c1d-8bb7-149955529401\") " pod="openshift-insights/insights-runtime-extractor-4zbff"
Apr 20 19:27:51.081311 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:51.081276 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/b94a5636-dfff-4c1d-8bb7-149955529401-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4zbff\" (UID: \"b94a5636-dfff-4c1d-8bb7-149955529401\") " pod="openshift-insights/insights-runtime-extractor-4zbff"
Apr 20 19:27:51.094310 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:51.094280 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpnx2\" (UniqueName: \"kubernetes.io/projected/b94a5636-dfff-4c1d-8bb7-149955529401-kube-api-access-xpnx2\") pod \"insights-runtime-extractor-4zbff\" (UID: \"b94a5636-dfff-4c1d-8bb7-149955529401\") " pod="openshift-insights/insights-runtime-extractor-4zbff"
Apr 20 19:27:51.584811 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:51.584764 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b94a5636-dfff-4c1d-8bb7-149955529401-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4zbff\" (UID: \"b94a5636-dfff-4c1d-8bb7-149955529401\") " pod="openshift-insights/insights-runtime-extractor-4zbff"
Apr 20 19:27:51.585015 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:27:51.584906 2570 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 20 19:27:51.585015 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:27:51.584982 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b94a5636-dfff-4c1d-8bb7-149955529401-insights-runtime-extractor-tls podName:b94a5636-dfff-4c1d-8bb7-149955529401 nodeName:}" failed. No retries permitted until 2026-04-20 19:27:52.584964473 +0000 UTC m=+155.276973404 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/b94a5636-dfff-4c1d-8bb7-149955529401-insights-runtime-extractor-tls") pod "insights-runtime-extractor-4zbff" (UID: "b94a5636-dfff-4c1d-8bb7-149955529401") : secret "insights-runtime-extractor-tls" not found
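The durationBeforeRetry values above (500ms, then 1s; the entries that follow continue with 2s, 4s, 8s and a 2m2s ceiling) trace the kubelet's per-volume retry backoff: each failed MountVolume.SetUp doubles the wait before the next attempt, up to a cap. A minimal Go sketch of that doubling-with-cap pattern, as read off the log values; this is an illustration, not the kubelet's actual nestedpendingoperations code, and the names nextBackoff and errNotFound are invented for the example:

    package main

    import (
        "errors"
        "fmt"
        "time"
    )

    // nextBackoff doubles the previous delay and clamps it at limit,
    // mirroring the durationBeforeRetry progression in the log
    // (500ms -> 1s -> 2s -> 4s -> 8s -> ... -> 2m2s).
    func nextBackoff(prev, limit time.Duration) time.Duration {
        if prev == 0 {
            return 500 * time.Millisecond
        }
        next := 2 * prev
        if next > limit {
            return limit
        }
        return next
    }

    func main() {
        // errNotFound stands in for the `secret "..." not found` failure above.
        errNotFound := errors.New(`secret "insights-runtime-extractor-tls" not found`)
        var delay time.Duration
        for attempt := 1; attempt <= 9; attempt++ {
            // A real reconciler would attempt the mount here and stop on success.
            delay = nextBackoff(delay, 2*time.Minute+2*time.Second)
            fmt.Printf("attempt %d failed (%v); no retries permitted for %v\n",
                attempt, errNotFound, delay)
        }
    }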
Apr 20 19:27:52.592662 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:52.592615 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b94a5636-dfff-4c1d-8bb7-149955529401-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4zbff\" (UID: \"b94a5636-dfff-4c1d-8bb7-149955529401\") " pod="openshift-insights/insights-runtime-extractor-4zbff"
Apr 20 19:27:52.593058 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:27:52.592783 2570 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 20 19:27:52.593058 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:27:52.592849 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b94a5636-dfff-4c1d-8bb7-149955529401-insights-runtime-extractor-tls podName:b94a5636-dfff-4c1d-8bb7-149955529401 nodeName:}" failed. No retries permitted until 2026-04-20 19:27:54.592833373 +0000 UTC m=+157.284842291 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/b94a5636-dfff-4c1d-8bb7-149955529401-insights-runtime-extractor-tls") pod "insights-runtime-extractor-4zbff" (UID: "b94a5636-dfff-4c1d-8bb7-149955529401") : secret "insights-runtime-extractor-tls" not found
Apr 20 19:27:54.608966 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:54.608932 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b94a5636-dfff-4c1d-8bb7-149955529401-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4zbff\" (UID: \"b94a5636-dfff-4c1d-8bb7-149955529401\") " pod="openshift-insights/insights-runtime-extractor-4zbff"
Apr 20 19:27:54.609290 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:27:54.609088 2570 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 20 19:27:54.609290 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:27:54.609165 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b94a5636-dfff-4c1d-8bb7-149955529401-insights-runtime-extractor-tls podName:b94a5636-dfff-4c1d-8bb7-149955529401 nodeName:}" failed. No retries permitted until 2026-04-20 19:27:58.609147291 +0000 UTC m=+161.301156217 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/b94a5636-dfff-4c1d-8bb7-149955529401-insights-runtime-extractor-tls") pod "insights-runtime-extractor-4zbff" (UID: "b94a5636-dfff-4c1d-8bb7-149955529401") : secret "insights-runtime-extractor-tls" not found
Apr 20 19:27:54.730520 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:27:54.730480 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-54bd76d48b-p874f" podUID="f8fe2540-8adc-4eff-9a2b-d4fb05979bbd"
Apr 20 19:27:54.763750 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:27:54.763722 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-v8fl7" podUID="667e99fd-c507-4e05-a425-bda15ee82168"
Apr 20 19:27:54.769938 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:27:54.769904 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-tf6mk" podUID="1c8d3c86-b40b-483d-8c21-32ed7bd9f45e"
Apr 20 19:27:54.778048 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:27:54.778027 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-s4kbs" podUID="cc2bb5fd-130e-4be9-a03f-eb9ac877ee23"
Apr 20 19:27:55.409320 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:55.409289 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-54bd76d48b-p874f"
Apr 20 19:27:55.409487 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:55.409449 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tf6mk"
Apr 20 19:27:55.980175 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:27:55.980123 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-cvbdc" podUID="220904c2-fd82-44e5-9904-33aeca86dcee"
Apr 20 19:27:58.641182 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:58.641144 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b94a5636-dfff-4c1d-8bb7-149955529401-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4zbff\" (UID: \"b94a5636-dfff-4c1d-8bb7-149955529401\") " pod="openshift-insights/insights-runtime-extractor-4zbff"
Apr 20 19:27:58.641613 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:27:58.641268 2570 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 20 19:27:58.641613 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:27:58.641334 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b94a5636-dfff-4c1d-8bb7-149955529401-insights-runtime-extractor-tls podName:b94a5636-dfff-4c1d-8bb7-149955529401 nodeName:}" failed. No retries permitted until 2026-04-20 19:28:06.641319264 +0000 UTC m=+169.333328180 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/b94a5636-dfff-4c1d-8bb7-149955529401-insights-runtime-extractor-tls") pod "insights-runtime-extractor-4zbff" (UID: "b94a5636-dfff-4c1d-8bb7-149955529401") : secret "insights-runtime-extractor-tls" not found
Apr 20 19:27:59.750443 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:59.750410 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f"
Apr 20 19:27:59.750443 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:59.750453 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-metrics-tls\") pod \"dns-default-tf6mk\" (UID: \"1c8d3c86-b40b-483d-8c21-32ed7bd9f45e\") " pod="openshift-dns/dns-default-tf6mk"
Apr 20 19:27:59.750884 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:59.750482 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-v8fl7\" (UID: \"667e99fd-c507-4e05-a425-bda15ee82168\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-v8fl7"
Apr 20 19:27:59.750884 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:27:59.750612 2570 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 20 19:27:59.750884 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:27:59.750674 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert podName:667e99fd-c507-4e05-a425-bda15ee82168 nodeName:}" failed. No retries permitted until 2026-04-20 19:30:01.750658196 +0000 UTC m=+284.442667121 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-v8fl7" (UID: "667e99fd-c507-4e05-a425-bda15ee82168") : secret "networking-console-plugin-cert" not found
Apr 20 19:27:59.752816 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:59.752783 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c8d3c86-b40b-483d-8c21-32ed7bd9f45e-metrics-tls\") pod \"dns-default-tf6mk\" (UID: \"1c8d3c86-b40b-483d-8c21-32ed7bd9f45e\") " pod="openshift-dns/dns-default-tf6mk"
Apr 20 19:27:59.752956 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:59.752936 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls\") pod \"image-registry-54bd76d48b-p874f\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") " pod="openshift-image-registry/image-registry-54bd76d48b-p874f"
Apr 20 19:27:59.851634 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:59.851582 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-cert\") pod \"ingress-canary-s4kbs\" (UID: \"cc2bb5fd-130e-4be9-a03f-eb9ac877ee23\") " pod="openshift-ingress-canary/ingress-canary-s4kbs"
Apr 20 19:27:59.854034 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:59.853987 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/cc2bb5fd-130e-4be9-a03f-eb9ac877ee23-cert\") pod \"ingress-canary-s4kbs\" (UID: \"cc2bb5fd-130e-4be9-a03f-eb9ac877ee23\") " pod="openshift-ingress-canary/ingress-canary-s4kbs"
Apr 20 19:27:59.913629 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:59.913584 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-5vnpt\""
Apr 20 19:27:59.913629 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:59.913584 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-8vchn\""
Apr 20 19:27:59.920674 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:59.920650 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tf6mk"
Apr 20 19:27:59.920751 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:27:59.920686 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-54bd76d48b-p874f"
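Every failure so far has the same shape: the pod references a Secret that does not exist yet, so the kubelet keeps the pod in ContainerCreating and retries the mount, and the "Error syncing pod, skipping ... context deadline exceeded" entries are sync workers timing out on those still-unmounted volumes. Once the secret appears (as happens for insights-runtime-extractor-tls at 19:28:06 later in this log), the next retry succeeds and the sandbox starts. A small sketch of the condition the kubelet is waiting on, assuming standard client-go and a local kubeconfig; waitForSecret is an invented helper, not a kubelet function:

    package main

    import (
        "context"
        "fmt"
        "time"

        apierrors "k8s.io/apimachinery/pkg/api/errors"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    // waitForSecret polls until the named Secret exists, the same condition
    // the MountVolume.SetUp retries above are blocked on.
    func waitForSecret(ctx context.Context, cs kubernetes.Interface, ns, name string) error {
        for {
            _, err := cs.CoreV1().Secrets(ns).Get(ctx, name, metav1.GetOptions{})
            if err == nil {
                return nil
            }
            if !apierrors.IsNotFound(err) {
                return err // a real API error, not just "not created yet"
            }
            select {
            case <-ctx.Done():
                return ctx.Err()
            case <-time.After(2 * time.Second):
            }
        }
    }

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", clientcmd.RecommendedHomeFile)
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        ctx, cancel := context.WithTimeout(context.Background(), 5*time.Minute)
        defer cancel()
        if err := waitForSecret(ctx, cs, "openshift-insights", "insights-runtime-extractor-tls"); err != nil {
            panic(err)
        }
        fmt.Println("secret present; the kubelet's next retry should mount it")
    }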
Need to start a new one" pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:28:00.049798 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:00.049698 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tf6mk"] Apr 20 19:28:00.052350 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:28:00.052317 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c8d3c86_b40b_483d_8c21_32ed7bd9f45e.slice/crio-04ab9343aa85497c44d4fc76339416d57296a44f5673dddc1140b2f76d6cbece WatchSource:0}: Error finding container 04ab9343aa85497c44d4fc76339416d57296a44f5673dddc1140b2f76d6cbece: Status 404 returned error can't find the container with id 04ab9343aa85497c44d4fc76339416d57296a44f5673dddc1140b2f76d6cbece Apr 20 19:28:00.069391 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:00.069362 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-54bd76d48b-p874f"] Apr 20 19:28:00.072416 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:28:00.072385 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8fe2540_8adc_4eff_9a2b_d4fb05979bbd.slice/crio-1d8f8048198d3c0f28a5b3c4f5091e03fc223f9d4fde6993683d8d0d11b6802d WatchSource:0}: Error finding container 1d8f8048198d3c0f28a5b3c4f5091e03fc223f9d4fde6993683d8d0d11b6802d: Status 404 returned error can't find the container with id 1d8f8048198d3c0f28a5b3c4f5091e03fc223f9d4fde6993683d8d0d11b6802d Apr 20 19:28:00.422363 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:00.422276 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54bd76d48b-p874f" event={"ID":"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd","Type":"ContainerStarted","Data":"991c542dec1218b309f0c2d721836dcee8962d187859da51ba43cf4e7da03326"} Apr 20 19:28:00.422363 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:00.422317 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54bd76d48b-p874f" event={"ID":"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd","Type":"ContainerStarted","Data":"1d8f8048198d3c0f28a5b3c4f5091e03fc223f9d4fde6993683d8d0d11b6802d"} Apr 20 19:28:00.422589 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:00.422396 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-54bd76d48b-p874f" Apr 20 19:28:00.423363 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:00.423339 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tf6mk" event={"ID":"1c8d3c86-b40b-483d-8c21-32ed7bd9f45e","Type":"ContainerStarted","Data":"04ab9343aa85497c44d4fc76339416d57296a44f5673dddc1140b2f76d6cbece"} Apr 20 19:28:00.444396 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:00.444346 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-54bd76d48b-p874f" podStartSLOduration=162.444331926 podStartE2EDuration="2m42.444331926s" podCreationTimestamp="2026-04-20 19:25:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:28:00.443693505 +0000 UTC m=+163.135702457" watchObservedRunningTime="2026-04-20 19:28:00.444331926 +0000 UTC m=+163.136340863" Apr 20 19:28:02.430091 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:02.430056 2570 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tf6mk" event={"ID":"1c8d3c86-b40b-483d-8c21-32ed7bd9f45e","Type":"ContainerStarted","Data":"92774ae5f17b93390aabac8061d8d6dd3201ff8a9aaaec6c53aa4f104ce2a14f"} Apr 20 19:28:02.430091 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:02.430093 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tf6mk" event={"ID":"1c8d3c86-b40b-483d-8c21-32ed7bd9f45e","Type":"ContainerStarted","Data":"2a7bd1d4fdb0a26ed4ec30e8a2d0d9a0e916e0c44bdb10567ccbe36ca68d4eab"} Apr 20 19:28:02.430520 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:02.430206 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-tf6mk" Apr 20 19:28:02.448309 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:02.448266 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-tf6mk" podStartSLOduration=129.784883773 podStartE2EDuration="2m11.448254796s" podCreationTimestamp="2026-04-20 19:25:51 +0000 UTC" firstStartedPulling="2026-04-20 19:28:00.054171661 +0000 UTC m=+162.746180581" lastFinishedPulling="2026-04-20 19:28:01.717542674 +0000 UTC m=+164.409551604" observedRunningTime="2026-04-20 19:28:02.447970701 +0000 UTC m=+165.139979641" watchObservedRunningTime="2026-04-20 19:28:02.448254796 +0000 UTC m=+165.140263735" Apr 20 19:28:05.142484 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:05.142393 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4d8dfcd-p5dsh" podUID="f5c6d6e2-e6b9-4777-981d-56a8a97d3208" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.11:8000/readyz\": dial tcp 10.132.0.11:8000: connect: connection refused" Apr 20 19:28:05.438791 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:05.438697 2570 generic.go:358] "Generic (PLEG): container finished" podID="f5c6d6e2-e6b9-4777-981d-56a8a97d3208" containerID="c48316f4d78ee39a7af28004b86543f85b9997f0378af7a58f3e716025d2ae98" exitCode=1 Apr 20 19:28:05.438939 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:05.438780 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4d8dfcd-p5dsh" event={"ID":"f5c6d6e2-e6b9-4777-981d-56a8a97d3208","Type":"ContainerDied","Data":"c48316f4d78ee39a7af28004b86543f85b9997f0378af7a58f3e716025d2ae98"} Apr 20 19:28:05.439159 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:05.439144 2570 scope.go:117] "RemoveContainer" containerID="c48316f4d78ee39a7af28004b86543f85b9997f0378af7a58f3e716025d2ae98" Apr 20 19:28:05.439916 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:05.439897 2570 generic.go:358] "Generic (PLEG): container finished" podID="25969649-2fe5-49c7-afb9-3559488fc423" containerID="6f6325ae6e6334d8f73a0020e4c5947feed107a0eb7f0c07f9874e03bb5a92e3" exitCode=255 Apr 20 19:28:05.439964 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:05.439953 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd7bb95bf-cfnj9" event={"ID":"25969649-2fe5-49c7-afb9-3559488fc423","Type":"ContainerDied","Data":"6f6325ae6e6334d8f73a0020e4c5947feed107a0eb7f0c07f9874e03bb5a92e3"} Apr 20 19:28:05.440270 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:05.440254 2570 scope.go:117] "RemoveContainer" containerID="6f6325ae6e6334d8f73a0020e4c5947feed107a0eb7f0c07f9874e03bb5a92e3" Apr 20 19:28:06.448172 ip-10-0-132-159 
Apr 20 19:28:06.448172 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:06.448138 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4d8dfcd-p5dsh" event={"ID":"f5c6d6e2-e6b9-4777-981d-56a8a97d3208","Type":"ContainerStarted","Data":"4d483bdf70e017a48ce5f1992db5045bf829636705c872614182fcf3f461d220"}
Apr 20 19:28:06.448621 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:06.448455 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4d8dfcd-p5dsh"
Apr 20 19:28:06.449139 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:06.449119 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-5b4d8dfcd-p5dsh"
Apr 20 19:28:06.449754 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:06.449704 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5cd7bb95bf-cfnj9" event={"ID":"25969649-2fe5-49c7-afb9-3559488fc423","Type":"ContainerStarted","Data":"3f456b0970655809a86617a1a2030a5cafcf37ee4893af6ebc905d231a949198"}
Apr 20 19:28:06.708402 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:06.708308 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b94a5636-dfff-4c1d-8bb7-149955529401-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4zbff\" (UID: \"b94a5636-dfff-4c1d-8bb7-149955529401\") " pod="openshift-insights/insights-runtime-extractor-4zbff"
Apr 20 19:28:06.710582 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:06.710537 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/b94a5636-dfff-4c1d-8bb7-149955529401-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4zbff\" (UID: \"b94a5636-dfff-4c1d-8bb7-149955529401\") " pod="openshift-insights/insights-runtime-extractor-4zbff"
Apr 20 19:28:06.801492 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:06.801442 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4zbff"
Apr 20 19:28:06.916746 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:06.916714 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4zbff"]
Apr 20 19:28:06.920635 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:28:06.920603 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb94a5636_dfff_4c1d_8bb7_149955529401.slice/crio-bf3f1f11e1e0b5b50dbef78e1c00c9f86fe05eb24b0e51086db18e8f33d0c801 WatchSource:0}: Error finding container bf3f1f11e1e0b5b50dbef78e1c00c9f86fe05eb24b0e51086db18e8f33d0c801: Status 404 returned error can't find the container with id bf3f1f11e1e0b5b50dbef78e1c00c9f86fe05eb24b0e51086db18e8f33d0c801
Apr 20 19:28:07.453800 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:07.453755 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4zbff" event={"ID":"b94a5636-dfff-4c1d-8bb7-149955529401","Type":"ContainerStarted","Data":"62492373cad8746ef50022f71be79f70c21e2454e74ab6e7dfd7c82da7816d7a"}
Apr 20 19:28:07.454257 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:07.453811 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4zbff" event={"ID":"b94a5636-dfff-4c1d-8bb7-149955529401","Type":"ContainerStarted","Data":"bf3f1f11e1e0b5b50dbef78e1c00c9f86fe05eb24b0e51086db18e8f33d0c801"}
Apr 20 19:28:07.960521 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:07.960489 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-v8fl7"
Apr 20 19:28:07.960721 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:07.960615 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-s4kbs"
Apr 20 19:28:07.960721 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:07.960653 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cvbdc"
Apr 20 19:28:07.963362 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:07.963336 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-sbf4z\""
Apr 20 19:28:07.971063 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:07.971039 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-s4kbs"
Apr 20 19:28:08.097750 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:08.097715 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-s4kbs"]
Apr 20 19:28:08.100815 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:28:08.100786 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc2bb5fd_130e_4be9_a03f_eb9ac877ee23.slice/crio-46aad47eeede5ee81736968f1dd32613e9e8a05fa0fdacbf5db45371dbcb0c23 WatchSource:0}: Error finding container 46aad47eeede5ee81736968f1dd32613e9e8a05fa0fdacbf5db45371dbcb0c23: Status 404 returned error can't find the container with id 46aad47eeede5ee81736968f1dd32613e9e8a05fa0fdacbf5db45371dbcb0c23
Apr 20 19:28:08.458148 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:08.458112 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-s4kbs" event={"ID":"cc2bb5fd-130e-4be9-a03f-eb9ac877ee23","Type":"ContainerStarted","Data":"46aad47eeede5ee81736968f1dd32613e9e8a05fa0fdacbf5db45371dbcb0c23"}
Apr 20 19:28:08.460122 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:08.460081 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4zbff" event={"ID":"b94a5636-dfff-4c1d-8bb7-149955529401","Type":"ContainerStarted","Data":"bad783deb04746e824abe02b9571158de046a846973382e37f8edd89284024c3"}
Apr 20 19:28:09.465015 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:09.464973 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4zbff" event={"ID":"b94a5636-dfff-4c1d-8bb7-149955529401","Type":"ContainerStarted","Data":"0d6b5a9a46e7898e875248f8894e32ac87c5177860413cb7a7c400d486d34e24"}
Apr 20 19:28:09.481866 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:09.481810 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-4zbff" podStartSLOduration=17.509706654 podStartE2EDuration="19.48179107s" podCreationTimestamp="2026-04-20 19:27:50 +0000 UTC" firstStartedPulling="2026-04-20 19:28:06.981600155 +0000 UTC m=+169.673609072" lastFinishedPulling="2026-04-20 19:28:08.953684567 +0000 UTC m=+171.645693488" observedRunningTime="2026-04-20 19:28:09.481137287 +0000 UTC m=+172.173146226" watchObservedRunningTime="2026-04-20 19:28:09.48179107 +0000 UTC m=+172.173800003"
Apr 20 19:28:10.468889 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:10.468850 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-s4kbs" event={"ID":"cc2bb5fd-130e-4be9-a03f-eb9ac877ee23","Type":"ContainerStarted","Data":"aa7108690864a6bcd9b50d5ca15d2a75f624230043d0822a2f0b17dfa9245a67"}
Apr 20 19:28:10.484857 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:10.484805 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-s4kbs" podStartSLOduration=138.037146526 podStartE2EDuration="2m19.484788556s" podCreationTimestamp="2026-04-20 19:25:51 +0000 UTC" firstStartedPulling="2026-04-20 19:28:08.10346191 +0000 UTC m=+170.795470830" lastFinishedPulling="2026-04-20 19:28:09.551103941 +0000 UTC m=+172.243112860" observedRunningTime="2026-04-20 19:28:10.484215382 +0000 UTC m=+173.176224333" watchObservedRunningTime="2026-04-20 19:28:10.484788556 +0000 UTC m=+173.176797496"
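The pod_startup_latency_tracker entries report two numbers whose relationship can be checked directly from the logged fields: podStartE2EDuration is the wall-clock time from podCreationTimestamp to the pod being observed running, and podStartSLOduration here equals that minus the image-pull window (lastFinishedPulling - firstStartedPulling); where nothing was pulled (the zero-valued pull timestamps in the earlier image-registry entry), the two coincide. For ingress-canary-s4kbs: 2m19.484788556s - 1.447642031s = 138.037146525s, matching the logged podStartSLOduration to within a nanosecond of rounding. A short Go check, using timestamps copied from the entry above (the monotonic "m=+..." suffixes dropped for parsing):

    package main

    import (
        "fmt"
        "time"
    )

    func main() {
        // Timestamps from the ingress-canary-s4kbs entry above.
        const layout = "2006-01-02 15:04:05.999999999 -0700 MST"
        parse := func(s string) time.Time {
            t, err := time.Parse(layout, s)
            if err != nil {
                panic(err)
            }
            return t
        }
        firstPull := parse("2026-04-20 19:28:08.10346191 +0000 UTC")
        lastPull := parse("2026-04-20 19:28:09.551103941 +0000 UTC")
        e2e := 139484788556 * time.Nanosecond // podStartE2EDuration="2m19.484788556s"

        pull := lastPull.Sub(firstPull)
        fmt.Println("image pull window:", pull)          // 1.447642031s
        fmt.Println("e2e minus pull:   ", e2e-pull)      // ~2m18.037s = podStartSLOduration
    }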
Apr 20 19:28:12.435421 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:12.435390 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-tf6mk"
Apr 20 19:28:19.925084 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:19.925049 2570 patch_prober.go:28] interesting pod/image-registry-54bd76d48b-p874f container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 20 19:28:19.925451 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:19.925105 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-54bd76d48b-p874f" podUID="f8fe2540-8adc-4eff-9a2b-d4fb05979bbd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 20 19:28:21.434735 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.434705 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-54bd76d48b-p874f"
Apr 20 19:28:21.728051 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.728006 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-jp456"]
Apr 20 19:28:21.732100 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.732077 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:21.734917 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.734896 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 20 19:28:21.735050 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.734923 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 20 19:28:21.735185 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.735164 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 20 19:28:21.735185 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.735177 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 20 19:28:21.735349 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.735219 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 20 19:28:21.735349 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.735235 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-t69jz\""
Apr 20 19:28:21.735513 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.735497 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 20 19:28:21.808250 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.808214 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvtnt\" (UniqueName: \"kubernetes.io/projected/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-kube-api-access-vvtnt\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:21.808250 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.808254 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-node-exporter-wtmp\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:21.808451 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.808282 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-node-exporter-accelerators-collector-config\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:21.808451 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.808357 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:21.808451 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.808409 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-node-exporter-textfile\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:21.808451 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.808436 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-sys\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:21.808612 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.808457 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-root\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:21.808612 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.808472 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-metrics-client-ca\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:21.808612 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.808522 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-node-exporter-tls\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:21.909673 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.909633 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vvtnt\" (UniqueName: \"kubernetes.io/projected/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-kube-api-access-vvtnt\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:21.909673 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.909675 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-node-exporter-wtmp\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:21.909925 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.909697 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-node-exporter-accelerators-collector-config\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:21.909925 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.909720 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:21.909925 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.909751 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-node-exporter-textfile\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:21.909925 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.909770 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-sys\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:21.909925 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.909790 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-root\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:21.909925 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.909814 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-metrics-client-ca\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:21.909925 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.909866 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-node-exporter-wtmp\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:21.909925 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.909880 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-sys\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:21.909925 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.909896 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-root\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:21.910353 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.909924 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-node-exporter-tls\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:21.910353 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:28:21.910074 2570 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 20 19:28:21.910353 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:28:21.910149 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-node-exporter-tls podName:20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e nodeName:}" failed. No retries permitted until 2026-04-20 19:28:22.410127895 +0000 UTC m=+185.102136822 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-node-exporter-tls") pod "node-exporter-jp456" (UID: "20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e") : secret "node-exporter-tls" not found
Apr 20 19:28:21.910353 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.910184 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-node-exporter-textfile\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:21.910353 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.910295 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-metrics-client-ca\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:21.910532 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.910403 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-node-exporter-accelerators-collector-config\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:21.912215 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.912193 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:21.918923 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:21.918900 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvtnt\" (UniqueName: \"kubernetes.io/projected/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-kube-api-access-vvtnt\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:22.414137 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:22.414103 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-node-exporter-tls\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:22.416328 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:22.416306 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e-node-exporter-tls\") pod \"node-exporter-jp456\" (UID: \"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e\") " pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:22.640876 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:22.640841 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jp456"
Apr 20 19:28:22.649299 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:28:22.649262 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20e8e9b6_e7da_4dc2_ae87_ecd10feafb2e.slice/crio-cead36c0381bc59c850e0f1e52ad3d813af8a7fc84cc1c3ec6d5c31a4390407c WatchSource:0}: Error finding container cead36c0381bc59c850e0f1e52ad3d813af8a7fc84cc1c3ec6d5c31a4390407c: Status 404 returned error can't find the container with id cead36c0381bc59c850e0f1e52ad3d813af8a7fc84cc1c3ec6d5c31a4390407c
Apr 20 19:28:23.503600 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:23.503552 2570 generic.go:358] "Generic (PLEG): container finished" podID="20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e" containerID="d79d5bd208c08c03f7d104f45dfeac4630fdc6b92eff20938b730752b07f0609" exitCode=0
Apr 20 19:28:23.503757 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:23.503605 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jp456" event={"ID":"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e","Type":"ContainerDied","Data":"d79d5bd208c08c03f7d104f45dfeac4630fdc6b92eff20938b730752b07f0609"}
Apr 20 19:28:23.503757 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:23.503635 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jp456" event={"ID":"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e","Type":"ContainerStarted","Data":"cead36c0381bc59c850e0f1e52ad3d813af8a7fc84cc1c3ec6d5c31a4390407c"}
Apr 20 19:28:24.507593 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:24.507527 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jp456" event={"ID":"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e","Type":"ContainerStarted","Data":"4d6a0ed5e17043e8c44df16c55f211cc677515cf62664d03465de1483fc9fe14"}
Apr 20 19:28:24.507972 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:24.507599 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jp456" event={"ID":"20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e","Type":"ContainerStarted","Data":"7bd577491cb796714ea96ebcf54b84bbf6f60c5f5824d56b67efcd704b6d1d90"}
Apr 20 19:28:24.526516 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:24.526468 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-jp456" podStartSLOduration=2.892489265 podStartE2EDuration="3.526454603s" podCreationTimestamp="2026-04-20 19:28:21 +0000 UTC" firstStartedPulling="2026-04-20 19:28:22.651475526 +0000 UTC m=+185.343484457" lastFinishedPulling="2026-04-20 19:28:23.285440873 +0000 UTC m=+185.977449795" observedRunningTime="2026-04-20 19:28:24.525642894 +0000 UTC m=+187.217651834" watchObservedRunningTime="2026-04-20 19:28:24.526454603 +0000 UTC m=+187.218463591"
Apr 20 19:28:36.226774 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:36.226739 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-54bd76d48b-p874f"]
Apr 20 19:28:47.978176 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:47.978135 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" podUID="f1d91da9-119a-49fe-968a-bcf1a2a01e5e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 20 19:28:48.105737 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:48.105705 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-s4kbs_cc2bb5fd-130e-4be9-a03f-eb9ac877ee23/serve-healthcheck-canary/0.log"
Apr 20 19:28:57.978874 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:28:57.978838 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" podUID="f1d91da9-119a-49fe-968a-bcf1a2a01e5e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Apr 20 19:29:01.244889 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.244827 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-54bd76d48b-p874f" podUID="f8fe2540-8adc-4eff-9a2b-d4fb05979bbd" containerName="registry" containerID="cri-o://991c542dec1218b309f0c2d721836dcee8962d187859da51ba43cf4e7da03326" gracePeriod=30
Apr 20 19:29:01.483182 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.483159 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-54bd76d48b-p874f"
Apr 20 19:29:01.602745 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.602663 2570 generic.go:358] "Generic (PLEG): container finished" podID="f8fe2540-8adc-4eff-9a2b-d4fb05979bbd" containerID="991c542dec1218b309f0c2d721836dcee8962d187859da51ba43cf4e7da03326" exitCode=0
Apr 20 19:29:01.602745 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.602725 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-54bd76d48b-p874f"
Apr 20 19:29:01.602961 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.602747 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54bd76d48b-p874f" event={"ID":"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd","Type":"ContainerDied","Data":"991c542dec1218b309f0c2d721836dcee8962d187859da51ba43cf4e7da03326"}
Apr 20 19:29:01.602961 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.602781 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-54bd76d48b-p874f" event={"ID":"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd","Type":"ContainerDied","Data":"1d8f8048198d3c0f28a5b3c4f5091e03fc223f9d4fde6993683d8d0d11b6802d"}
Apr 20 19:29:01.602961 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.602797 2570 scope.go:117] "RemoveContainer" containerID="991c542dec1218b309f0c2d721836dcee8962d187859da51ba43cf4e7da03326"
Apr 20 19:29:01.610066 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.610045 2570 scope.go:117] "RemoveContainer" containerID="991c542dec1218b309f0c2d721836dcee8962d187859da51ba43cf4e7da03326"
Apr 20 19:29:01.610350 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:29:01.610330 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"991c542dec1218b309f0c2d721836dcee8962d187859da51ba43cf4e7da03326\": container with ID starting with 991c542dec1218b309f0c2d721836dcee8962d187859da51ba43cf4e7da03326 not found: ID does not exist" containerID="991c542dec1218b309f0c2d721836dcee8962d187859da51ba43cf4e7da03326"
Apr 20 19:29:01.610401 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.610360 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"991c542dec1218b309f0c2d721836dcee8962d187859da51ba43cf4e7da03326"} err="failed to get container status \"991c542dec1218b309f0c2d721836dcee8962d187859da51ba43cf4e7da03326\": rpc error: code = NotFound desc = could not find container \"991c542dec1218b309f0c2d721836dcee8962d187859da51ba43cf4e7da03326\": container with ID starting with 991c542dec1218b309f0c2d721836dcee8962d187859da51ba43cf4e7da03326 not found: ID does not exist"
Apr 20 19:29:01.630645 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.630618 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-installation-pull-secrets\") pod \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") "
Apr 20 19:29:01.630754 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.630650 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-ca-trust-extracted\") pod \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") "
Apr 20 19:29:01.630754 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.630679 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-bound-sa-token\") pod \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") "
Apr 20 19:29:01.630754 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.630702 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-24fgm\" (UniqueName: \"kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-kube-api-access-24fgm\") pod \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") "
Apr 20 19:29:01.630913 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.630824 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls\") pod \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") "
Apr 20 19:29:01.630913 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.630858 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-certificates\") pod \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") "
Apr 20 19:29:01.630913 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.630884 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-trusted-ca\") pod \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") "
Apr 20 19:29:01.631064 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.630946 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-image-registry-private-configuration\") pod \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\" (UID: \"f8fe2540-8adc-4eff-9a2b-d4fb05979bbd\") "
Apr 20 19:29:01.631470 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.631433 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "f8fe2540-8adc-4eff-9a2b-d4fb05979bbd" (UID: "f8fe2540-8adc-4eff-9a2b-d4fb05979bbd"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 19:29:01.631741 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.631714 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "f8fe2540-8adc-4eff-9a2b-d4fb05979bbd" (UID: "f8fe2540-8adc-4eff-9a2b-d4fb05979bbd"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 19:29:01.633139 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.633111 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-kube-api-access-24fgm" (OuterVolumeSpecName: "kube-api-access-24fgm") pod "f8fe2540-8adc-4eff-9a2b-d4fb05979bbd" (UID: "f8fe2540-8adc-4eff-9a2b-d4fb05979bbd"). InnerVolumeSpecName "kube-api-access-24fgm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:29:01.633246 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.633188 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "f8fe2540-8adc-4eff-9a2b-d4fb05979bbd" (UID: "f8fe2540-8adc-4eff-9a2b-d4fb05979bbd"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 19:29:01.633246 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.633239 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "f8fe2540-8adc-4eff-9a2b-d4fb05979bbd" (UID: "f8fe2540-8adc-4eff-9a2b-d4fb05979bbd"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:29:01.633442 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.633357 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "f8fe2540-8adc-4eff-9a2b-d4fb05979bbd" (UID: "f8fe2540-8adc-4eff-9a2b-d4fb05979bbd"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:29:01.633442 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.633405 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "f8fe2540-8adc-4eff-9a2b-d4fb05979bbd" (UID: "f8fe2540-8adc-4eff-9a2b-d4fb05979bbd"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 20 19:29:01.639984 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.639959 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "f8fe2540-8adc-4eff-9a2b-d4fb05979bbd" (UID: "f8fe2540-8adc-4eff-9a2b-d4fb05979bbd"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:29:01.732058 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.732011 2570 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-image-registry-private-configuration\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\""
Apr 20 19:29:01.732058 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.732056 2570 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-installation-pull-secrets\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\""
Apr 20 19:29:01.732058 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.732067 2570 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-ca-trust-extracted\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\""
Apr 20 19:29:01.732258 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.732076 2570 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-bound-sa-token\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\""
Apr 20 19:29:01.732258 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.732086 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-24fgm\" (UniqueName: \"kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-kube-api-access-24fgm\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\""
Apr 20 19:29:01.732258 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.732095 2570 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-tls\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\""
Apr 20 19:29:01.732258 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.732104 2570 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-registry-certificates\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\""
Apr 20 19:29:01.732258 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.732112 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd-trusted-ca\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\""
Apr 20 19:29:01.922501 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.922467 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-54bd76d48b-p874f"]
Apr 20 19:29:01.926255 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.926232 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-54bd76d48b-p874f"]
Apr 20 19:29:01.961929 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:01.961885 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8fe2540-8adc-4eff-9a2b-d4fb05979bbd" path="/var/lib/kubelet/pods/f8fe2540-8adc-4eff-9a2b-d4fb05979bbd/volumes"
Apr 20 19:29:06.427624 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:06.427579 2570 patch_prober.go:28] interesting pod/image-registry-54bd76d48b-p874f container/registry namespace/openshift-image-registry: Readiness probe status=failure
output="Get \"https://10.132.0.6:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Apr 20 19:29:06.428047 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:06.427644 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-54bd76d48b-p874f" podUID="f8fe2540-8adc-4eff-9a2b-d4fb05979bbd" containerName="registry" probeResult="failure" output="Get \"https://10.132.0.6:5000/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Apr 20 19:29:07.978338 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:07.978299 2570 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" podUID="f1d91da9-119a-49fe-968a-bcf1a2a01e5e" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 20 19:29:07.978691 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:07.978370 2570 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" Apr 20 19:29:07.978840 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:07.978822 2570 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"15b6fde7adf9f9bf71255967bf56fcff501ad9376676247c0126da65661c1a1a"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 20 19:29:07.978878 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:07.978858 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" podUID="f1d91da9-119a-49fe-968a-bcf1a2a01e5e" containerName="service-proxy" containerID="cri-o://15b6fde7adf9f9bf71255967bf56fcff501ad9376676247c0126da65661c1a1a" gracePeriod=30 Apr 20 19:29:08.622947 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:08.622911 2570 generic.go:358] "Generic (PLEG): container finished" podID="f1d91da9-119a-49fe-968a-bcf1a2a01e5e" containerID="15b6fde7adf9f9bf71255967bf56fcff501ad9376676247c0126da65661c1a1a" exitCode=2 Apr 20 19:29:08.623118 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:08.622964 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" event={"ID":"f1d91da9-119a-49fe-968a-bcf1a2a01e5e","Type":"ContainerDied","Data":"15b6fde7adf9f9bf71255967bf56fcff501ad9376676247c0126da65661c1a1a"} Apr 20 19:29:08.623118 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:08.622996 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-867d5dddf7-bkp7t" event={"ID":"f1d91da9-119a-49fe-968a-bcf1a2a01e5e","Type":"ContainerStarted","Data":"d7ca72d3d406d3fefab464625475c5bb7da535283bc14d693e399fc88ff3e1c2"} Apr 20 19:29:29.842188 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:29.842092 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs\") pod \"network-metrics-daemon-cvbdc\" (UID: \"220904c2-fd82-44e5-9904-33aeca86dcee\") " pod="openshift-multus/network-metrics-daemon-cvbdc" Apr 20 
19:29:29.844380 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:29.844358 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/220904c2-fd82-44e5-9904-33aeca86dcee-metrics-certs\") pod \"network-metrics-daemon-cvbdc\" (UID: \"220904c2-fd82-44e5-9904-33aeca86dcee\") " pod="openshift-multus/network-metrics-daemon-cvbdc" Apr 20 19:29:29.864688 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:29.864653 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-v8fpn\"" Apr 20 19:29:29.871925 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:29.871899 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-cvbdc" Apr 20 19:29:29.990105 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:29.990074 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-cvbdc"] Apr 20 19:29:29.992889 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:29:29.992861 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod220904c2_fd82_44e5_9904_33aeca86dcee.slice/crio-155bf9631daa1485ad2fdb49bbae5888f47dbc8b12a74f82ad22e8fc1c81fad4 WatchSource:0}: Error finding container 155bf9631daa1485ad2fdb49bbae5888f47dbc8b12a74f82ad22e8fc1c81fad4: Status 404 returned error can't find the container with id 155bf9631daa1485ad2fdb49bbae5888f47dbc8b12a74f82ad22e8fc1c81fad4 Apr 20 19:29:30.682619 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:30.682579 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cvbdc" event={"ID":"220904c2-fd82-44e5-9904-33aeca86dcee","Type":"ContainerStarted","Data":"155bf9631daa1485ad2fdb49bbae5888f47dbc8b12a74f82ad22e8fc1c81fad4"} Apr 20 19:29:31.686758 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:31.686719 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cvbdc" event={"ID":"220904c2-fd82-44e5-9904-33aeca86dcee","Type":"ContainerStarted","Data":"644542c9ce8ad75c7f120d8ca6b2b288579a1a6fe69cf62eab7586c7133ad692"} Apr 20 19:29:31.686758 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:31.686757 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-cvbdc" event={"ID":"220904c2-fd82-44e5-9904-33aeca86dcee","Type":"ContainerStarted","Data":"24ec96983c24988d4d3a9366a70ed1b6a5338568226d755a3fbc1792403f15a9"} Apr 20 19:29:31.702761 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:29:31.702717 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-cvbdc" podStartSLOduration=252.844126667 podStartE2EDuration="4m13.702702238s" podCreationTimestamp="2026-04-20 19:25:18 +0000 UTC" firstStartedPulling="2026-04-20 19:29:29.994704415 +0000 UTC m=+252.686713335" lastFinishedPulling="2026-04-20 19:29:30.853279985 +0000 UTC m=+253.545288906" observedRunningTime="2026-04-20 19:29:31.701498617 +0000 UTC m=+254.393507547" watchObservedRunningTime="2026-04-20 19:29:31.702702238 +0000 UTC m=+254.394711178" Apr 20 19:30:01.776881 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:30:01.776843 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert\") 
pod \"networking-console-plugin-cb95c66f6-v8fl7\" (UID: \"667e99fd-c507-4e05-a425-bda15ee82168\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-v8fl7" Apr 20 19:30:01.779227 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:30:01.779205 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/667e99fd-c507-4e05-a425-bda15ee82168-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-v8fl7\" (UID: \"667e99fd-c507-4e05-a425-bda15ee82168\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-v8fl7" Apr 20 19:30:01.963785 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:30:01.963755 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-hptb4\"" Apr 20 19:30:01.971794 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:30:01.971769 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-v8fl7" Apr 20 19:30:02.097786 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:30:02.097726 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-v8fl7"] Apr 20 19:30:02.101889 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:30:02.101855 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod667e99fd_c507_4e05_a425_bda15ee82168.slice/crio-6a5608e9fe9e7bfeb531fee55882bf9a54820e688a86b735301486fe3e717bfe WatchSource:0}: Error finding container 6a5608e9fe9e7bfeb531fee55882bf9a54820e688a86b735301486fe3e717bfe: Status 404 returned error can't find the container with id 6a5608e9fe9e7bfeb531fee55882bf9a54820e688a86b735301486fe3e717bfe Apr 20 19:30:02.767147 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:30:02.767108 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-v8fl7" event={"ID":"667e99fd-c507-4e05-a425-bda15ee82168","Type":"ContainerStarted","Data":"6a5608e9fe9e7bfeb531fee55882bf9a54820e688a86b735301486fe3e717bfe"} Apr 20 19:30:03.770834 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:30:03.770796 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-v8fl7" event={"ID":"667e99fd-c507-4e05-a425-bda15ee82168","Type":"ContainerStarted","Data":"ef360359048511435603ebb7b0e098cafaa7f2522f8ac6cd1d6e90d721993ab8"} Apr 20 19:30:03.786891 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:30:03.786841 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-v8fl7" podStartSLOduration=270.723359651 podStartE2EDuration="4m31.78682418s" podCreationTimestamp="2026-04-20 19:25:32 +0000 UTC" firstStartedPulling="2026-04-20 19:30:02.103827497 +0000 UTC m=+284.795836413" lastFinishedPulling="2026-04-20 19:30:03.167292021 +0000 UTC m=+285.859300942" observedRunningTime="2026-04-20 19:30:03.785697482 +0000 UTC m=+286.477706421" watchObservedRunningTime="2026-04-20 19:30:03.78682418 +0000 UTC m=+286.478833119" Apr 20 19:30:17.884099 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:30:17.884071 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9rxk8_0481b01c-d63f-4503-aacf-fcdb030d79e9/ovn-acl-logging/0.log" Apr 20 19:30:17.884737 ip-10-0-132-159 
kubenswrapper[2570]: I0420 19:30:17.884712 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9rxk8_0481b01c-d63f-4503-aacf-fcdb030d79e9/ovn-acl-logging/0.log" Apr 20 19:30:17.894914 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:30:17.894895 2570 kubelet.go:1628] "Image garbage collection succeeded" Apr 20 19:31:19.234743 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:19.234664 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-84fzs"] Apr 20 19:31:19.235124 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:19.234917 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8fe2540-8adc-4eff-9a2b-d4fb05979bbd" containerName="registry" Apr 20 19:31:19.235124 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:19.234928 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8fe2540-8adc-4eff-9a2b-d4fb05979bbd" containerName="registry" Apr 20 19:31:19.235124 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:19.234982 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8fe2540-8adc-4eff-9a2b-d4fb05979bbd" containerName="registry" Apr 20 19:31:19.237698 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:19.237682 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-84fzs" Apr 20 19:31:19.240365 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:19.240342 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 20 19:31:19.240541 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:19.240466 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 20 19:31:19.241431 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:19.241417 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-tt6mb\"" Apr 20 19:31:19.246912 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:19.246890 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-84fzs"] Apr 20 19:31:19.295823 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:19.295786 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hfd2\" (UniqueName: \"kubernetes.io/projected/137cb307-203a-44c3-bc40-2ce6e08a8124-kube-api-access-2hfd2\") pod \"cert-manager-cainjector-68b757865b-84fzs\" (UID: \"137cb307-203a-44c3-bc40-2ce6e08a8124\") " pod="cert-manager/cert-manager-cainjector-68b757865b-84fzs" Apr 20 19:31:19.295989 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:19.295849 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/137cb307-203a-44c3-bc40-2ce6e08a8124-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-84fzs\" (UID: \"137cb307-203a-44c3-bc40-2ce6e08a8124\") " pod="cert-manager/cert-manager-cainjector-68b757865b-84fzs" Apr 20 19:31:19.396796 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:19.396768 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/137cb307-203a-44c3-bc40-2ce6e08a8124-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-84fzs\" (UID: \"137cb307-203a-44c3-bc40-2ce6e08a8124\") " 
pod="cert-manager/cert-manager-cainjector-68b757865b-84fzs" Apr 20 19:31:19.396887 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:19.396807 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2hfd2\" (UniqueName: \"kubernetes.io/projected/137cb307-203a-44c3-bc40-2ce6e08a8124-kube-api-access-2hfd2\") pod \"cert-manager-cainjector-68b757865b-84fzs\" (UID: \"137cb307-203a-44c3-bc40-2ce6e08a8124\") " pod="cert-manager/cert-manager-cainjector-68b757865b-84fzs" Apr 20 19:31:19.407845 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:19.407812 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/137cb307-203a-44c3-bc40-2ce6e08a8124-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-84fzs\" (UID: \"137cb307-203a-44c3-bc40-2ce6e08a8124\") " pod="cert-manager/cert-manager-cainjector-68b757865b-84fzs" Apr 20 19:31:19.407984 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:19.407858 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hfd2\" (UniqueName: \"kubernetes.io/projected/137cb307-203a-44c3-bc40-2ce6e08a8124-kube-api-access-2hfd2\") pod \"cert-manager-cainjector-68b757865b-84fzs\" (UID: \"137cb307-203a-44c3-bc40-2ce6e08a8124\") " pod="cert-manager/cert-manager-cainjector-68b757865b-84fzs" Apr 20 19:31:19.546268 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:19.546183 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-84fzs" Apr 20 19:31:19.668300 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:19.668275 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-84fzs"] Apr 20 19:31:19.670519 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:31:19.670487 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod137cb307_203a_44c3_bc40_2ce6e08a8124.slice/crio-1cd449a3fc68a391433882b110140f103da7576ea4431a1fce8e4215261ff81c WatchSource:0}: Error finding container 1cd449a3fc68a391433882b110140f103da7576ea4431a1fce8e4215261ff81c: Status 404 returned error can't find the container with id 1cd449a3fc68a391433882b110140f103da7576ea4431a1fce8e4215261ff81c Apr 20 19:31:19.672156 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:19.672139 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 19:31:19.965673 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:19.965637 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-84fzs" event={"ID":"137cb307-203a-44c3-bc40-2ce6e08a8124","Type":"ContainerStarted","Data":"1cd449a3fc68a391433882b110140f103da7576ea4431a1fce8e4215261ff81c"} Apr 20 19:31:22.975494 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:22.975457 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-84fzs" event={"ID":"137cb307-203a-44c3-bc40-2ce6e08a8124","Type":"ContainerStarted","Data":"d6d5d557e783200bba84a9282e68b2566cb813e7699c77a09732f494460038b8"} Apr 20 19:31:22.991619 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:22.991549 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-84fzs" podStartSLOduration=1.2877061300000001 podStartE2EDuration="3.991531427s" podCreationTimestamp="2026-04-20 
19:31:19 +0000 UTC" firstStartedPulling="2026-04-20 19:31:19.672264796 +0000 UTC m=+362.364273717" lastFinishedPulling="2026-04-20 19:31:22.376090097 +0000 UTC m=+365.068099014" observedRunningTime="2026-04-20 19:31:22.989888432 +0000 UTC m=+365.681897370" watchObservedRunningTime="2026-04-20 19:31:22.991531427 +0000 UTC m=+365.683540367" Apr 20 19:31:34.429122 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:34.429089 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-sbvvl"] Apr 20 19:31:34.432080 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:34.432064 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sbvvl" Apr 20 19:31:34.434496 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:34.434473 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 20 19:31:34.434615 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:34.434532 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-dslhn\"" Apr 20 19:31:34.435462 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:34.435441 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 20 19:31:34.439386 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:34.439365 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-sbvvl"] Apr 20 19:31:34.501606 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:34.501572 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/71df5159-6990-4676-9a0b-3dc288aa9b61-tmp\") pod \"openshift-lws-operator-bfc7f696d-sbvvl\" (UID: \"71df5159-6990-4676-9a0b-3dc288aa9b61\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sbvvl" Apr 20 19:31:34.501606 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:34.501606 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9pw6\" (UniqueName: \"kubernetes.io/projected/71df5159-6990-4676-9a0b-3dc288aa9b61-kube-api-access-p9pw6\") pod \"openshift-lws-operator-bfc7f696d-sbvvl\" (UID: \"71df5159-6990-4676-9a0b-3dc288aa9b61\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sbvvl" Apr 20 19:31:34.602482 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:34.602450 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/71df5159-6990-4676-9a0b-3dc288aa9b61-tmp\") pod \"openshift-lws-operator-bfc7f696d-sbvvl\" (UID: \"71df5159-6990-4676-9a0b-3dc288aa9b61\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sbvvl" Apr 20 19:31:34.602482 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:34.602485 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p9pw6\" (UniqueName: \"kubernetes.io/projected/71df5159-6990-4676-9a0b-3dc288aa9b61-kube-api-access-p9pw6\") pod \"openshift-lws-operator-bfc7f696d-sbvvl\" (UID: \"71df5159-6990-4676-9a0b-3dc288aa9b61\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sbvvl" Apr 20 19:31:34.602865 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:34.602846 2570 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/71df5159-6990-4676-9a0b-3dc288aa9b61-tmp\") pod \"openshift-lws-operator-bfc7f696d-sbvvl\" (UID: \"71df5159-6990-4676-9a0b-3dc288aa9b61\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sbvvl" Apr 20 19:31:34.610114 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:34.610091 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9pw6\" (UniqueName: \"kubernetes.io/projected/71df5159-6990-4676-9a0b-3dc288aa9b61-kube-api-access-p9pw6\") pod \"openshift-lws-operator-bfc7f696d-sbvvl\" (UID: \"71df5159-6990-4676-9a0b-3dc288aa9b61\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sbvvl" Apr 20 19:31:34.741421 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:34.741392 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sbvvl" Apr 20 19:31:34.858063 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:34.858026 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-sbvvl"] Apr 20 19:31:34.861643 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:31:34.861611 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod71df5159_6990_4676_9a0b_3dc288aa9b61.slice/crio-e4e15883cf776d2136930e60c05a8ae97714e3f90ad019a86b647161d6d0238c WatchSource:0}: Error finding container e4e15883cf776d2136930e60c05a8ae97714e3f90ad019a86b647161d6d0238c: Status 404 returned error can't find the container with id e4e15883cf776d2136930e60c05a8ae97714e3f90ad019a86b647161d6d0238c Apr 20 19:31:35.003793 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:35.003708 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sbvvl" event={"ID":"71df5159-6990-4676-9a0b-3dc288aa9b61","Type":"ContainerStarted","Data":"e4e15883cf776d2136930e60c05a8ae97714e3f90ad019a86b647161d6d0238c"} Apr 20 19:31:38.013043 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:38.013013 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sbvvl" event={"ID":"71df5159-6990-4676-9a0b-3dc288aa9b61","Type":"ContainerStarted","Data":"21d9984c8a81ca67d11daa123a3b66c508d8a843b6178421b75ed70a5a4091c3"} Apr 20 19:31:38.032157 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:38.032097 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-sbvvl" podStartSLOduration=1.441430625 podStartE2EDuration="4.032078741s" podCreationTimestamp="2026-04-20 19:31:34 +0000 UTC" firstStartedPulling="2026-04-20 19:31:34.863105051 +0000 UTC m=+377.555113968" lastFinishedPulling="2026-04-20 19:31:37.453753165 +0000 UTC m=+380.145762084" observedRunningTime="2026-04-20 19:31:38.030380299 +0000 UTC m=+380.722389238" watchObservedRunningTime="2026-04-20 19:31:38.032078741 +0000 UTC m=+380.724087680" Apr 20 19:31:53.801590 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:53.801540 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-9f747d685-4dr77"] Apr 20 19:31:53.804881 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:53.804860 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-4dr77" Apr 20 19:31:53.807877 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:53.807854 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 20 19:31:53.808002 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:53.807877 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 20 19:31:53.808002 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:53.807959 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 20 19:31:53.808002 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:53.807965 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-cx2tl\"" Apr 20 19:31:53.808237 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:53.808220 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 20 19:31:53.826422 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:53.826396 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-9f747d685-4dr77"] Apr 20 19:31:53.939026 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:53.938984 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9dpl\" (UniqueName: \"kubernetes.io/projected/eaded8f8-1e1e-450b-96ba-62ad59fcfd88-kube-api-access-b9dpl\") pod \"opendatahub-operator-controller-manager-9f747d685-4dr77\" (UID: \"eaded8f8-1e1e-450b-96ba-62ad59fcfd88\") " pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-4dr77" Apr 20 19:31:53.939320 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:53.939285 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eaded8f8-1e1e-450b-96ba-62ad59fcfd88-apiservice-cert\") pod \"opendatahub-operator-controller-manager-9f747d685-4dr77\" (UID: \"eaded8f8-1e1e-450b-96ba-62ad59fcfd88\") " pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-4dr77" Apr 20 19:31:53.939460 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:53.939369 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eaded8f8-1e1e-450b-96ba-62ad59fcfd88-webhook-cert\") pod \"opendatahub-operator-controller-manager-9f747d685-4dr77\" (UID: \"eaded8f8-1e1e-450b-96ba-62ad59fcfd88\") " pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-4dr77" Apr 20 19:31:54.040103 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:54.040069 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eaded8f8-1e1e-450b-96ba-62ad59fcfd88-webhook-cert\") pod \"opendatahub-operator-controller-manager-9f747d685-4dr77\" (UID: \"eaded8f8-1e1e-450b-96ba-62ad59fcfd88\") " pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-4dr77" Apr 20 19:31:54.040280 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:54.040120 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b9dpl\" (UniqueName: 
\"kubernetes.io/projected/eaded8f8-1e1e-450b-96ba-62ad59fcfd88-kube-api-access-b9dpl\") pod \"opendatahub-operator-controller-manager-9f747d685-4dr77\" (UID: \"eaded8f8-1e1e-450b-96ba-62ad59fcfd88\") " pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-4dr77" Apr 20 19:31:54.040280 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:54.040144 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eaded8f8-1e1e-450b-96ba-62ad59fcfd88-apiservice-cert\") pod \"opendatahub-operator-controller-manager-9f747d685-4dr77\" (UID: \"eaded8f8-1e1e-450b-96ba-62ad59fcfd88\") " pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-4dr77" Apr 20 19:31:54.042547 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:54.042512 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eaded8f8-1e1e-450b-96ba-62ad59fcfd88-apiservice-cert\") pod \"opendatahub-operator-controller-manager-9f747d685-4dr77\" (UID: \"eaded8f8-1e1e-450b-96ba-62ad59fcfd88\") " pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-4dr77" Apr 20 19:31:54.042681 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:54.042581 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eaded8f8-1e1e-450b-96ba-62ad59fcfd88-webhook-cert\") pod \"opendatahub-operator-controller-manager-9f747d685-4dr77\" (UID: \"eaded8f8-1e1e-450b-96ba-62ad59fcfd88\") " pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-4dr77" Apr 20 19:31:54.049149 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:54.049126 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9dpl\" (UniqueName: \"kubernetes.io/projected/eaded8f8-1e1e-450b-96ba-62ad59fcfd88-kube-api-access-b9dpl\") pod \"opendatahub-operator-controller-manager-9f747d685-4dr77\" (UID: \"eaded8f8-1e1e-450b-96ba-62ad59fcfd88\") " pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-4dr77" Apr 20 19:31:54.115838 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:54.115752 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-4dr77" Apr 20 19:31:54.253746 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:54.253712 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-9f747d685-4dr77"] Apr 20 19:31:54.256945 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:31:54.256921 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeaded8f8_1e1e_450b_96ba_62ad59fcfd88.slice/crio-8775abbb2e7d9155e60f54829560e1f258076d25de43677439b4cb388af5e4f2 WatchSource:0}: Error finding container 8775abbb2e7d9155e60f54829560e1f258076d25de43677439b4cb388af5e4f2: Status 404 returned error can't find the container with id 8775abbb2e7d9155e60f54829560e1f258076d25de43677439b4cb388af5e4f2 Apr 20 19:31:55.056862 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:55.056802 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-4dr77" event={"ID":"eaded8f8-1e1e-450b-96ba-62ad59fcfd88","Type":"ContainerStarted","Data":"8775abbb2e7d9155e60f54829560e1f258076d25de43677439b4cb388af5e4f2"} Apr 20 19:31:57.063044 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:57.063008 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-4dr77" event={"ID":"eaded8f8-1e1e-450b-96ba-62ad59fcfd88","Type":"ContainerStarted","Data":"71e20cf48e163461aff8b388ac8c02624f13500bdfc33380c98791894abf6502"} Apr 20 19:31:57.063435 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:57.063158 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-4dr77" Apr 20 19:31:57.084304 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:31:57.084255 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-4dr77" podStartSLOduration=1.470432443 podStartE2EDuration="4.084239867s" podCreationTimestamp="2026-04-20 19:31:53 +0000 UTC" firstStartedPulling="2026-04-20 19:31:54.258897588 +0000 UTC m=+396.950906508" lastFinishedPulling="2026-04-20 19:31:56.872705001 +0000 UTC m=+399.564713932" observedRunningTime="2026-04-20 19:31:57.082895186 +0000 UTC m=+399.774904127" watchObservedRunningTime="2026-04-20 19:31:57.084239867 +0000 UTC m=+399.776248805" Apr 20 19:32:08.068328 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:08.068298 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-9f747d685-4dr77" Apr 20 19:32:14.330926 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:14.330892 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-q5v8h"] Apr 20 19:32:14.337301 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:14.337278 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-q5v8h" Apr 20 19:32:14.339841 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:14.339810 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-56szw\"" Apr 20 19:32:14.339970 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:14.339812 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 20 19:32:14.342387 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:14.342366 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-q5v8h"] Apr 20 19:32:14.492720 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:14.492673 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7dcae940-547e-4ab6-bcf6-681672895e3e-cert\") pod \"odh-model-controller-858dbf95b8-q5v8h\" (UID: \"7dcae940-547e-4ab6-bcf6-681672895e3e\") " pod="opendatahub/odh-model-controller-858dbf95b8-q5v8h" Apr 20 19:32:14.492908 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:14.492748 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nck7m\" (UniqueName: \"kubernetes.io/projected/7dcae940-547e-4ab6-bcf6-681672895e3e-kube-api-access-nck7m\") pod \"odh-model-controller-858dbf95b8-q5v8h\" (UID: \"7dcae940-547e-4ab6-bcf6-681672895e3e\") " pod="opendatahub/odh-model-controller-858dbf95b8-q5v8h" Apr 20 19:32:14.593296 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:14.593202 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7dcae940-547e-4ab6-bcf6-681672895e3e-cert\") pod \"odh-model-controller-858dbf95b8-q5v8h\" (UID: \"7dcae940-547e-4ab6-bcf6-681672895e3e\") " pod="opendatahub/odh-model-controller-858dbf95b8-q5v8h" Apr 20 19:32:14.593296 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:14.593283 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nck7m\" (UniqueName: \"kubernetes.io/projected/7dcae940-547e-4ab6-bcf6-681672895e3e-kube-api-access-nck7m\") pod \"odh-model-controller-858dbf95b8-q5v8h\" (UID: \"7dcae940-547e-4ab6-bcf6-681672895e3e\") " pod="opendatahub/odh-model-controller-858dbf95b8-q5v8h" Apr 20 19:32:14.593520 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:32:14.593361 2570 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 20 19:32:14.593520 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:32:14.593437 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7dcae940-547e-4ab6-bcf6-681672895e3e-cert podName:7dcae940-547e-4ab6-bcf6-681672895e3e nodeName:}" failed. No retries permitted until 2026-04-20 19:32:15.093415398 +0000 UTC m=+417.785424321 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7dcae940-547e-4ab6-bcf6-681672895e3e-cert") pod "odh-model-controller-858dbf95b8-q5v8h" (UID: "7dcae940-547e-4ab6-bcf6-681672895e3e") : secret "odh-model-controller-webhook-cert" not found Apr 20 19:32:14.602859 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:14.602826 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nck7m\" (UniqueName: \"kubernetes.io/projected/7dcae940-547e-4ab6-bcf6-681672895e3e-kube-api-access-nck7m\") pod \"odh-model-controller-858dbf95b8-q5v8h\" (UID: \"7dcae940-547e-4ab6-bcf6-681672895e3e\") " pod="opendatahub/odh-model-controller-858dbf95b8-q5v8h" Apr 20 19:32:15.097177 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:15.097137 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7dcae940-547e-4ab6-bcf6-681672895e3e-cert\") pod \"odh-model-controller-858dbf95b8-q5v8h\" (UID: \"7dcae940-547e-4ab6-bcf6-681672895e3e\") " pod="opendatahub/odh-model-controller-858dbf95b8-q5v8h" Apr 20 19:32:15.099866 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:15.099834 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7dcae940-547e-4ab6-bcf6-681672895e3e-cert\") pod \"odh-model-controller-858dbf95b8-q5v8h\" (UID: \"7dcae940-547e-4ab6-bcf6-681672895e3e\") " pod="opendatahub/odh-model-controller-858dbf95b8-q5v8h" Apr 20 19:32:15.247699 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:15.247659 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-q5v8h" Apr 20 19:32:15.368717 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:15.368639 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-q5v8h"] Apr 20 19:32:15.371625 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:32:15.371598 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7dcae940_547e_4ab6_bcf6_681672895e3e.slice/crio-c2d04977b2662fca8da98eb606c44c32f27e306738bceb4025ff66f2eb8e089c WatchSource:0}: Error finding container c2d04977b2662fca8da98eb606c44c32f27e306738bceb4025ff66f2eb8e089c: Status 404 returned error can't find the container with id c2d04977b2662fca8da98eb606c44c32f27e306738bceb4025ff66f2eb8e089c Apr 20 19:32:16.118610 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:16.118569 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-q5v8h" event={"ID":"7dcae940-547e-4ab6-bcf6-681672895e3e","Type":"ContainerStarted","Data":"c2d04977b2662fca8da98eb606c44c32f27e306738bceb4025ff66f2eb8e089c"} Apr 20 19:32:18.126265 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:18.126230 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-q5v8h" event={"ID":"7dcae940-547e-4ab6-bcf6-681672895e3e","Type":"ContainerStarted","Data":"605f6366d3b1e1798f28ee035f78eedc12e8af3a131e6c789ba3f34c34f8c621"} Apr 20 19:32:18.126655 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:18.126349 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-q5v8h" Apr 20 19:32:18.152092 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:18.152041 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="opendatahub/odh-model-controller-858dbf95b8-q5v8h" podStartSLOduration=1.512605814 podStartE2EDuration="4.152025031s" podCreationTimestamp="2026-04-20 19:32:14 +0000 UTC" firstStartedPulling="2026-04-20 19:32:15.372893211 +0000 UTC m=+418.064902131" lastFinishedPulling="2026-04-20 19:32:18.012312423 +0000 UTC m=+420.704321348" observedRunningTime="2026-04-20 19:32:18.150014674 +0000 UTC m=+420.842023615" watchObservedRunningTime="2026-04-20 19:32:18.152025031 +0000 UTC m=+420.844033970" Apr 20 19:32:19.129879 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:19.129847 2570 generic.go:358] "Generic (PLEG): container finished" podID="7dcae940-547e-4ab6-bcf6-681672895e3e" containerID="605f6366d3b1e1798f28ee035f78eedc12e8af3a131e6c789ba3f34c34f8c621" exitCode=1 Apr 20 19:32:19.130274 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:19.129895 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-q5v8h" event={"ID":"7dcae940-547e-4ab6-bcf6-681672895e3e","Type":"ContainerDied","Data":"605f6366d3b1e1798f28ee035f78eedc12e8af3a131e6c789ba3f34c34f8c621"} Apr 20 19:32:19.130274 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:19.130149 2570 scope.go:117] "RemoveContainer" containerID="605f6366d3b1e1798f28ee035f78eedc12e8af3a131e6c789ba3f34c34f8c621" Apr 20 19:32:20.136237 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:20.136199 2570 generic.go:358] "Generic (PLEG): container finished" podID="7dcae940-547e-4ab6-bcf6-681672895e3e" containerID="60d6fade339410d24cfeeb8ec03a5aaf42a8da6363c16c30362c785b6f4901ed" exitCode=1 Apr 20 19:32:20.136723 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:20.136256 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-q5v8h" event={"ID":"7dcae940-547e-4ab6-bcf6-681672895e3e","Type":"ContainerDied","Data":"60d6fade339410d24cfeeb8ec03a5aaf42a8da6363c16c30362c785b6f4901ed"} Apr 20 19:32:20.136723 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:20.136294 2570 scope.go:117] "RemoveContainer" containerID="605f6366d3b1e1798f28ee035f78eedc12e8af3a131e6c789ba3f34c34f8c621" Apr 20 19:32:20.136723 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:20.136593 2570 scope.go:117] "RemoveContainer" containerID="60d6fade339410d24cfeeb8ec03a5aaf42a8da6363c16c30362c785b6f4901ed" Apr 20 19:32:20.136873 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:32:20.136822 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-q5v8h_opendatahub(7dcae940-547e-4ab6-bcf6-681672895e3e)\"" pod="opendatahub/odh-model-controller-858dbf95b8-q5v8h" podUID="7dcae940-547e-4ab6-bcf6-681672895e3e" Apr 20 19:32:20.321642 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:20.321605 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-rkv9n"] Apr 20 19:32:20.325878 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:20.325856 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-rkv9n" Apr 20 19:32:20.328256 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:20.328235 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\"" Apr 20 19:32:20.328370 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:20.328311 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-7nfb5\"" Apr 20 19:32:20.335012 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:20.334990 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-rkv9n"] Apr 20 19:32:20.443014 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:20.442989 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wzhpw\" (UniqueName: \"kubernetes.io/projected/c72c7707-c35d-4ef8-b575-656cc958dc1c-kube-api-access-wzhpw\") pod \"kserve-controller-manager-856948b99f-rkv9n\" (UID: \"c72c7707-c35d-4ef8-b575-656cc958dc1c\") " pod="opendatahub/kserve-controller-manager-856948b99f-rkv9n" Apr 20 19:32:20.443144 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:20.443049 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c72c7707-c35d-4ef8-b575-656cc958dc1c-cert\") pod \"kserve-controller-manager-856948b99f-rkv9n\" (UID: \"c72c7707-c35d-4ef8-b575-656cc958dc1c\") " pod="opendatahub/kserve-controller-manager-856948b99f-rkv9n" Apr 20 19:32:20.544377 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:20.544346 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wzhpw\" (UniqueName: \"kubernetes.io/projected/c72c7707-c35d-4ef8-b575-656cc958dc1c-kube-api-access-wzhpw\") pod \"kserve-controller-manager-856948b99f-rkv9n\" (UID: \"c72c7707-c35d-4ef8-b575-656cc958dc1c\") " pod="opendatahub/kserve-controller-manager-856948b99f-rkv9n" Apr 20 19:32:20.544542 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:20.544397 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c72c7707-c35d-4ef8-b575-656cc958dc1c-cert\") pod \"kserve-controller-manager-856948b99f-rkv9n\" (UID: \"c72c7707-c35d-4ef8-b575-656cc958dc1c\") " pod="opendatahub/kserve-controller-manager-856948b99f-rkv9n" Apr 20 19:32:20.546774 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:20.546752 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c72c7707-c35d-4ef8-b575-656cc958dc1c-cert\") pod \"kserve-controller-manager-856948b99f-rkv9n\" (UID: \"c72c7707-c35d-4ef8-b575-656cc958dc1c\") " pod="opendatahub/kserve-controller-manager-856948b99f-rkv9n" Apr 20 19:32:20.552598 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:20.552551 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wzhpw\" (UniqueName: \"kubernetes.io/projected/c72c7707-c35d-4ef8-b575-656cc958dc1c-kube-api-access-wzhpw\") pod \"kserve-controller-manager-856948b99f-rkv9n\" (UID: \"c72c7707-c35d-4ef8-b575-656cc958dc1c\") " pod="opendatahub/kserve-controller-manager-856948b99f-rkv9n" Apr 20 19:32:20.638953 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:20.638920 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-rkv9n" Apr 20 19:32:20.755035 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:20.754999 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-rkv9n"] Apr 20 19:32:20.758929 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:32:20.758901 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc72c7707_c35d_4ef8_b575_656cc958dc1c.slice/crio-c2282ca996b92c758b13401e01dc25236643426518ddc3f2b38b6ce2b9d49230 WatchSource:0}: Error finding container c2282ca996b92c758b13401e01dc25236643426518ddc3f2b38b6ce2b9d49230: Status 404 returned error can't find the container with id c2282ca996b92c758b13401e01dc25236643426518ddc3f2b38b6ce2b9d49230 Apr 20 19:32:21.140737 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:21.140649 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-rkv9n" event={"ID":"c72c7707-c35d-4ef8-b575-656cc958dc1c","Type":"ContainerStarted","Data":"c2282ca996b92c758b13401e01dc25236643426518ddc3f2b38b6ce2b9d49230"} Apr 20 19:32:21.142194 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:21.142173 2570 scope.go:117] "RemoveContainer" containerID="60d6fade339410d24cfeeb8ec03a5aaf42a8da6363c16c30362c785b6f4901ed" Apr 20 19:32:21.142355 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:32:21.142337 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-q5v8h_opendatahub(7dcae940-547e-4ab6-bcf6-681672895e3e)\"" pod="opendatahub/odh-model-controller-858dbf95b8-q5v8h" podUID="7dcae940-547e-4ab6-bcf6-681672895e3e" Apr 20 19:32:21.732957 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:21.732921 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-79f76cb8cc-vj77z"] Apr 20 19:32:21.736152 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:21.736127 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-vj77z" Apr 20 19:32:21.738839 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:21.738813 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 20 19:32:21.738976 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:21.738858 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 20 19:32:21.739709 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:21.739686 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-blzjv\"" Apr 20 19:32:21.739902 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:21.739690 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 20 19:32:21.739902 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:21.739691 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 20 19:32:21.747372 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:21.747350 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-79f76cb8cc-vj77z"] Apr 20 19:32:21.855542 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:21.855505 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f3399bac-03c7-47b5-8e0f-701d3d618016-tls-certs\") pod \"kube-auth-proxy-79f76cb8cc-vj77z\" (UID: \"f3399bac-03c7-47b5-8e0f-701d3d618016\") " pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-vj77z" Apr 20 19:32:21.855728 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:21.855616 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6mhvr\" (UniqueName: \"kubernetes.io/projected/f3399bac-03c7-47b5-8e0f-701d3d618016-kube-api-access-6mhvr\") pod \"kube-auth-proxy-79f76cb8cc-vj77z\" (UID: \"f3399bac-03c7-47b5-8e0f-701d3d618016\") " pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-vj77z" Apr 20 19:32:21.855728 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:21.855671 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f3399bac-03c7-47b5-8e0f-701d3d618016-tmp\") pod \"kube-auth-proxy-79f76cb8cc-vj77z\" (UID: \"f3399bac-03c7-47b5-8e0f-701d3d618016\") " pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-vj77z" Apr 20 19:32:21.956619 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:21.956581 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f3399bac-03c7-47b5-8e0f-701d3d618016-tls-certs\") pod \"kube-auth-proxy-79f76cb8cc-vj77z\" (UID: \"f3399bac-03c7-47b5-8e0f-701d3d618016\") " pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-vj77z" Apr 20 19:32:21.956820 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:21.956642 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6mhvr\" (UniqueName: \"kubernetes.io/projected/f3399bac-03c7-47b5-8e0f-701d3d618016-kube-api-access-6mhvr\") pod \"kube-auth-proxy-79f76cb8cc-vj77z\" (UID: \"f3399bac-03c7-47b5-8e0f-701d3d618016\") " pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-vj77z" Apr 20 19:32:21.956820 ip-10-0-132-159 kubenswrapper[2570]: 
I0420 19:32:21.956674 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f3399bac-03c7-47b5-8e0f-701d3d618016-tmp\") pod \"kube-auth-proxy-79f76cb8cc-vj77z\" (UID: \"f3399bac-03c7-47b5-8e0f-701d3d618016\") " pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-vj77z" Apr 20 19:32:21.956820 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:32:21.956756 2570 secret.go:189] Couldn't get secret openshift-ingress/kube-auth-proxy-tls: secret "kube-auth-proxy-tls" not found Apr 20 19:32:21.956981 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:32:21.956838 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3399bac-03c7-47b5-8e0f-701d3d618016-tls-certs podName:f3399bac-03c7-47b5-8e0f-701d3d618016 nodeName:}" failed. No retries permitted until 2026-04-20 19:32:22.45681477 +0000 UTC m=+425.148823687 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/f3399bac-03c7-47b5-8e0f-701d3d618016-tls-certs") pod "kube-auth-proxy-79f76cb8cc-vj77z" (UID: "f3399bac-03c7-47b5-8e0f-701d3d618016") : secret "kube-auth-proxy-tls" not found Apr 20 19:32:21.958971 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:21.958948 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/f3399bac-03c7-47b5-8e0f-701d3d618016-tmp\") pod \"kube-auth-proxy-79f76cb8cc-vj77z\" (UID: \"f3399bac-03c7-47b5-8e0f-701d3d618016\") " pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-vj77z" Apr 20 19:32:21.969294 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:21.969265 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6mhvr\" (UniqueName: \"kubernetes.io/projected/f3399bac-03c7-47b5-8e0f-701d3d618016-kube-api-access-6mhvr\") pod \"kube-auth-proxy-79f76cb8cc-vj77z\" (UID: \"f3399bac-03c7-47b5-8e0f-701d3d618016\") " pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-vj77z" Apr 20 19:32:22.460694 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:22.460653 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f3399bac-03c7-47b5-8e0f-701d3d618016-tls-certs\") pod \"kube-auth-proxy-79f76cb8cc-vj77z\" (UID: \"f3399bac-03c7-47b5-8e0f-701d3d618016\") " pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-vj77z" Apr 20 19:32:22.463575 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:22.463532 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f3399bac-03c7-47b5-8e0f-701d3d618016-tls-certs\") pod \"kube-auth-proxy-79f76cb8cc-vj77z\" (UID: \"f3399bac-03c7-47b5-8e0f-701d3d618016\") " pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-vj77z" Apr 20 19:32:22.648876 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:22.648834 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-vj77z" Apr 20 19:32:23.141286 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:23.141258 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-79f76cb8cc-vj77z"] Apr 20 19:32:23.144647 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:32:23.144596 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3399bac_03c7_47b5_8e0f_701d3d618016.slice/crio-0ce85e9118fb5a97c6bdc849a855868a3819f6e321c323e7409defb5d3997769 WatchSource:0}: Error finding container 0ce85e9118fb5a97c6bdc849a855868a3819f6e321c323e7409defb5d3997769: Status 404 returned error can't find the container with id 0ce85e9118fb5a97c6bdc849a855868a3819f6e321c323e7409defb5d3997769 Apr 20 19:32:24.153242 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:24.153200 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-rkv9n" event={"ID":"c72c7707-c35d-4ef8-b575-656cc958dc1c","Type":"ContainerStarted","Data":"389a78f1e0d342a3e340946956a562cd0776307e93c5731c137b08cc00dccd11"} Apr 20 19:32:24.153723 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:24.153345 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-rkv9n" Apr 20 19:32:24.154366 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:24.154338 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-vj77z" event={"ID":"f3399bac-03c7-47b5-8e0f-701d3d618016","Type":"ContainerStarted","Data":"0ce85e9118fb5a97c6bdc849a855868a3819f6e321c323e7409defb5d3997769"} Apr 20 19:32:24.170463 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:24.170411 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-rkv9n" podStartSLOduration=1.858178107 podStartE2EDuration="4.170397396s" podCreationTimestamp="2026-04-20 19:32:20 +0000 UTC" firstStartedPulling="2026-04-20 19:32:20.76027138 +0000 UTC m=+423.452280299" lastFinishedPulling="2026-04-20 19:32:23.072490667 +0000 UTC m=+425.764499588" observedRunningTime="2026-04-20 19:32:24.168593081 +0000 UTC m=+426.860602023" watchObservedRunningTime="2026-04-20 19:32:24.170397396 +0000 UTC m=+426.862406334" Apr 20 19:32:26.162339 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:26.162214 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-vj77z" event={"ID":"f3399bac-03c7-47b5-8e0f-701d3d618016","Type":"ContainerStarted","Data":"c3fcbc6b0d4e0452ef5196f17e08ea24f8fd769d781331015d38feb6ff27d55b"} Apr 20 19:32:26.177264 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:26.177215 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-79f76cb8cc-vj77z" podStartSLOduration=2.435489705 podStartE2EDuration="5.177197964s" podCreationTimestamp="2026-04-20 19:32:21 +0000 UTC" firstStartedPulling="2026-04-20 19:32:23.146944092 +0000 UTC m=+425.838953027" lastFinishedPulling="2026-04-20 19:32:25.888652356 +0000 UTC m=+428.580661286" observedRunningTime="2026-04-20 19:32:26.176674754 +0000 UTC m=+428.868683693" watchObservedRunningTime="2026-04-20 19:32:26.177197964 +0000 UTC m=+428.869206928" Apr 20 19:32:28.127129 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:28.127098 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-q5v8h" Apr 20 19:32:28.127549 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:28.127516 2570 scope.go:117] "RemoveContainer" containerID="60d6fade339410d24cfeeb8ec03a5aaf42a8da6363c16c30362c785b6f4901ed" Apr 20 19:32:28.127772 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:32:28.127753 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-q5v8h_opendatahub(7dcae940-547e-4ab6-bcf6-681672895e3e)\"" pod="opendatahub/odh-model-controller-858dbf95b8-q5v8h" podUID="7dcae940-547e-4ab6-bcf6-681672895e3e" Apr 20 19:32:35.248408 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:35.248354 2570 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-q5v8h" Apr 20 19:32:35.248846 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:35.248789 2570 scope.go:117] "RemoveContainer" containerID="60d6fade339410d24cfeeb8ec03a5aaf42a8da6363c16c30362c785b6f4901ed" Apr 20 19:32:36.191862 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:36.191826 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-q5v8h" event={"ID":"7dcae940-547e-4ab6-bcf6-681672895e3e","Type":"ContainerStarted","Data":"4617f68e927365ded351a8fea4de1206b4c286c0fc98737450e8eb9ba2417ac7"} Apr 20 19:32:36.192071 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:36.192055 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-q5v8h" Apr 20 19:32:36.875175 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:36.875135 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-v2c9g"] Apr 20 19:32:36.878738 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:36.878720 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-v2c9g" Apr 20 19:32:36.881792 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:36.881772 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 20 19:32:36.882302 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:36.882285 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 20 19:32:36.882302 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:36.882293 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-r9q5w\"" Apr 20 19:32:36.894984 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:36.894962 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-v2c9g"] Apr 20 19:32:37.072804 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:37.072770 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/67b3b2b4-3bf7-41ec-a3d1-f37face28b76-operator-config\") pod \"servicemesh-operator3-55f49c5f94-v2c9g\" (UID: \"67b3b2b4-3bf7-41ec-a3d1-f37face28b76\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-v2c9g" Apr 20 19:32:37.072972 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:37.072823 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p26fp\" (UniqueName: \"kubernetes.io/projected/67b3b2b4-3bf7-41ec-a3d1-f37face28b76-kube-api-access-p26fp\") pod \"servicemesh-operator3-55f49c5f94-v2c9g\" (UID: \"67b3b2b4-3bf7-41ec-a3d1-f37face28b76\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-v2c9g" Apr 20 19:32:37.174281 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:37.174195 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p26fp\" (UniqueName: \"kubernetes.io/projected/67b3b2b4-3bf7-41ec-a3d1-f37face28b76-kube-api-access-p26fp\") pod \"servicemesh-operator3-55f49c5f94-v2c9g\" (UID: \"67b3b2b4-3bf7-41ec-a3d1-f37face28b76\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-v2c9g" Apr 20 19:32:37.174281 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:37.174263 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/67b3b2b4-3bf7-41ec-a3d1-f37face28b76-operator-config\") pod \"servicemesh-operator3-55f49c5f94-v2c9g\" (UID: \"67b3b2b4-3bf7-41ec-a3d1-f37face28b76\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-v2c9g" Apr 20 19:32:37.176760 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:37.176732 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/67b3b2b4-3bf7-41ec-a3d1-f37face28b76-operator-config\") pod \"servicemesh-operator3-55f49c5f94-v2c9g\" (UID: \"67b3b2b4-3bf7-41ec-a3d1-f37face28b76\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-v2c9g" Apr 20 19:32:37.183045 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:37.183016 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p26fp\" (UniqueName: \"kubernetes.io/projected/67b3b2b4-3bf7-41ec-a3d1-f37face28b76-kube-api-access-p26fp\") pod \"servicemesh-operator3-55f49c5f94-v2c9g\" (UID: 
\"67b3b2b4-3bf7-41ec-a3d1-f37face28b76\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-v2c9g" Apr 20 19:32:37.187712 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:37.187649 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-v2c9g" Apr 20 19:32:37.317356 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:37.317250 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-v2c9g"] Apr 20 19:32:38.199185 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:38.199143 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-v2c9g" event={"ID":"67b3b2b4-3bf7-41ec-a3d1-f37face28b76","Type":"ContainerStarted","Data":"bfb6a0c54a24bf446dbd7f80ee6b532331a0e62e72ca4aa473efe855123dd21a"} Apr 20 19:32:40.207004 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:40.206964 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-v2c9g" event={"ID":"67b3b2b4-3bf7-41ec-a3d1-f37face28b76","Type":"ContainerStarted","Data":"d5a04e2fa3714b8df911b4fed9f5014e55be6a747322c2c1f7db124efc8ca985"} Apr 20 19:32:40.207514 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:40.207024 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-v2c9g" Apr 20 19:32:40.228187 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:40.228136 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-v2c9g" podStartSLOduration=2.029282837 podStartE2EDuration="4.228120782s" podCreationTimestamp="2026-04-20 19:32:36 +0000 UTC" firstStartedPulling="2026-04-20 19:32:37.324349684 +0000 UTC m=+440.016358607" lastFinishedPulling="2026-04-20 19:32:39.523187624 +0000 UTC m=+442.215196552" observedRunningTime="2026-04-20 19:32:40.226425566 +0000 UTC m=+442.918434505" watchObservedRunningTime="2026-04-20 19:32:40.228120782 +0000 UTC m=+442.920129721" Apr 20 19:32:47.197682 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:47.197651 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-q5v8h" Apr 20 19:32:51.212411 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:51.212382 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-v2c9g" Apr 20 19:32:53.465489 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.465452 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz"] Apr 20 19:32:53.472909 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.472885 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" Apr 20 19:32:53.475594 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.475553 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 20 19:32:53.475730 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.475605 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 20 19:32:53.475730 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.475630 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 20 19:32:53.475730 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.475645 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-x7ls7\"" Apr 20 19:32:53.475730 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.475694 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 20 19:32:53.486483 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.486460 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz"] Apr 20 19:32:53.491133 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.491108 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/8d1b4965-d776-4148-a045-ff9e9d1ac69f-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-zdbjz\" (UID: \"8d1b4965-d776-4148-a045-ff9e9d1ac69f\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" Apr 20 19:32:53.491383 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.491365 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/8d1b4965-d776-4148-a045-ff9e9d1ac69f-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-zdbjz\" (UID: \"8d1b4965-d776-4148-a045-ff9e9d1ac69f\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" Apr 20 19:32:53.491532 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.491517 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/8d1b4965-d776-4148-a045-ff9e9d1ac69f-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-zdbjz\" (UID: \"8d1b4965-d776-4148-a045-ff9e9d1ac69f\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" Apr 20 19:32:53.491697 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.491681 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r7mmq\" (UniqueName: \"kubernetes.io/projected/8d1b4965-d776-4148-a045-ff9e9d1ac69f-kube-api-access-r7mmq\") pod \"istiod-openshift-gateway-55ff986f96-zdbjz\" (UID: \"8d1b4965-d776-4148-a045-ff9e9d1ac69f\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" Apr 20 19:32:53.491835 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.491811 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8d1b4965-d776-4148-a045-ff9e9d1ac69f-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-zdbjz\" (UID: 
\"8d1b4965-d776-4148-a045-ff9e9d1ac69f\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" Apr 20 19:32:53.491955 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.491857 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/8d1b4965-d776-4148-a045-ff9e9d1ac69f-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-zdbjz\" (UID: \"8d1b4965-d776-4148-a045-ff9e9d1ac69f\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" Apr 20 19:32:53.491955 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.491903 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/8d1b4965-d776-4148-a045-ff9e9d1ac69f-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-zdbjz\" (UID: \"8d1b4965-d776-4148-a045-ff9e9d1ac69f\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" Apr 20 19:32:53.592891 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.592857 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/8d1b4965-d776-4148-a045-ff9e9d1ac69f-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-zdbjz\" (UID: \"8d1b4965-d776-4148-a045-ff9e9d1ac69f\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" Apr 20 19:32:53.593094 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.592906 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/8d1b4965-d776-4148-a045-ff9e9d1ac69f-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-zdbjz\" (UID: \"8d1b4965-d776-4148-a045-ff9e9d1ac69f\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" Apr 20 19:32:53.593094 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.592927 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/8d1b4965-d776-4148-a045-ff9e9d1ac69f-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-zdbjz\" (UID: \"8d1b4965-d776-4148-a045-ff9e9d1ac69f\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" Apr 20 19:32:53.593094 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.592948 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r7mmq\" (UniqueName: \"kubernetes.io/projected/8d1b4965-d776-4148-a045-ff9e9d1ac69f-kube-api-access-r7mmq\") pod \"istiod-openshift-gateway-55ff986f96-zdbjz\" (UID: \"8d1b4965-d776-4148-a045-ff9e9d1ac69f\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" Apr 20 19:32:53.593094 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.592972 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8d1b4965-d776-4148-a045-ff9e9d1ac69f-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-zdbjz\" (UID: \"8d1b4965-d776-4148-a045-ff9e9d1ac69f\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" Apr 20 19:32:53.593094 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.593008 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/8d1b4965-d776-4148-a045-ff9e9d1ac69f-local-certs\") pod 
\"istiod-openshift-gateway-55ff986f96-zdbjz\" (UID: \"8d1b4965-d776-4148-a045-ff9e9d1ac69f\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" Apr 20 19:32:53.593094 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.593030 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/8d1b4965-d776-4148-a045-ff9e9d1ac69f-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-zdbjz\" (UID: \"8d1b4965-d776-4148-a045-ff9e9d1ac69f\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" Apr 20 19:32:53.593663 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.593640 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/8d1b4965-d776-4148-a045-ff9e9d1ac69f-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-zdbjz\" (UID: \"8d1b4965-d776-4148-a045-ff9e9d1ac69f\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" Apr 20 19:32:53.595427 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.595398 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/8d1b4965-d776-4148-a045-ff9e9d1ac69f-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-zdbjz\" (UID: \"8d1b4965-d776-4148-a045-ff9e9d1ac69f\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" Apr 20 19:32:53.595626 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.595607 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/8d1b4965-d776-4148-a045-ff9e9d1ac69f-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-zdbjz\" (UID: \"8d1b4965-d776-4148-a045-ff9e9d1ac69f\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" Apr 20 19:32:53.595777 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.595762 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/8d1b4965-d776-4148-a045-ff9e9d1ac69f-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-zdbjz\" (UID: \"8d1b4965-d776-4148-a045-ff9e9d1ac69f\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" Apr 20 19:32:53.595873 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.595852 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/8d1b4965-d776-4148-a045-ff9e9d1ac69f-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-zdbjz\" (UID: \"8d1b4965-d776-4148-a045-ff9e9d1ac69f\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" Apr 20 19:32:53.600937 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.600916 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r7mmq\" (UniqueName: \"kubernetes.io/projected/8d1b4965-d776-4148-a045-ff9e9d1ac69f-kube-api-access-r7mmq\") pod \"istiod-openshift-gateway-55ff986f96-zdbjz\" (UID: \"8d1b4965-d776-4148-a045-ff9e9d1ac69f\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" Apr 20 19:32:53.601091 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.601070 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/8d1b4965-d776-4148-a045-ff9e9d1ac69f-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-zdbjz\" (UID: 
\"8d1b4965-d776-4148-a045-ff9e9d1ac69f\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" Apr 20 19:32:53.782293 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.782198 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" Apr 20 19:32:53.933040 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:53.932950 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz"] Apr 20 19:32:53.936280 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:32:53.936250 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d1b4965_d776_4148_a045_ff9e9d1ac69f.slice/crio-979ca3ca9453d377aba0d956acffa79e6d655e72176a9b0e223102bfc59777a5 WatchSource:0}: Error finding container 979ca3ca9453d377aba0d956acffa79e6d655e72176a9b0e223102bfc59777a5: Status 404 returned error can't find the container with id 979ca3ca9453d377aba0d956acffa79e6d655e72176a9b0e223102bfc59777a5 Apr 20 19:32:54.252717 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:54.252655 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" event={"ID":"8d1b4965-d776-4148-a045-ff9e9d1ac69f","Type":"ContainerStarted","Data":"979ca3ca9453d377aba0d956acffa79e6d655e72176a9b0e223102bfc59777a5"} Apr 20 19:32:55.164247 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:55.164218 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-rkv9n" Apr 20 19:32:56.129699 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:56.129650 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 20 19:32:56.129832 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:56.129744 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 20 19:32:56.262417 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:56.262373 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" event={"ID":"8d1b4965-d776-4148-a045-ff9e9d1ac69f","Type":"ContainerStarted","Data":"3cf2aabe35c55b7fbe76b5641920158faf355c8f857f044a1ec4ffc04b59b641"} Apr 20 19:32:56.263188 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:56.263159 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" Apr 20 19:32:56.284619 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:56.284554 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" podStartSLOduration=1.093959102 podStartE2EDuration="3.284540255s" podCreationTimestamp="2026-04-20 19:32:53 +0000 UTC" firstStartedPulling="2026-04-20 19:32:53.938820518 +0000 UTC m=+456.630829435" lastFinishedPulling="2026-04-20 19:32:56.129401656 +0000 UTC m=+458.821410588" observedRunningTime="2026-04-20 19:32:56.283319481 +0000 UTC m=+458.975328431" watchObservedRunningTime="2026-04-20 19:32:56.284540255 +0000 UTC m=+458.976549193" Apr 20 19:32:57.267636 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:57.267603 2570 patch_prober.go:28] 
interesting pod/istiod-openshift-gateway-55ff986f96-zdbjz container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 20 19:32:57.268009 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:57.267658 2570 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" podUID="8d1b4965-d776-4148-a045-ff9e9d1ac69f" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 19:32:58.270154 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:32:58.270122 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zdbjz" Apr 20 19:33:54.974745 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:33:54.974710 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5wqll"] Apr 20 19:33:54.982865 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:33:54.982840 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5wqll" Apr 20 19:33:54.985662 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:33:54.985630 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-5fq6h\"" Apr 20 19:33:54.985822 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:33:54.985719 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 19:33:54.985899 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:33:54.985874 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 19:33:54.988222 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:33:54.988202 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5wqll"] Apr 20 19:33:54.992347 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:33:54.992321 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxql2\" (UniqueName: \"kubernetes.io/projected/8d939ab9-0297-4281-8007-c6811b44dba2-kube-api-access-kxql2\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5wqll\" (UID: \"8d939ab9-0297-4281-8007-c6811b44dba2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5wqll" Apr 20 19:33:54.992444 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:33:54.992422 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8d939ab9-0297-4281-8007-c6811b44dba2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5wqll\" (UID: \"8d939ab9-0297-4281-8007-c6811b44dba2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5wqll" Apr 20 19:33:55.093513 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:33:55.093474 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8d939ab9-0297-4281-8007-c6811b44dba2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5wqll\" (UID: \"8d939ab9-0297-4281-8007-c6811b44dba2\") " 
pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5wqll" Apr 20 19:33:55.093732 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:33:55.093534 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kxql2\" (UniqueName: \"kubernetes.io/projected/8d939ab9-0297-4281-8007-c6811b44dba2-kube-api-access-kxql2\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5wqll\" (UID: \"8d939ab9-0297-4281-8007-c6811b44dba2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5wqll" Apr 20 19:33:55.093904 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:33:55.093878 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8d939ab9-0297-4281-8007-c6811b44dba2-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5wqll\" (UID: \"8d939ab9-0297-4281-8007-c6811b44dba2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5wqll" Apr 20 19:33:55.101517 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:33:55.101486 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kxql2\" (UniqueName: \"kubernetes.io/projected/8d939ab9-0297-4281-8007-c6811b44dba2-kube-api-access-kxql2\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-5wqll\" (UID: \"8d939ab9-0297-4281-8007-c6811b44dba2\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5wqll" Apr 20 19:33:55.294241 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:33:55.294156 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5wqll" Apr 20 19:33:55.420412 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:33:55.420377 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5wqll"] Apr 20 19:33:55.423795 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:33:55.423757 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d939ab9_0297_4281_8007_c6811b44dba2.slice/crio-4586cf5cfd85c04332106ffd06c7b216977576d7b8622e455e4ba3b3769182dd WatchSource:0}: Error finding container 4586cf5cfd85c04332106ffd06c7b216977576d7b8622e455e4ba3b3769182dd: Status 404 returned error can't find the container with id 4586cf5cfd85c04332106ffd06c7b216977576d7b8622e455e4ba3b3769182dd Apr 20 19:33:55.443984 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:33:55.443949 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5wqll" event={"ID":"8d939ab9-0297-4281-8007-c6811b44dba2","Type":"ContainerStarted","Data":"4586cf5cfd85c04332106ffd06c7b216977576d7b8622e455e4ba3b3769182dd"} Apr 20 19:34:00.463534 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:00.463499 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5wqll" event={"ID":"8d939ab9-0297-4281-8007-c6811b44dba2","Type":"ContainerStarted","Data":"7b0778b181f2eb331255128004863f4425aae1e4d9f7c3dcb725a06b72af623f"} Apr 20 19:34:00.463955 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:00.463596 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5wqll" Apr 20 19:34:00.484160 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:00.484102 
2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5wqll" podStartSLOduration=1.529668283 podStartE2EDuration="6.484084126s" podCreationTimestamp="2026-04-20 19:33:54 +0000 UTC" firstStartedPulling="2026-04-20 19:33:55.426250507 +0000 UTC m=+518.118259437" lastFinishedPulling="2026-04-20 19:34:00.380666363 +0000 UTC m=+523.072675280" observedRunningTime="2026-04-20 19:34:00.481921991 +0000 UTC m=+523.173930956" watchObservedRunningTime="2026-04-20 19:34:00.484084126 +0000 UTC m=+523.176093067" Apr 20 19:34:11.469530 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:11.469498 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5wqll" Apr 20 19:34:12.413455 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:12.413416 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t"] Apr 20 19:34:12.416816 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:12.416792 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t" Apr 20 19:34:12.432543 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:12.432519 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t"] Apr 20 19:34:12.516861 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:12.516823 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xj2lr\" (UniqueName: \"kubernetes.io/projected/caf83930-da53-4f66-a458-a6f544ec7096-kube-api-access-xj2lr\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t\" (UID: \"caf83930-da53-4f66-a458-a6f544ec7096\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t" Apr 20 19:34:12.517232 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:12.516871 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/caf83930-da53-4f66-a458-a6f544ec7096-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t\" (UID: \"caf83930-da53-4f66-a458-a6f544ec7096\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t" Apr 20 19:34:12.617412 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:12.617374 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xj2lr\" (UniqueName: \"kubernetes.io/projected/caf83930-da53-4f66-a458-a6f544ec7096-kube-api-access-xj2lr\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t\" (UID: \"caf83930-da53-4f66-a458-a6f544ec7096\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t" Apr 20 19:34:12.617554 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:12.617433 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/caf83930-da53-4f66-a458-a6f544ec7096-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t\" (UID: \"caf83930-da53-4f66-a458-a6f544ec7096\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t" Apr 20 19:34:12.617895 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:12.617876 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/caf83930-da53-4f66-a458-a6f544ec7096-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t\" (UID: \"caf83930-da53-4f66-a458-a6f544ec7096\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t" Apr 20 19:34:12.627117 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:12.627095 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xj2lr\" (UniqueName: \"kubernetes.io/projected/caf83930-da53-4f66-a458-a6f544ec7096-kube-api-access-xj2lr\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t\" (UID: \"caf83930-da53-4f66-a458-a6f544ec7096\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t" Apr 20 19:34:12.726786 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:12.726749 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t" Apr 20 19:34:12.849934 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:12.849900 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t"] Apr 20 19:34:12.853712 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:34:12.853683 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcaf83930_da53_4f66_a458_a6f544ec7096.slice/crio-c52d2bb67afc315befa32cbf2e030d563893952db1db021121c6242b2325a623 WatchSource:0}: Error finding container c52d2bb67afc315befa32cbf2e030d563893952db1db021121c6242b2325a623: Status 404 returned error can't find the container with id c52d2bb67afc315befa32cbf2e030d563893952db1db021121c6242b2325a623 Apr 20 19:34:13.109843 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.109755 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5wqll"] Apr 20 19:34:13.110320 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.110281 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5wqll" podUID="8d939ab9-0297-4281-8007-c6811b44dba2" containerName="manager" containerID="cri-o://7b0778b181f2eb331255128004863f4425aae1e4d9f7c3dcb725a06b72af623f" gracePeriod=2 Apr 20 19:34:13.118497 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.118469 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5wqll"] Apr 20 19:34:13.129737 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.129713 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t"] Apr 20 19:34:13.136157 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.136133 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-bnmmx"] Apr 20 19:34:13.136459 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.136447 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d939ab9-0297-4281-8007-c6811b44dba2" containerName="manager" Apr 20 19:34:13.136506 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.136462 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d939ab9-0297-4281-8007-c6811b44dba2" containerName="manager" 
Apr 20 19:34:13.136540 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.136512 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d939ab9-0297-4281-8007-c6811b44dba2" containerName="manager"
Apr 20 19:34:13.139648 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.139628 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-bnmmx"
Apr 20 19:34:13.144137 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.144106 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t"]
Apr 20 19:34:13.158980 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.158952 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-bnmmx"]
Apr 20 19:34:13.161623 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.161602 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t"]
Apr 20 19:34:13.161916 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.161902 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="caf83930-da53-4f66-a458-a6f544ec7096" containerName="manager"
Apr 20 19:34:13.161969 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.161918 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="caf83930-da53-4f66-a458-a6f544ec7096" containerName="manager"
Apr 20 19:34:13.162011 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.161971 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="caf83930-da53-4f66-a458-a6f544ec7096" containerName="manager"
Apr 20 19:34:13.165067 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.165048 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t"
Apr 20 19:34:13.175536 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.175511 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t"]
Apr 20 19:34:13.184577 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.181621 2570 status_manager.go:895] "Failed to get status for pod" podUID="8d939ab9-0297-4281-8007-c6811b44dba2" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5wqll" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-5wqll\" is forbidden: User \"system:node:ip-10-0-132-159.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-159.ec2.internal' and this object"
Apr 20 19:34:13.205102 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.205060 2570 status_manager.go:895] "Failed to get status for pod" podUID="8d939ab9-0297-4281-8007-c6811b44dba2" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5wqll" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-5wqll\" is forbidden: User \"system:node:ip-10-0-132-159.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-159.ec2.internal' and this object"
Apr 20 19:34:13.222690 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.222658 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hpq7\" (UniqueName: \"kubernetes.io/projected/1240c0be-5889-49ea-a1a8-a98e6f6ceaea-kube-api-access-7hpq7\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t\" (UID: \"1240c0be-5889-49ea-a1a8-a98e6f6ceaea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t"
Apr 20 19:34:13.222848 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.222727 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1658cae7-6dd2-4555-8304-e042094d37c1-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-bnmmx\" (UID: \"1658cae7-6dd2-4555-8304-e042094d37c1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-bnmmx"
Apr 20 19:34:13.222848 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.222760 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbpgm\" (UniqueName: \"kubernetes.io/projected/1658cae7-6dd2-4555-8304-e042094d37c1-kube-api-access-mbpgm\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-bnmmx\" (UID: \"1658cae7-6dd2-4555-8304-e042094d37c1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-bnmmx"
Apr 20 19:34:13.222848 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.222784 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1240c0be-5889-49ea-a1a8-a98e6f6ceaea-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t\" (UID: \"1240c0be-5889-49ea-a1a8-a98e6f6ceaea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t"
Apr 20 19:34:13.323371 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.323336 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1658cae7-6dd2-4555-8304-e042094d37c1-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-bnmmx\" (UID: \"1658cae7-6dd2-4555-8304-e042094d37c1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-bnmmx"
Apr 20 19:34:13.323572 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.323383 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbpgm\" (UniqueName: \"kubernetes.io/projected/1658cae7-6dd2-4555-8304-e042094d37c1-kube-api-access-mbpgm\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-bnmmx\" (UID: \"1658cae7-6dd2-4555-8304-e042094d37c1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-bnmmx"
Apr 20 19:34:13.323572 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.323423 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1240c0be-5889-49ea-a1a8-a98e6f6ceaea-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t\" (UID: \"1240c0be-5889-49ea-a1a8-a98e6f6ceaea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t"
Apr 20 19:34:13.323572 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.323470 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hpq7\" (UniqueName: \"kubernetes.io/projected/1240c0be-5889-49ea-a1a8-a98e6f6ceaea-kube-api-access-7hpq7\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t\" (UID: \"1240c0be-5889-49ea-a1a8-a98e6f6ceaea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t"
Apr 20 19:34:13.323845 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.323818 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1658cae7-6dd2-4555-8304-e042094d37c1-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-bnmmx\" (UID: \"1658cae7-6dd2-4555-8304-e042094d37c1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-bnmmx"
Apr 20 19:34:13.323920 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.323896 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1240c0be-5889-49ea-a1a8-a98e6f6ceaea-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t\" (UID: \"1240c0be-5889-49ea-a1a8-a98e6f6ceaea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t"
Apr 20 19:34:13.329065 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.329045 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5wqll"
Apr 20 19:34:13.331235 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.331209 2570 status_manager.go:895] "Failed to get status for pod" podUID="8d939ab9-0297-4281-8007-c6811b44dba2" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5wqll" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-5wqll\" is forbidden: User \"system:node:ip-10-0-132-159.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-159.ec2.internal' and this object"
Apr 20 19:34:13.332595 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.332547 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hpq7\" (UniqueName: \"kubernetes.io/projected/1240c0be-5889-49ea-a1a8-a98e6f6ceaea-kube-api-access-7hpq7\") pod \"kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t\" (UID: \"1240c0be-5889-49ea-a1a8-a98e6f6ceaea\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t"
Apr 20 19:34:13.332681 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.332601 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbpgm\" (UniqueName: \"kubernetes.io/projected/1658cae7-6dd2-4555-8304-e042094d37c1-kube-api-access-mbpgm\") pod \"kuadrant-operator-controller-manager-5f895dd7d5-bnmmx\" (UID: \"1658cae7-6dd2-4555-8304-e042094d37c1\") " pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-bnmmx"
Apr 20 19:34:13.424579 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.424479 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8d939ab9-0297-4281-8007-c6811b44dba2-extensions-socket-volume\") pod \"8d939ab9-0297-4281-8007-c6811b44dba2\" (UID: \"8d939ab9-0297-4281-8007-c6811b44dba2\") "
Apr 20 19:34:13.424579 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.424539 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kxql2\" (UniqueName: \"kubernetes.io/projected/8d939ab9-0297-4281-8007-c6811b44dba2-kube-api-access-kxql2\") pod \"8d939ab9-0297-4281-8007-c6811b44dba2\" (UID: \"8d939ab9-0297-4281-8007-c6811b44dba2\") "
Apr 20 19:34:13.424965 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.424940 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d939ab9-0297-4281-8007-c6811b44dba2-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "8d939ab9-0297-4281-8007-c6811b44dba2" (UID: "8d939ab9-0297-4281-8007-c6811b44dba2"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:34:13.426776 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.426756 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d939ab9-0297-4281-8007-c6811b44dba2-kube-api-access-kxql2" (OuterVolumeSpecName: "kube-api-access-kxql2") pod "8d939ab9-0297-4281-8007-c6811b44dba2" (UID: "8d939ab9-0297-4281-8007-c6811b44dba2"). InnerVolumeSpecName "kube-api-access-kxql2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:34:13.489004 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.488967 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-bnmmx"
Apr 20 19:34:13.497903 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.497874 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t"
Apr 20 19:34:13.506703 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:34:13.506684 2570 kuberuntime_manager.go:623] "Missing actuated resource record" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t" container="manager"
Apr 20 19:34:13.507678 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.507652 2570 generic.go:358] "Generic (PLEG): container finished" podID="8d939ab9-0297-4281-8007-c6811b44dba2" containerID="7b0778b181f2eb331255128004863f4425aae1e4d9f7c3dcb725a06b72af623f" exitCode=0
Apr 20 19:34:13.507774 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.507699 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5wqll"
Apr 20 19:34:13.507774 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.507708 2570 scope.go:117] "RemoveContainer" containerID="7b0778b181f2eb331255128004863f4425aae1e4d9f7c3dcb725a06b72af623f"
Apr 20 19:34:13.508958 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.508924 2570 status_manager.go:895] "Failed to get status for pod" podUID="caf83930-da53-4f66-a458-a6f544ec7096" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t\" is forbidden: User \"system:node:ip-10-0-132-159.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-159.ec2.internal' and this object"
Apr 20 19:34:13.510942 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.510911 2570 status_manager.go:895] "Failed to get status for pod" podUID="8d939ab9-0297-4281-8007-c6811b44dba2" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5wqll" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-5wqll\" is forbidden: User \"system:node:ip-10-0-132-159.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-159.ec2.internal' and this object"
Apr 20 19:34:13.512957 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.512927 2570 status_manager.go:895] "Failed to get status for pod" podUID="caf83930-da53-4f66-a458-a6f544ec7096" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t\" is forbidden: User \"system:node:ip-10-0-132-159.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-159.ec2.internal' and this object"
Apr 20 19:34:13.514918 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.514887 2570 status_manager.go:895] "Failed to get status for pod" podUID="8d939ab9-0297-4281-8007-c6811b44dba2" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5wqll" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-5wqll\" is forbidden: User \"system:node:ip-10-0-132-159.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-159.ec2.internal' and this object"
Apr 20 19:34:13.520726 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.520703 2570 scope.go:117] "RemoveContainer" containerID="7b0778b181f2eb331255128004863f4425aae1e4d9f7c3dcb725a06b72af623f"
Apr 20 19:34:13.521161 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:34:13.521135 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b0778b181f2eb331255128004863f4425aae1e4d9f7c3dcb725a06b72af623f\": container with ID starting with 7b0778b181f2eb331255128004863f4425aae1e4d9f7c3dcb725a06b72af623f not found: ID does not exist" containerID="7b0778b181f2eb331255128004863f4425aae1e4d9f7c3dcb725a06b72af623f"
Apr 20 19:34:13.521219 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.521173 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b0778b181f2eb331255128004863f4425aae1e4d9f7c3dcb725a06b72af623f"} err="failed to get container status \"7b0778b181f2eb331255128004863f4425aae1e4d9f7c3dcb725a06b72af623f\": rpc error: code = NotFound desc = could not find container \"7b0778b181f2eb331255128004863f4425aae1e4d9f7c3dcb725a06b72af623f\": container with ID starting with 7b0778b181f2eb331255128004863f4425aae1e4d9f7c3dcb725a06b72af623f not found: ID does not exist"
Apr 20 19:34:13.523327 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.523293 2570 status_manager.go:895] "Failed to get status for pod" podUID="8d939ab9-0297-4281-8007-c6811b44dba2" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-5wqll" err="pods \"kuadrant-operator-controller-manager-5f895dd7d5-5wqll\" is forbidden: User \"system:node:ip-10-0-132-159.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-159.ec2.internal' and this object"
Apr 20 19:34:13.525193 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.525170 2570 status_manager.go:895] "Failed to get status for pod" podUID="caf83930-da53-4f66-a458-a6f544ec7096" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t\" is forbidden: User \"system:node:ip-10-0-132-159.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-159.ec2.internal' and this object"
Apr 20 19:34:13.525470 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.525449 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kxql2\" (UniqueName: \"kubernetes.io/projected/8d939ab9-0297-4281-8007-c6811b44dba2-kube-api-access-kxql2\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\""
Apr 20 19:34:13.525523 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.525480 2570 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/8d939ab9-0297-4281-8007-c6811b44dba2-extensions-socket-volume\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\""
Apr 20 19:34:13.625948 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.625847 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-bnmmx"]
Apr 20 19:34:13.629104 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:34:13.629073 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1658cae7_6dd2_4555_8304_e042094d37c1.slice/crio-4da7318ccd165b9387ffcb0c3206db2da190d49d7ebcb21d96a0684ab6d47181 WatchSource:0}: Error finding container 4da7318ccd165b9387ffcb0c3206db2da190d49d7ebcb21d96a0684ab6d47181: Status 404 returned error can't find the container with id 4da7318ccd165b9387ffcb0c3206db2da190d49d7ebcb21d96a0684ab6d47181
Apr 20 19:34:13.650294 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.650270 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t"]
Apr 20 19:34:13.652764 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:34:13.652739 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1240c0be_5889_49ea_a1a8_a98e6f6ceaea.slice/crio-ad8faa95468ec6e811142f71890d0b3552fc5f01901110b8bf3e86280108ba4e WatchSource:0}: Error finding container ad8faa95468ec6e811142f71890d0b3552fc5f01901110b8bf3e86280108ba4e: Status 404 returned error can't find the container with id ad8faa95468ec6e811142f71890d0b3552fc5f01901110b8bf3e86280108ba4e
Apr 20 19:34:13.963651 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:13.963620 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d939ab9-0297-4281-8007-c6811b44dba2" path="/var/lib/kubelet/pods/8d939ab9-0297-4281-8007-c6811b44dba2/volumes"
Apr 20 19:34:14.512297 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:14.512261 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t" podUID="caf83930-da53-4f66-a458-a6f544ec7096" containerName="manager" containerID="cri-o://fb57ad8a572b621f5efe18a65614b7930433f894e598d56f0248c8eab42ef0d2" gracePeriod=2
Apr 20 19:34:14.513672 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:14.513644 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t" event={"ID":"1240c0be-5889-49ea-a1a8-a98e6f6ceaea","Type":"ContainerStarted","Data":"51c254ffa683ac21945ea3f1ebaeecae06f3b3839444e6417e93f176e11993b2"}
Apr 20 19:34:14.513786 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:14.513676 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t" event={"ID":"1240c0be-5889-49ea-a1a8-a98e6f6ceaea","Type":"ContainerStarted","Data":"ad8faa95468ec6e811142f71890d0b3552fc5f01901110b8bf3e86280108ba4e"}
Apr 20 19:34:14.513786 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:14.513706 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t"
Apr 20 19:34:14.515887 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:14.515864 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-bnmmx" event={"ID":"1658cae7-6dd2-4555-8304-e042094d37c1","Type":"ContainerStarted","Data":"4266e412da70ea78f8535ed2e7766b420cc2b35d4242e9f1e15fd0c924e4c04a"}
Apr 20 19:34:14.515887 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:14.515889 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-bnmmx" event={"ID":"1658cae7-6dd2-4555-8304-e042094d37c1","Type":"ContainerStarted","Data":"4da7318ccd165b9387ffcb0c3206db2da190d49d7ebcb21d96a0684ab6d47181"}
Apr 20 19:34:14.516033 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:14.515995 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-bnmmx"
Apr 20 19:34:14.542349 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:14.542300 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t" podStartSLOduration=1.542285293 podStartE2EDuration="1.542285293s" podCreationTimestamp="2026-04-20 19:34:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:34:14.541021997 +0000 UTC m=+537.233030948" watchObservedRunningTime="2026-04-20 19:34:14.542285293 +0000 UTC m=+537.234294228"
Apr 20 19:34:14.574439 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:14.574389 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-bnmmx" podStartSLOduration=1.57437286 podStartE2EDuration="1.57437286s" podCreationTimestamp="2026-04-20 19:34:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:34:14.573268879 +0000 UTC m=+537.265277818" watchObservedRunningTime="2026-04-20 19:34:14.57437286 +0000 UTC m=+537.266381799"
Apr 20 19:34:14.745347 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:14.745324 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t"
Apr 20 19:34:14.748009 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:14.747986 2570 status_manager.go:895] "Failed to get status for pod" podUID="caf83930-da53-4f66-a458-a6f544ec7096" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t\" is forbidden: User \"system:node:ip-10-0-132-159.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-159.ec2.internal' and this object"
Apr 20 19:34:14.838317 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:14.838236 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xj2lr\" (UniqueName: \"kubernetes.io/projected/caf83930-da53-4f66-a458-a6f544ec7096-kube-api-access-xj2lr\") pod \"caf83930-da53-4f66-a458-a6f544ec7096\" (UID: \"caf83930-da53-4f66-a458-a6f544ec7096\") "
Apr 20 19:34:14.838317 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:14.838309 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/caf83930-da53-4f66-a458-a6f544ec7096-extensions-socket-volume\") pod \"caf83930-da53-4f66-a458-a6f544ec7096\" (UID: \"caf83930-da53-4f66-a458-a6f544ec7096\") "
Apr 20 19:34:14.838599 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:14.838553 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caf83930-da53-4f66-a458-a6f544ec7096-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "caf83930-da53-4f66-a458-a6f544ec7096" (UID: "caf83930-da53-4f66-a458-a6f544ec7096"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:34:14.840422 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:14.840389 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caf83930-da53-4f66-a458-a6f544ec7096-kube-api-access-xj2lr" (OuterVolumeSpecName: "kube-api-access-xj2lr") pod "caf83930-da53-4f66-a458-a6f544ec7096" (UID: "caf83930-da53-4f66-a458-a6f544ec7096"). InnerVolumeSpecName "kube-api-access-xj2lr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:34:14.939684 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:14.939647 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xj2lr\" (UniqueName: \"kubernetes.io/projected/caf83930-da53-4f66-a458-a6f544ec7096-kube-api-access-xj2lr\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\""
Apr 20 19:34:14.939684 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:14.939677 2570 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/caf83930-da53-4f66-a458-a6f544ec7096-extensions-socket-volume\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\""
Apr 20 19:34:15.520257 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:15.520222 2570 generic.go:358] "Generic (PLEG): container finished" podID="caf83930-da53-4f66-a458-a6f544ec7096" containerID="fb57ad8a572b621f5efe18a65614b7930433f894e598d56f0248c8eab42ef0d2" exitCode=2
Apr 20 19:34:15.520432 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:15.520268 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t"
Apr 20 19:34:15.520432 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:15.520319 2570 scope.go:117] "RemoveContainer" containerID="fb57ad8a572b621f5efe18a65614b7930433f894e598d56f0248c8eab42ef0d2"
Apr 20 19:34:15.522769 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:15.522733 2570 status_manager.go:895] "Failed to get status for pod" podUID="caf83930-da53-4f66-a458-a6f544ec7096" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t\" is forbidden: User \"system:node:ip-10-0-132-159.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-159.ec2.internal' and this object"
Apr 20 19:34:15.528686 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:15.528662 2570 scope.go:117] "RemoveContainer" containerID="fb57ad8a572b621f5efe18a65614b7930433f894e598d56f0248c8eab42ef0d2"
Apr 20 19:34:15.528938 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:34:15.528921 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb57ad8a572b621f5efe18a65614b7930433f894e598d56f0248c8eab42ef0d2\": container with ID starting with fb57ad8a572b621f5efe18a65614b7930433f894e598d56f0248c8eab42ef0d2 not found: ID does not exist" containerID="fb57ad8a572b621f5efe18a65614b7930433f894e598d56f0248c8eab42ef0d2"
Apr 20 19:34:15.528999 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:15.528947 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb57ad8a572b621f5efe18a65614b7930433f894e598d56f0248c8eab42ef0d2"} err="failed to get container status \"fb57ad8a572b621f5efe18a65614b7930433f894e598d56f0248c8eab42ef0d2\": rpc error: code = NotFound desc = could not find container \"fb57ad8a572b621f5efe18a65614b7930433f894e598d56f0248c8eab42ef0d2\": container with ID starting with fb57ad8a572b621f5efe18a65614b7930433f894e598d56f0248c8eab42ef0d2 not found: ID does not exist"
Apr 20 19:34:15.530775 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:15.530754 2570 status_manager.go:895] "Failed to get status for pod" podUID="caf83930-da53-4f66-a458-a6f544ec7096" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t" err="pods \"kuadrant-operator-controller-manager-6bc9f4c76f-lfc9t\" is forbidden: User \"system:node:ip-10-0-132-159.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-132-159.ec2.internal' and this object"
Apr 20 19:34:15.963104 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:15.963075 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caf83930-da53-4f66-a458-a6f544ec7096" path="/var/lib/kubelet/pods/caf83930-da53-4f66-a458-a6f544ec7096/volumes"
Apr 20 19:34:25.522507 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:25.522473 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-bnmmx"
Apr 20 19:34:25.522899 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:25.522532 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t"
Apr 20 19:34:25.580522 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:25.580486 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-bnmmx"]
Apr 20 19:34:25.580790 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:25.580764 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-bnmmx" podUID="1658cae7-6dd2-4555-8304-e042094d37c1" containerName="manager" containerID="cri-o://4266e412da70ea78f8535ed2e7766b420cc2b35d4242e9f1e15fd0c924e4c04a" gracePeriod=10
Apr 20 19:34:25.826799 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:25.826776 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-bnmmx"
Apr 20 19:34:25.840844 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:25.840811 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-c6zjz"]
Apr 20 19:34:25.841237 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:25.841219 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1658cae7-6dd2-4555-8304-e042094d37c1" containerName="manager"
Apr 20 19:34:25.841237 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:25.841239 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="1658cae7-6dd2-4555-8304-e042094d37c1" containerName="manager"
Apr 20 19:34:25.841385 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:25.841311 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="1658cae7-6dd2-4555-8304-e042094d37c1" containerName="manager"
Apr 20 19:34:25.844252 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:25.844234 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-c6zjz"
Apr 20 19:34:25.856709 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:25.856685 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-c6zjz"]
Apr 20 19:34:25.931107 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:25.931079 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbpgm\" (UniqueName: \"kubernetes.io/projected/1658cae7-6dd2-4555-8304-e042094d37c1-kube-api-access-mbpgm\") pod \"1658cae7-6dd2-4555-8304-e042094d37c1\" (UID: \"1658cae7-6dd2-4555-8304-e042094d37c1\") "
Apr 20 19:34:25.931275 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:25.931126 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1658cae7-6dd2-4555-8304-e042094d37c1-extensions-socket-volume\") pod \"1658cae7-6dd2-4555-8304-e042094d37c1\" (UID: \"1658cae7-6dd2-4555-8304-e042094d37c1\") "
Apr 20 19:34:25.931318 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:25.931279 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/270927b8-f48c-4d95-aaca-2f6dbf1a7a7a-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-c6zjz\" (UID: \"270927b8-f48c-4d95-aaca-2f6dbf1a7a7a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-c6zjz"
Apr 20 19:34:25.931362 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:25.931327 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hhs2q\" (UniqueName: \"kubernetes.io/projected/270927b8-f48c-4d95-aaca-2f6dbf1a7a7a-kube-api-access-hhs2q\") pod \"kuadrant-operator-controller-manager-55c7f4c975-c6zjz\" (UID: \"270927b8-f48c-4d95-aaca-2f6dbf1a7a7a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-c6zjz"
Apr 20 19:34:25.931572 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:25.931539 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1658cae7-6dd2-4555-8304-e042094d37c1-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "1658cae7-6dd2-4555-8304-e042094d37c1" (UID: "1658cae7-6dd2-4555-8304-e042094d37c1"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:34:25.933136 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:25.933114 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1658cae7-6dd2-4555-8304-e042094d37c1-kube-api-access-mbpgm" (OuterVolumeSpecName: "kube-api-access-mbpgm") pod "1658cae7-6dd2-4555-8304-e042094d37c1" (UID: "1658cae7-6dd2-4555-8304-e042094d37c1"). InnerVolumeSpecName "kube-api-access-mbpgm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:34:26.032248 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:26.032213 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hhs2q\" (UniqueName: \"kubernetes.io/projected/270927b8-f48c-4d95-aaca-2f6dbf1a7a7a-kube-api-access-hhs2q\") pod \"kuadrant-operator-controller-manager-55c7f4c975-c6zjz\" (UID: \"270927b8-f48c-4d95-aaca-2f6dbf1a7a7a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-c6zjz"
Apr 20 19:34:26.032411 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:26.032315 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/270927b8-f48c-4d95-aaca-2f6dbf1a7a7a-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-c6zjz\" (UID: \"270927b8-f48c-4d95-aaca-2f6dbf1a7a7a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-c6zjz"
Apr 20 19:34:26.032411 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:26.032363 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mbpgm\" (UniqueName: \"kubernetes.io/projected/1658cae7-6dd2-4555-8304-e042094d37c1-kube-api-access-mbpgm\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\""
Apr 20 19:34:26.032411 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:26.032377 2570 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1658cae7-6dd2-4555-8304-e042094d37c1-extensions-socket-volume\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\""
Apr 20 19:34:26.032789 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:26.032765 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/270927b8-f48c-4d95-aaca-2f6dbf1a7a7a-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-55c7f4c975-c6zjz\" (UID: \"270927b8-f48c-4d95-aaca-2f6dbf1a7a7a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-c6zjz"
Apr 20 19:34:26.041597 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:26.041549 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hhs2q\" (UniqueName: \"kubernetes.io/projected/270927b8-f48c-4d95-aaca-2f6dbf1a7a7a-kube-api-access-hhs2q\") pod \"kuadrant-operator-controller-manager-55c7f4c975-c6zjz\" (UID: \"270927b8-f48c-4d95-aaca-2f6dbf1a7a7a\") " pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-c6zjz"
Apr 20 19:34:26.153351 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:26.153257 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-c6zjz"
Apr 20 19:34:26.278170 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:26.278147 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-c6zjz"]
Apr 20 19:34:26.280121 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:34:26.280094 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod270927b8_f48c_4d95_aaca_2f6dbf1a7a7a.slice/crio-d85bd2e8115e58ad60b5095489cbc18e746fbfb087145dc61ed9a663884a5e85 WatchSource:0}: Error finding container d85bd2e8115e58ad60b5095489cbc18e746fbfb087145dc61ed9a663884a5e85: Status 404 returned error can't find the container with id d85bd2e8115e58ad60b5095489cbc18e746fbfb087145dc61ed9a663884a5e85
Apr 20 19:34:26.562672 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:26.562630 2570 generic.go:358] "Generic (PLEG): container finished" podID="1658cae7-6dd2-4555-8304-e042094d37c1" containerID="4266e412da70ea78f8535ed2e7766b420cc2b35d4242e9f1e15fd0c924e4c04a" exitCode=0
Apr 20 19:34:26.563150 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:26.562709 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-bnmmx" event={"ID":"1658cae7-6dd2-4555-8304-e042094d37c1","Type":"ContainerDied","Data":"4266e412da70ea78f8535ed2e7766b420cc2b35d4242e9f1e15fd0c924e4c04a"}
Apr 20 19:34:26.563150 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:26.562733 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-bnmmx"
Apr 20 19:34:26.563150 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:26.562759 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-bnmmx" event={"ID":"1658cae7-6dd2-4555-8304-e042094d37c1","Type":"ContainerDied","Data":"4da7318ccd165b9387ffcb0c3206db2da190d49d7ebcb21d96a0684ab6d47181"}
Apr 20 19:34:26.563150 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:26.562782 2570 scope.go:117] "RemoveContainer" containerID="4266e412da70ea78f8535ed2e7766b420cc2b35d4242e9f1e15fd0c924e4c04a"
Apr 20 19:34:26.564367 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:26.564340 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-c6zjz" event={"ID":"270927b8-f48c-4d95-aaca-2f6dbf1a7a7a","Type":"ContainerStarted","Data":"32c4f18ea28f632d76aed84b0e5c5a63b14ee1ca8831d11d35817dee06a7f2cc"}
Apr 20 19:34:26.564453 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:26.564373 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-c6zjz" event={"ID":"270927b8-f48c-4d95-aaca-2f6dbf1a7a7a","Type":"ContainerStarted","Data":"d85bd2e8115e58ad60b5095489cbc18e746fbfb087145dc61ed9a663884a5e85"}
Apr 20 19:34:26.564513 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:26.564487 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-c6zjz"
Apr 20 19:34:26.570622 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:26.570605 2570 scope.go:117] "RemoveContainer" containerID="4266e412da70ea78f8535ed2e7766b420cc2b35d4242e9f1e15fd0c924e4c04a"
Apr 20 19:34:26.570865 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:34:26.570847 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4266e412da70ea78f8535ed2e7766b420cc2b35d4242e9f1e15fd0c924e4c04a\": container with ID starting with 4266e412da70ea78f8535ed2e7766b420cc2b35d4242e9f1e15fd0c924e4c04a not found: ID does not exist" containerID="4266e412da70ea78f8535ed2e7766b420cc2b35d4242e9f1e15fd0c924e4c04a"
Apr 20 19:34:26.570914 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:26.570872 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4266e412da70ea78f8535ed2e7766b420cc2b35d4242e9f1e15fd0c924e4c04a"} err="failed to get container status \"4266e412da70ea78f8535ed2e7766b420cc2b35d4242e9f1e15fd0c924e4c04a\": rpc error: code = NotFound desc = could not find container \"4266e412da70ea78f8535ed2e7766b420cc2b35d4242e9f1e15fd0c924e4c04a\": container with ID starting with 4266e412da70ea78f8535ed2e7766b420cc2b35d4242e9f1e15fd0c924e4c04a not found: ID does not exist"
Apr 20 19:34:26.599099 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:26.599058 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-bnmmx"]
Apr 20 19:34:26.604274 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:26.604251 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-5f895dd7d5-bnmmx"]
Apr 20 19:34:26.638710 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:26.634551 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-c6zjz" podStartSLOduration=1.6345350920000001 podStartE2EDuration="1.634535092s" podCreationTimestamp="2026-04-20 19:34:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:34:26.632371952 +0000 UTC m=+549.324380891" watchObservedRunningTime="2026-04-20 19:34:26.634535092 +0000 UTC m=+549.326544033"
Apr 20 19:34:27.962636 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:27.962603 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1658cae7-6dd2-4555-8304-e042094d37c1" path="/var/lib/kubelet/pods/1658cae7-6dd2-4555-8304-e042094d37c1/volumes"
Apr 20 19:34:37.571473 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:37.571439 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-55c7f4c975-c6zjz"
Apr 20 19:34:37.622669 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:37.622636 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t"]
Apr 20 19:34:37.622938 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:37.622893 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t" podUID="1240c0be-5889-49ea-a1a8-a98e6f6ceaea" containerName="manager" containerID="cri-o://51c254ffa683ac21945ea3f1ebaeecae06f3b3839444e6417e93f176e11993b2" gracePeriod=10
Apr 20 19:34:37.857973 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:37.857950 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t"
Apr 20 19:34:37.930019 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:37.929965 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1240c0be-5889-49ea-a1a8-a98e6f6ceaea-extensions-socket-volume\") pod \"1240c0be-5889-49ea-a1a8-a98e6f6ceaea\" (UID: \"1240c0be-5889-49ea-a1a8-a98e6f6ceaea\") "
Apr 20 19:34:37.930019 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:37.930024 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7hpq7\" (UniqueName: \"kubernetes.io/projected/1240c0be-5889-49ea-a1a8-a98e6f6ceaea-kube-api-access-7hpq7\") pod \"1240c0be-5889-49ea-a1a8-a98e6f6ceaea\" (UID: \"1240c0be-5889-49ea-a1a8-a98e6f6ceaea\") "
Apr 20 19:34:37.930398 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:37.930370 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1240c0be-5889-49ea-a1a8-a98e6f6ceaea-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "1240c0be-5889-49ea-a1a8-a98e6f6ceaea" (UID: "1240c0be-5889-49ea-a1a8-a98e6f6ceaea"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 19:34:37.932028 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:37.932003 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1240c0be-5889-49ea-a1a8-a98e6f6ceaea-kube-api-access-7hpq7" (OuterVolumeSpecName: "kube-api-access-7hpq7") pod "1240c0be-5889-49ea-a1a8-a98e6f6ceaea" (UID: "1240c0be-5889-49ea-a1a8-a98e6f6ceaea"). InnerVolumeSpecName "kube-api-access-7hpq7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:34:38.031505 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:38.031474 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7hpq7\" (UniqueName: \"kubernetes.io/projected/1240c0be-5889-49ea-a1a8-a98e6f6ceaea-kube-api-access-7hpq7\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\""
Apr 20 19:34:38.031505 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:38.031498 2570 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/1240c0be-5889-49ea-a1a8-a98e6f6ceaea-extensions-socket-volume\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\""
Apr 20 19:34:38.609943 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:38.609909 2570 generic.go:358] "Generic (PLEG): container finished" podID="1240c0be-5889-49ea-a1a8-a98e6f6ceaea" containerID="51c254ffa683ac21945ea3f1ebaeecae06f3b3839444e6417e93f176e11993b2" exitCode=0
Apr 20 19:34:38.610335 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:38.609971 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t"
Apr 20 19:34:38.610335 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:38.610001 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t" event={"ID":"1240c0be-5889-49ea-a1a8-a98e6f6ceaea","Type":"ContainerDied","Data":"51c254ffa683ac21945ea3f1ebaeecae06f3b3839444e6417e93f176e11993b2"}
Apr 20 19:34:38.610335 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:38.610042 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t" event={"ID":"1240c0be-5889-49ea-a1a8-a98e6f6ceaea","Type":"ContainerDied","Data":"ad8faa95468ec6e811142f71890d0b3552fc5f01901110b8bf3e86280108ba4e"}
Apr 20 19:34:38.610335 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:38.610058 2570 scope.go:117] "RemoveContainer" containerID="51c254ffa683ac21945ea3f1ebaeecae06f3b3839444e6417e93f176e11993b2"
Apr 20 19:34:38.617705 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:38.617691 2570 scope.go:117] "RemoveContainer" containerID="51c254ffa683ac21945ea3f1ebaeecae06f3b3839444e6417e93f176e11993b2"
Apr 20 19:34:38.618004 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:34:38.617973 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51c254ffa683ac21945ea3f1ebaeecae06f3b3839444e6417e93f176e11993b2\": container with ID starting with 51c254ffa683ac21945ea3f1ebaeecae06f3b3839444e6417e93f176e11993b2 not found: ID does not exist" containerID="51c254ffa683ac21945ea3f1ebaeecae06f3b3839444e6417e93f176e11993b2"
Apr 20 19:34:38.618109 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:38.618009 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51c254ffa683ac21945ea3f1ebaeecae06f3b3839444e6417e93f176e11993b2"} err="failed to get container status \"51c254ffa683ac21945ea3f1ebaeecae06f3b3839444e6417e93f176e11993b2\": rpc error: code = NotFound desc = could not find container \"51c254ffa683ac21945ea3f1ebaeecae06f3b3839444e6417e93f176e11993b2\": container with ID starting with 51c254ffa683ac21945ea3f1ebaeecae06f3b3839444e6417e93f176e11993b2 not found: ID does not exist"
Apr 20 19:34:38.630856 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:38.630831 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t"]
Apr 20 19:34:38.634479 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:38.634453 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6bc9f4c76f-kdk5t"]
Apr 20 19:34:39.963308 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:39.963260 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1240c0be-5889-49ea-a1a8-a98e6f6ceaea" path="/var/lib/kubelet/pods/1240c0be-5889-49ea-a1a8-a98e6f6ceaea/volumes"
Apr 20 19:34:55.850034 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:55.849997 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-544db"]
Apr 20 19:34:55.850461 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:55.850323 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1240c0be-5889-49ea-a1a8-a98e6f6ceaea" containerName="manager"
Apr 20 19:34:55.850461 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:55.850339 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="1240c0be-5889-49ea-a1a8-a98e6f6ceaea" containerName="manager"
Apr 20 19:34:55.850461 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:55.850414 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="1240c0be-5889-49ea-a1a8-a98e6f6ceaea" containerName="manager"
Apr 20 19:34:55.854684 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:55.854665 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-544db"
Apr 20 19:34:55.857191 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:55.857171 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-s2h9h\""
Apr 20 19:34:55.857310 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:55.857247 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 20 19:34:55.859932 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:55.859912 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-544db"]
Apr 20 19:34:55.947036 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:55.947000 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-544db"]
Apr 20 19:34:55.979442 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:55.979410 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/97a95ae8-41f9-4caa-bb93-2a49b3d346c1-config-file\") pod \"limitador-limitador-7d549b5b-544db\" (UID: \"97a95ae8-41f9-4caa-bb93-2a49b3d346c1\") " pod="kuadrant-system/limitador-limitador-7d549b5b-544db"
Apr 20 19:34:55.979442 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:55.979443 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bmt5\" (UniqueName: \"kubernetes.io/projected/97a95ae8-41f9-4caa-bb93-2a49b3d346c1-kube-api-access-5bmt5\") pod \"limitador-limitador-7d549b5b-544db\" (UID: \"97a95ae8-41f9-4caa-bb93-2a49b3d346c1\") " pod="kuadrant-system/limitador-limitador-7d549b5b-544db"
Apr 20 19:34:56.080839 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:56.080778 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/97a95ae8-41f9-4caa-bb93-2a49b3d346c1-config-file\") pod \"limitador-limitador-7d549b5b-544db\" (UID: \"97a95ae8-41f9-4caa-bb93-2a49b3d346c1\") " pod="kuadrant-system/limitador-limitador-7d549b5b-544db"
Apr 20 19:34:56.081031 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:56.080903 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5bmt5\" (UniqueName: \"kubernetes.io/projected/97a95ae8-41f9-4caa-bb93-2a49b3d346c1-kube-api-access-5bmt5\") pod \"limitador-limitador-7d549b5b-544db\" (UID: \"97a95ae8-41f9-4caa-bb93-2a49b3d346c1\") " pod="kuadrant-system/limitador-limitador-7d549b5b-544db"
Apr 20 19:34:56.081461 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:56.081444 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/97a95ae8-41f9-4caa-bb93-2a49b3d346c1-config-file\") pod \"limitador-limitador-7d549b5b-544db\" (UID: \"97a95ae8-41f9-4caa-bb93-2a49b3d346c1\") " pod="kuadrant-system/limitador-limitador-7d549b5b-544db"
Apr 20 19:34:56.089017 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:56.088994 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bmt5\" (UniqueName: \"kubernetes.io/projected/97a95ae8-41f9-4caa-bb93-2a49b3d346c1-kube-api-access-5bmt5\") pod \"limitador-limitador-7d549b5b-544db\" (UID: \"97a95ae8-41f9-4caa-bb93-2a49b3d346c1\") " pod="kuadrant-system/limitador-limitador-7d549b5b-544db"
Apr 20 19:34:56.164940 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:56.164854 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-544db"
Apr 20 19:34:56.305127 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:56.305104 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-544db"]
Apr 20 19:34:56.307632 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:34:56.307597 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97a95ae8_41f9_4caa_bb93_2a49b3d346c1.slice/crio-6a53cfe68aabec1573240fcd0eb6b77437d9cf4e2b247b90ed7db197473141c8 WatchSource:0}: Error finding container 6a53cfe68aabec1573240fcd0eb6b77437d9cf4e2b247b90ed7db197473141c8: Status 404 returned error can't find the container with id 6a53cfe68aabec1573240fcd0eb6b77437d9cf4e2b247b90ed7db197473141c8
Apr 20 19:34:56.673897 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:56.673864 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-544db" event={"ID":"97a95ae8-41f9-4caa-bb93-2a49b3d346c1","Type":"ContainerStarted","Data":"6a53cfe68aabec1573240fcd0eb6b77437d9cf4e2b247b90ed7db197473141c8"}
Apr 20 19:34:59.685466 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:59.685421 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-544db" event={"ID":"97a95ae8-41f9-4caa-bb93-2a49b3d346c1","Type":"ContainerStarted","Data":"64816909123e07d1ff3f63bb2b0bf6ef69476ca7056f9acd7ba1c021dfe92597"}
Apr 20 19:34:59.685889 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:59.685681 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-544db"
Apr 20 19:34:59.701881 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:34:59.701831 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-544db" podStartSLOduration=1.9985640980000001 podStartE2EDuration="4.701817255s" podCreationTimestamp="2026-04-20 19:34:55 +0000 UTC" firstStartedPulling="2026-04-20 19:34:56.309414448 +0000 UTC m=+579.001423380" lastFinishedPulling="2026-04-20 19:34:59.012667611 +0000 UTC m=+581.704676537" observedRunningTime="2026-04-20 19:34:59.700041336 +0000 UTC m=+582.392050275" watchObservedRunningTime="2026-04-20 19:34:59.701817255 +0000 UTC m=+582.393826236"
Apr 20 19:35:10.690328 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:10.690300 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-544db"
Apr 20 19:35:12.988347 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:12.988313 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-544db"]
Apr 20 19:35:12.988760 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:12.988525 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-544db" podUID="97a95ae8-41f9-4caa-bb93-2a49b3d346c1" containerName="limitador" containerID="cri-o://64816909123e07d1ff3f63bb2b0bf6ef69476ca7056f9acd7ba1c021dfe92597" gracePeriod=30
Apr 20 19:35:13.537399 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:13.537371 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-544db"
Apr 20 19:35:13.618011 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:13.617928 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5bmt5\" (UniqueName: \"kubernetes.io/projected/97a95ae8-41f9-4caa-bb93-2a49b3d346c1-kube-api-access-5bmt5\") pod \"97a95ae8-41f9-4caa-bb93-2a49b3d346c1\" (UID: \"97a95ae8-41f9-4caa-bb93-2a49b3d346c1\") "
Apr 20 19:35:13.618011 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:13.617967 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/97a95ae8-41f9-4caa-bb93-2a49b3d346c1-config-file\") pod \"97a95ae8-41f9-4caa-bb93-2a49b3d346c1\" (UID: \"97a95ae8-41f9-4caa-bb93-2a49b3d346c1\") "
Apr 20 19:35:13.623497 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:13.618590 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97a95ae8-41f9-4caa-bb93-2a49b3d346c1-config-file" (OuterVolumeSpecName: "config-file") pod "97a95ae8-41f9-4caa-bb93-2a49b3d346c1" (UID: "97a95ae8-41f9-4caa-bb93-2a49b3d346c1"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 20 19:35:13.623497 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:13.620993 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97a95ae8-41f9-4caa-bb93-2a49b3d346c1-kube-api-access-5bmt5" (OuterVolumeSpecName: "kube-api-access-5bmt5") pod "97a95ae8-41f9-4caa-bb93-2a49b3d346c1" (UID: "97a95ae8-41f9-4caa-bb93-2a49b3d346c1"). InnerVolumeSpecName "kube-api-access-5bmt5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 19:35:13.719145 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:13.719107 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5bmt5\" (UniqueName: \"kubernetes.io/projected/97a95ae8-41f9-4caa-bb93-2a49b3d346c1-kube-api-access-5bmt5\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\""
Apr 20 19:35:13.719145 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:13.719140 2570 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/97a95ae8-41f9-4caa-bb93-2a49b3d346c1-config-file\") on node \"ip-10-0-132-159.ec2.internal\" DevicePath \"\""
Apr 20 19:35:13.728710 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:13.728677 2570 generic.go:358] "Generic (PLEG): container finished" podID="97a95ae8-41f9-4caa-bb93-2a49b3d346c1" containerID="64816909123e07d1ff3f63bb2b0bf6ef69476ca7056f9acd7ba1c021dfe92597" exitCode=0
Apr 20 19:35:13.728847 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:13.728769 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-544db" event={"ID":"97a95ae8-41f9-4caa-bb93-2a49b3d346c1","Type":"ContainerDied","Data":"64816909123e07d1ff3f63bb2b0bf6ef69476ca7056f9acd7ba1c021dfe92597"}
Apr 20 19:35:13.728847 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:13.728809 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-544db" event={"ID":"97a95ae8-41f9-4caa-bb93-2a49b3d346c1","Type":"ContainerDied","Data":"6a53cfe68aabec1573240fcd0eb6b77437d9cf4e2b247b90ed7db197473141c8"}
Apr 20 19:35:13.728847 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:13.728825 2570 scope.go:117] "RemoveContainer" containerID="64816909123e07d1ff3f63bb2b0bf6ef69476ca7056f9acd7ba1c021dfe92597"
Apr 20 19:35:13.728958 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:13.728780 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-544db"
Apr 20 19:35:13.736576 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:13.736538 2570 scope.go:117] "RemoveContainer" containerID="64816909123e07d1ff3f63bb2b0bf6ef69476ca7056f9acd7ba1c021dfe92597"
Apr 20 19:35:13.736841 ip-10-0-132-159 kubenswrapper[2570]: E0420 19:35:13.736824 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"64816909123e07d1ff3f63bb2b0bf6ef69476ca7056f9acd7ba1c021dfe92597\": container with ID starting with 64816909123e07d1ff3f63bb2b0bf6ef69476ca7056f9acd7ba1c021dfe92597 not found: ID does not exist" containerID="64816909123e07d1ff3f63bb2b0bf6ef69476ca7056f9acd7ba1c021dfe92597"
Apr 20 19:35:13.736899 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:13.736849 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"64816909123e07d1ff3f63bb2b0bf6ef69476ca7056f9acd7ba1c021dfe92597"} err="failed to get container status \"64816909123e07d1ff3f63bb2b0bf6ef69476ca7056f9acd7ba1c021dfe92597\": rpc error: code = NotFound desc = could not find container \"64816909123e07d1ff3f63bb2b0bf6ef69476ca7056f9acd7ba1c021dfe92597\": container with ID starting with 64816909123e07d1ff3f63bb2b0bf6ef69476ca7056f9acd7ba1c021dfe92597 not found: ID does not exist"
Apr 20 19:35:13.749099 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:13.749068 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-544db"]
Apr 20 19:35:13.752404 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:13.752384 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-544db"]
Apr 20 19:35:13.963367 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:13.963337 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97a95ae8-41f9-4caa-bb93-2a49b3d346c1" path="/var/lib/kubelet/pods/97a95ae8-41f9-4caa-bb93-2a49b3d346c1/volumes"
Apr 20 19:35:17.315955 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:17.315921 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-xw6mb"]
Apr 20 19:35:17.316453 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:17.316387 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="97a95ae8-41f9-4caa-bb93-2a49b3d346c1" containerName="limitador"
Apr 20 19:35:17.316453 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:17.316406 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="97a95ae8-41f9-4caa-bb93-2a49b3d346c1" containerName="limitador"
Apr 20 19:35:17.316598 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:17.316495 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="97a95ae8-41f9-4caa-bb93-2a49b3d346c1" containerName="limitador"
Apr 20 19:35:17.320851 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:17.320830 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-xw6mb"
Apr 20 19:35:17.323497 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:17.323378 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\""
Apr 20 19:35:17.323497 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:17.323462 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-dstl6\""
Apr 20 19:35:17.326144 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:17.326122 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-xw6mb"]
Apr 20 19:35:17.346341 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:17.346311 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/fc0225ff-0e0c-4c48-84f9-7d4bf8aaac95-data\") pod \"postgres-868db5846d-xw6mb\" (UID: \"fc0225ff-0e0c-4c48-84f9-7d4bf8aaac95\") " pod="opendatahub/postgres-868db5846d-xw6mb"
Apr 20 19:35:17.346485 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:17.346352 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5kq7s\" (UniqueName: \"kubernetes.io/projected/fc0225ff-0e0c-4c48-84f9-7d4bf8aaac95-kube-api-access-5kq7s\") pod \"postgres-868db5846d-xw6mb\" (UID: \"fc0225ff-0e0c-4c48-84f9-7d4bf8aaac95\") " pod="opendatahub/postgres-868db5846d-xw6mb"
Apr 20 19:35:17.447701 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:17.447664 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/fc0225ff-0e0c-4c48-84f9-7d4bf8aaac95-data\") pod \"postgres-868db5846d-xw6mb\" (UID: \"fc0225ff-0e0c-4c48-84f9-7d4bf8aaac95\") " pod="opendatahub/postgres-868db5846d-xw6mb"
Apr 20 19:35:17.447872 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:17.447709 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5kq7s\" (UniqueName: \"kubernetes.io/projected/fc0225ff-0e0c-4c48-84f9-7d4bf8aaac95-kube-api-access-5kq7s\") pod \"postgres-868db5846d-xw6mb\" (UID: \"fc0225ff-0e0c-4c48-84f9-7d4bf8aaac95\") " pod="opendatahub/postgres-868db5846d-xw6mb"
Apr 20 19:35:17.448046 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:17.448025 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/fc0225ff-0e0c-4c48-84f9-7d4bf8aaac95-data\") pod \"postgres-868db5846d-xw6mb\" (UID: \"fc0225ff-0e0c-4c48-84f9-7d4bf8aaac95\") " pod="opendatahub/postgres-868db5846d-xw6mb"
Apr 20 19:35:17.456019 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:17.455995 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5kq7s\" (UniqueName: \"kubernetes.io/projected/fc0225ff-0e0c-4c48-84f9-7d4bf8aaac95-kube-api-access-5kq7s\") pod \"postgres-868db5846d-xw6mb\" (UID: \"fc0225ff-0e0c-4c48-84f9-7d4bf8aaac95\") " pod="opendatahub/postgres-868db5846d-xw6mb"
Apr 20 19:35:17.632406 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:17.632314 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-xw6mb"
Apr 20 19:35:17.748805 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:17.748782 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-xw6mb"]
Apr 20 19:35:17.750838 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:35:17.750810 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc0225ff_0e0c_4c48_84f9_7d4bf8aaac95.slice/crio-7f7c3ddc21e265f99566ec26c0f5d304c1f8d24a1ef8a80bef280b8e18062fc9 WatchSource:0}: Error finding container 7f7c3ddc21e265f99566ec26c0f5d304c1f8d24a1ef8a80bef280b8e18062fc9: Status 404 returned error can't find the container with id 7f7c3ddc21e265f99566ec26c0f5d304c1f8d24a1ef8a80bef280b8e18062fc9
Apr 20 19:35:17.911600 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:17.911510 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9rxk8_0481b01c-d63f-4503-aacf-fcdb030d79e9/ovn-acl-logging/0.log"
Apr 20 19:35:17.911904 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:17.911886 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9rxk8_0481b01c-d63f-4503-aacf-fcdb030d79e9/ovn-acl-logging/0.log"
Apr 20 19:35:18.745947 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:18.745909 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-xw6mb" event={"ID":"fc0225ff-0e0c-4c48-84f9-7d4bf8aaac95","Type":"ContainerStarted","Data":"7f7c3ddc21e265f99566ec26c0f5d304c1f8d24a1ef8a80bef280b8e18062fc9"}
Apr 20 19:35:23.133858 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:23.133834 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\""
Apr 20 19:35:23.767030 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:23.766986 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-xw6mb" event={"ID":"fc0225ff-0e0c-4c48-84f9-7d4bf8aaac95","Type":"ContainerStarted","Data":"0178d7e1fc8c5f1511e2ef96e59313d3c51e8082f6e5597190e570018d7e2dae"}
Apr 20 19:35:23.767235 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:23.767093 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-xw6mb"
Apr 20 19:35:23.783713 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:23.783663 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/postgres-868db5846d-xw6mb" podStartSLOduration=1.404460382 podStartE2EDuration="6.783648998s" podCreationTimestamp="2026-04-20 19:35:17 +0000 UTC" firstStartedPulling="2026-04-20 19:35:17.751884297 +0000 UTC m=+600.443893218" lastFinishedPulling="2026-04-20 19:35:23.131072917 +0000 UTC m=+605.823081834" observedRunningTime="2026-04-20 19:35:23.782156728 +0000 UTC m=+606.474165678" watchObservedRunningTime="2026-04-20 19:35:23.783648998 +0000 UTC m=+606.475657938"
Apr 20 19:35:29.798269 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:29.798192 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-xw6mb"
Apr 20 19:35:38.734455 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:38.734414 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-k8x2w"]
Apr 20 19:35:38.747890 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:38.747861 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-k8x2w"]
Apr 20 19:35:38.748038 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:38.747965 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/keycloak-operator-5c4df598dd-k8x2w"
Apr 20 19:35:38.750835 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:38.750813 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"kube-root-ca.crt\""
Apr 20 19:35:38.751839 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:38.751812 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"keycloak-system\"/\"openshift-service-ca.crt\""
Apr 20 19:35:38.751839 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:38.751828 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"keycloak-system\"/\"keycloak-operator-dockercfg-njgm2\""
Apr 20 19:35:38.815506 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:38.815474 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gsk4\" (UniqueName: \"kubernetes.io/projected/fe1f796c-4531-47ca-8672-abc2fbb90fc4-kube-api-access-9gsk4\") pod \"keycloak-operator-5c4df598dd-k8x2w\" (UID: \"fe1f796c-4531-47ca-8672-abc2fbb90fc4\") " pod="keycloak-system/keycloak-operator-5c4df598dd-k8x2w"
Apr 20 19:35:38.916275 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:38.916239 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9gsk4\" (UniqueName: \"kubernetes.io/projected/fe1f796c-4531-47ca-8672-abc2fbb90fc4-kube-api-access-9gsk4\") pod \"keycloak-operator-5c4df598dd-k8x2w\" (UID: \"fe1f796c-4531-47ca-8672-abc2fbb90fc4\") " pod="keycloak-system/keycloak-operator-5c4df598dd-k8x2w"
Apr 20 19:35:38.925283 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:38.925251 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gsk4\" (UniqueName: \"kubernetes.io/projected/fe1f796c-4531-47ca-8672-abc2fbb90fc4-kube-api-access-9gsk4\") pod \"keycloak-operator-5c4df598dd-k8x2w\" (UID: \"fe1f796c-4531-47ca-8672-abc2fbb90fc4\") " pod="keycloak-system/keycloak-operator-5c4df598dd-k8x2w"
Apr 20 19:35:39.058305 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:39.058214 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="keycloak-system/keycloak-operator-5c4df598dd-k8x2w"
Apr 20 19:35:39.177333 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:39.177302 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["keycloak-system/keycloak-operator-5c4df598dd-k8x2w"]
Apr 20 19:35:39.180288 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:35:39.180263 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfe1f796c_4531_47ca_8672_abc2fbb90fc4.slice/crio-2f16079c6221ad951706794ef3e7cd357ae36d9001d25400ffee091f623a2f45 WatchSource:0}: Error finding container 2f16079c6221ad951706794ef3e7cd357ae36d9001d25400ffee091f623a2f45: Status 404 returned error can't find the container with id 2f16079c6221ad951706794ef3e7cd357ae36d9001d25400ffee091f623a2f45
Apr 20 19:35:39.825552 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:39.825510 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/keycloak-operator-5c4df598dd-k8x2w" event={"ID":"fe1f796c-4531-47ca-8672-abc2fbb90fc4","Type":"ContainerStarted","Data":"2f16079c6221ad951706794ef3e7cd357ae36d9001d25400ffee091f623a2f45"}
Apr 20 19:35:45.847968 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:45.847926 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="keycloak-system/keycloak-operator-5c4df598dd-k8x2w" event={"ID":"fe1f796c-4531-47ca-8672-abc2fbb90fc4","Type":"ContainerStarted","Data":"c848fe601aef5a8c9f3c648b345e9ffa91e9b3b879845ee4189e6f5635470e54"}
Apr 20 19:35:45.864874 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:35:45.864829 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="keycloak-system/keycloak-operator-5c4df598dd-k8x2w" podStartSLOduration=1.986546685 podStartE2EDuration="7.864815936s" podCreationTimestamp="2026-04-20 19:35:38 +0000 UTC" firstStartedPulling="2026-04-20 19:35:39.181863604 +0000 UTC m=+621.873872521" lastFinishedPulling="2026-04-20 19:35:45.060132851 +0000 UTC m=+627.752141772" observedRunningTime="2026-04-20 19:35:45.862244673 +0000 UTC m=+628.554253612" watchObservedRunningTime="2026-04-20 19:35:45.864815936 +0000 UTC m=+628.556824875"
Apr 20 19:39:52.898195 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:39:52.898149 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-rkv9n_c72c7707-c35d-4ef8-b575-656cc958dc1c/manager/0.log"
Apr 20 19:39:53.256424 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:39:53.256393 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-q5v8h_7dcae940-547e-4ab6-bcf6-681672895e3e/manager/2.log"
Apr 20 19:39:53.490499 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:39:53.490468 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-9f747d685-4dr77_eaded8f8-1e1e-450b-96ba-62ad59fcfd88/manager/0.log"
Apr 20 19:39:53.712065 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:39:53.712034 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-xw6mb_fc0225ff-0e0c-4c48-84f9-7d4bf8aaac95/postgres/0.log"
Apr 20 19:39:55.526246 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:39:55.526212 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-c6zjz_270927b8-f48c-4d95-aaca-2f6dbf1a7a7a/manager/0.log"
Apr 20 19:39:56.192905 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:39:56.192831 2570 log.go:25]
"Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-zdbjz_8d1b4965-d776-4148-a045-ff9e9d1ac69f/discovery/0.log" Apr 20 19:39:56.401068 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:39:56.401040 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-79f76cb8cc-vj77z_f3399bac-03c7-47b5-8e0f-701d3d618016/kube-auth-proxy/0.log" Apr 20 19:40:04.393542 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:04.393505 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-9d99g_e48bd3bb-a360-42e2-bee7-064799310567/global-pull-secret-syncer/0.log" Apr 20 19:40:04.553610 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:04.553576 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-s2wnw_7b91853c-4ad1-4d29-b26c-3742343d8630/konnectivity-agent/0.log" Apr 20 19:40:04.604742 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:04.604711 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-159.ec2.internal_baec555e5e2b442b2cad3d99698ce3db/haproxy/0.log" Apr 20 19:40:09.566869 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:09.566841 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-55c7f4c975-c6zjz_270927b8-f48c-4d95-aaca-2f6dbf1a7a7a/manager/0.log" Apr 20 19:40:11.670394 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:11.670359 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jp456_20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e/node-exporter/0.log" Apr 20 19:40:11.698325 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:11.698293 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jp456_20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e/kube-rbac-proxy/0.log" Apr 20 19:40:11.724801 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:11.724772 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jp456_20e8e9b6-e7da-4dc2-ae87-ecd10feafb2e/init-textfile/0.log" Apr 20 19:40:13.261407 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:13.261369 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-qqfns/perf-node-gather-daemonset-x74rz"] Apr 20 19:40:13.264899 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:13.264875 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-x74rz" Apr 20 19:40:13.267751 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:13.267726 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qqfns\"/\"kube-root-ca.crt\"" Apr 20 19:40:13.268602 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:13.268580 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-qqfns\"/\"default-dockercfg-gfb8d\"" Apr 20 19:40:13.268712 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:13.268581 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-qqfns\"/\"openshift-service-ca.crt\"" Apr 20 19:40:13.273387 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:13.273360 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qqfns/perf-node-gather-daemonset-x74rz"] Apr 20 19:40:13.381816 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:13.381778 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8f3df3d3-d7a9-4222-9059-4f04def3e9aa-sys\") pod \"perf-node-gather-daemonset-x74rz\" (UID: \"8f3df3d3-d7a9-4222-9059-4f04def3e9aa\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-x74rz" Apr 20 19:40:13.381816 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:13.381817 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8f3df3d3-d7a9-4222-9059-4f04def3e9aa-proc\") pod \"perf-node-gather-daemonset-x74rz\" (UID: \"8f3df3d3-d7a9-4222-9059-4f04def3e9aa\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-x74rz" Apr 20 19:40:13.382038 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:13.381837 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8f3df3d3-d7a9-4222-9059-4f04def3e9aa-podres\") pod \"perf-node-gather-daemonset-x74rz\" (UID: \"8f3df3d3-d7a9-4222-9059-4f04def3e9aa\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-x74rz" Apr 20 19:40:13.382038 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:13.381942 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wj6fh\" (UniqueName: \"kubernetes.io/projected/8f3df3d3-d7a9-4222-9059-4f04def3e9aa-kube-api-access-wj6fh\") pod \"perf-node-gather-daemonset-x74rz\" (UID: \"8f3df3d3-d7a9-4222-9059-4f04def3e9aa\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-x74rz" Apr 20 19:40:13.382038 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:13.382028 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8f3df3d3-d7a9-4222-9059-4f04def3e9aa-lib-modules\") pod \"perf-node-gather-daemonset-x74rz\" (UID: \"8f3df3d3-d7a9-4222-9059-4f04def3e9aa\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-x74rz" Apr 20 19:40:13.483371 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:13.483323 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8f3df3d3-d7a9-4222-9059-4f04def3e9aa-lib-modules\") pod \"perf-node-gather-daemonset-x74rz\" (UID: \"8f3df3d3-d7a9-4222-9059-4f04def3e9aa\") " 
pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-x74rz" Apr 20 19:40:13.483618 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:13.483381 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8f3df3d3-d7a9-4222-9059-4f04def3e9aa-sys\") pod \"perf-node-gather-daemonset-x74rz\" (UID: \"8f3df3d3-d7a9-4222-9059-4f04def3e9aa\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-x74rz" Apr 20 19:40:13.483618 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:13.483410 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8f3df3d3-d7a9-4222-9059-4f04def3e9aa-proc\") pod \"perf-node-gather-daemonset-x74rz\" (UID: \"8f3df3d3-d7a9-4222-9059-4f04def3e9aa\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-x74rz" Apr 20 19:40:13.483618 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:13.483437 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8f3df3d3-d7a9-4222-9059-4f04def3e9aa-podres\") pod \"perf-node-gather-daemonset-x74rz\" (UID: \"8f3df3d3-d7a9-4222-9059-4f04def3e9aa\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-x74rz" Apr 20 19:40:13.483618 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:13.483472 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wj6fh\" (UniqueName: \"kubernetes.io/projected/8f3df3d3-d7a9-4222-9059-4f04def3e9aa-kube-api-access-wj6fh\") pod \"perf-node-gather-daemonset-x74rz\" (UID: \"8f3df3d3-d7a9-4222-9059-4f04def3e9aa\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-x74rz" Apr 20 19:40:13.483618 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:13.483531 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8f3df3d3-d7a9-4222-9059-4f04def3e9aa-lib-modules\") pod \"perf-node-gather-daemonset-x74rz\" (UID: \"8f3df3d3-d7a9-4222-9059-4f04def3e9aa\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-x74rz" Apr 20 19:40:13.483618 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:13.483545 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/8f3df3d3-d7a9-4222-9059-4f04def3e9aa-proc\") pod \"perf-node-gather-daemonset-x74rz\" (UID: \"8f3df3d3-d7a9-4222-9059-4f04def3e9aa\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-x74rz" Apr 20 19:40:13.483618 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:13.483531 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8f3df3d3-d7a9-4222-9059-4f04def3e9aa-sys\") pod \"perf-node-gather-daemonset-x74rz\" (UID: \"8f3df3d3-d7a9-4222-9059-4f04def3e9aa\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-x74rz" Apr 20 19:40:13.483951 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:13.483618 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/8f3df3d3-d7a9-4222-9059-4f04def3e9aa-podres\") pod \"perf-node-gather-daemonset-x74rz\" (UID: \"8f3df3d3-d7a9-4222-9059-4f04def3e9aa\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-x74rz" Apr 20 19:40:13.493196 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:13.493166 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-wj6fh\" (UniqueName: \"kubernetes.io/projected/8f3df3d3-d7a9-4222-9059-4f04def3e9aa-kube-api-access-wj6fh\") pod \"perf-node-gather-daemonset-x74rz\" (UID: \"8f3df3d3-d7a9-4222-9059-4f04def3e9aa\") " pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-x74rz" Apr 20 19:40:13.553139 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:13.553054 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-v8fl7_667e99fd-c507-4e05-a425-bda15ee82168/networking-console-plugin/0.log" Apr 20 19:40:13.575776 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:13.575744 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-x74rz" Apr 20 19:40:13.707251 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:13.707104 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-qqfns/perf-node-gather-daemonset-x74rz"] Apr 20 19:40:13.709486 ip-10-0-132-159 kubenswrapper[2570]: W0420 19:40:13.709455 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8f3df3d3_d7a9_4222_9059_4f04def3e9aa.slice/crio-58de9a2e60149f29e39c2aee9a9a7f469938ed79b1f15cf3e1e3a5cd3db8f06b WatchSource:0}: Error finding container 58de9a2e60149f29e39c2aee9a9a7f469938ed79b1f15cf3e1e3a5cd3db8f06b: Status 404 returned error can't find the container with id 58de9a2e60149f29e39c2aee9a9a7f469938ed79b1f15cf3e1e3a5cd3db8f06b Apr 20 19:40:13.710887 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:13.710860 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 19:40:14.712789 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:14.712754 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-x74rz" event={"ID":"8f3df3d3-d7a9-4222-9059-4f04def3e9aa","Type":"ContainerStarted","Data":"a309be050c5a4ade77c6934c0b5de33f0731f2531eed2a65b186585dbc1d37b2"} Apr 20 19:40:14.712789 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:14.712798 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-x74rz" event={"ID":"8f3df3d3-d7a9-4222-9059-4f04def3e9aa","Type":"ContainerStarted","Data":"58de9a2e60149f29e39c2aee9a9a7f469938ed79b1f15cf3e1e3a5cd3db8f06b"} Apr 20 19:40:14.713250 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:14.712885 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-x74rz" Apr 20 19:40:14.731673 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:14.731624 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-x74rz" podStartSLOduration=1.731609524 podStartE2EDuration="1.731609524s" podCreationTimestamp="2026-04-20 19:40:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 19:40:14.730282855 +0000 UTC m=+897.422291793" watchObservedRunningTime="2026-04-20 19:40:14.731609524 +0000 UTC m=+897.423618465" Apr 20 19:40:16.096684 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:16.096650 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-tf6mk_1c8d3c86-b40b-483d-8c21-32ed7bd9f45e/dns/0.log" Apr 20 19:40:16.121515 ip-10-0-132-159 kubenswrapper[2570]: I0420 
19:40:16.121492 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-tf6mk_1c8d3c86-b40b-483d-8c21-32ed7bd9f45e/kube-rbac-proxy/0.log" Apr 20 19:40:16.146511 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:16.146481 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-2b57t_1f14825d-bc77-4c61-9be4-8a25e8c7134b/dns-node-resolver/0.log" Apr 20 19:40:16.764766 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:16.764730 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-d2857_d1de0681-7e7f-40b8-aef2-7bbbcf5c25e7/node-ca/0.log" Apr 20 19:40:17.733990 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:17.733958 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-zdbjz_8d1b4965-d776-4148-a045-ff9e9d1ac69f/discovery/0.log" Apr 20 19:40:17.782284 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:17.782251 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-79f76cb8cc-vj77z_f3399bac-03c7-47b5-8e0f-701d3d618016/kube-auth-proxy/0.log" Apr 20 19:40:17.931359 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:17.931331 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9rxk8_0481b01c-d63f-4503-aacf-fcdb030d79e9/ovn-acl-logging/0.log" Apr 20 19:40:17.933217 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:17.933188 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9rxk8_0481b01c-d63f-4503-aacf-fcdb030d79e9/ovn-acl-logging/0.log" Apr 20 19:40:18.453150 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:18.453098 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-s4kbs_cc2bb5fd-130e-4be9-a03f-eb9ac877ee23/serve-healthcheck-canary/0.log" Apr 20 19:40:18.973079 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:18.973043 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4zbff_b94a5636-dfff-4c1d-8bb7-149955529401/kube-rbac-proxy/0.log" Apr 20 19:40:19.001486 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:19.001454 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4zbff_b94a5636-dfff-4c1d-8bb7-149955529401/exporter/0.log" Apr 20 19:40:19.024814 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:19.024785 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4zbff_b94a5636-dfff-4c1d-8bb7-149955529401/extractor/0.log" Apr 20 19:40:20.727723 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:20.727697 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-qqfns/perf-node-gather-daemonset-x74rz" Apr 20 19:40:21.201169 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:21.201137 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-rkv9n_c72c7707-c35d-4ef8-b575-656cc958dc1c/manager/0.log" Apr 20 19:40:21.279538 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:21.279505 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-q5v8h_7dcae940-547e-4ab6-bcf6-681672895e3e/manager/1.log" Apr 20 19:40:21.298238 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:21.298204 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-q5v8h_7dcae940-547e-4ab6-bcf6-681672895e3e/manager/2.log" Apr 20 19:40:21.381136 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:21.381099 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-9f747d685-4dr77_eaded8f8-1e1e-450b-96ba-62ad59fcfd88/manager/0.log" Apr 20 19:40:21.432061 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:21.432033 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-xw6mb_fc0225ff-0e0c-4c48-84f9-7d4bf8aaac95/postgres/0.log" Apr 20 19:40:22.980489 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:22.980455 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-sbvvl_71df5159-6990-4676-9a0b-3dc288aa9b61/openshift-lws-operator/0.log" Apr 20 19:40:29.111350 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:29.111321 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p5rrh_414b2f51-6741-47fc-9528-f08b5a635fba/kube-multus-additional-cni-plugins/0.log" Apr 20 19:40:29.135467 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:29.135441 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p5rrh_414b2f51-6741-47fc-9528-f08b5a635fba/egress-router-binary-copy/0.log" Apr 20 19:40:29.159903 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:29.159880 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p5rrh_414b2f51-6741-47fc-9528-f08b5a635fba/cni-plugins/0.log" Apr 20 19:40:29.184346 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:29.184318 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p5rrh_414b2f51-6741-47fc-9528-f08b5a635fba/bond-cni-plugin/0.log" Apr 20 19:40:29.209587 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:29.209537 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p5rrh_414b2f51-6741-47fc-9528-f08b5a635fba/routeoverride-cni/0.log" Apr 20 19:40:29.234339 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:29.234313 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p5rrh_414b2f51-6741-47fc-9528-f08b5a635fba/whereabouts-cni-bincopy/0.log" Apr 20 19:40:29.261999 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:29.261972 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-p5rrh_414b2f51-6741-47fc-9528-f08b5a635fba/whereabouts-cni/0.log" Apr 20 19:40:29.578863 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:29.578836 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jfcc7_a587ed5a-1c69-439a-a673-9e3e5479ec27/kube-multus/0.log" Apr 20 19:40:29.685364 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:29.685338 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cvbdc_220904c2-fd82-44e5-9904-33aeca86dcee/network-metrics-daemon/0.log" Apr 20 19:40:29.706335 ip-10-0-132-159 kubenswrapper[2570]: I0420 19:40:29.706307 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-cvbdc_220904c2-fd82-44e5-9904-33aeca86dcee/kube-rbac-proxy/0.log" Apr 20 19:40:30.906784 ip-10-0-132-159 kubenswrapper[2570]: 
I0420 19:40:30.906750 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-9rxk8_0481b01c-d63f-4503-aacf-fcdb030d79e9/ovn-controller/0.log"