Apr 19 12:07:05.429534 ip-10-0-140-237 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 19 12:07:05.429546 ip-10-0-140-237 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 19 12:07:05.429555 ip-10-0-140-237 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 19 12:07:05.429868 ip-10-0-140-237 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 19 12:07:15.589336 ip-10-0-140-237 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 19 12:07:15.589352 ip-10-0-140-237 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 3d0973cb809a4b4a91135543c99d0382 --
Apr 19 12:09:45.815098 ip-10-0-140-237 systemd[1]: Starting Kubernetes Kubelet...
Apr 19 12:09:46.256634 ip-10-0-140-237 kubenswrapper[2572]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 19 12:09:46.256634 ip-10-0-140-237 kubenswrapper[2572]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 19 12:09:46.256634 ip-10-0-140-237 kubenswrapper[2572]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 19 12:09:46.256634 ip-10-0-140-237 kubenswrapper[2572]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 19 12:09:46.256634 ip-10-0-140-237 kubenswrapper[2572]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 19 12:09:46.257910 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.257281 2572 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 19 12:09:46.261587 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261565 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 19 12:09:46.261587 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261584 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 12:09:46.261587 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261588 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 19 12:09:46.261587 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261592 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 12:09:46.261587 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261595 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 19 12:09:46.261794 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261598 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 19 12:09:46.261794 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261601 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 12:09:46.261794 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261604 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 12:09:46.261794 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261607 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 12:09:46.261794 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261610 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 12:09:46.261794 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261612 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 19 12:09:46.261794 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261615 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 19 12:09:46.261794 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261617 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 12:09:46.261794 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261620 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 12:09:46.261794 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261623 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 12:09:46.261794 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261625 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 19 12:09:46.261794 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261628 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 19 12:09:46.261794 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261631 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 19 12:09:46.261794 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261633 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 19 12:09:46.261794 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261636 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 12:09:46.261794 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261639 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 12:09:46.261794 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261641 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 19 12:09:46.261794 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261644 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 12:09:46.261794 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261646 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 19 12:09:46.262251 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261652 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 19 12:09:46.262251 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261655 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 12:09:46.262251 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261659 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 19 12:09:46.262251 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261663 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 12:09:46.262251 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261666 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 19 12:09:46.262251 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261670 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 12:09:46.262251 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261673 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 19 12:09:46.262251 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261676 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 19 12:09:46.262251 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261678 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 19 12:09:46.262251 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261681 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19 12:09:46.262251 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261683 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 19 12:09:46.262251 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261686 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 19 12:09:46.262251 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261689 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 19 12:09:46.262251 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261693 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 12:09:46.262251 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261696 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 19 12:09:46.262251 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261699 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 19 12:09:46.262251 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261702 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 12:09:46.262251 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261705 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 19 12:09:46.262251 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261707 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 19 12:09:46.262251 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261710 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 19 12:09:46.262784 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261713 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 12:09:46.262784 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261716 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 12:09:46.262784 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261719 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 12:09:46.262784 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261722 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 19 12:09:46.262784 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261724 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 19 12:09:46.262784 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261727 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 12:09:46.262784 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261729 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 19 12:09:46.262784 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261732 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 12:09:46.262784 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261734 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 19 12:09:46.262784 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261737 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 12:09:46.262784 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261740 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 19 12:09:46.262784 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261742 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 12:09:46.262784 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261745 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 12:09:46.262784 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261747 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 12:09:46.262784 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261750 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 19 12:09:46.262784 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261752 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 12:09:46.262784 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261768 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 12:09:46.262784 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261773 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 12:09:46.262784 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261775 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 12:09:46.262784 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261778 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 12:09:46.263270 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261780 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 19 12:09:46.263270 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261783 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 19 12:09:46.263270 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261785 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 12:09:46.263270 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261788 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 12:09:46.263270 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261790 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 19 12:09:46.263270 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261793 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 19 12:09:46.263270 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261796 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 12:09:46.263270 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261799 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 12:09:46.263270 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261804 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 19 12:09:46.263270 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261809 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 12:09:46.263270 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261813 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 12:09:46.263270 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261816 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 19 12:09:46.263270 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261819 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 12:09:46.263270 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261822 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 12:09:46.263270 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261825 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 12:09:46.263270 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261829 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 19 12:09:46.263270 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261831 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 19 12:09:46.263270 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261834 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 19 12:09:46.263270 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261837 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 12:09:46.263737 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261839 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 12:09:46.263737 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261842 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 19 12:09:46.263737 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.261844 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 12:09:46.263737 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262238 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 19 12:09:46.263737 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262242 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 12:09:46.263737 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262245 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 12:09:46.263737 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262248 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 19 12:09:46.263737 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262251 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 12:09:46.263737 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262253 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 12:09:46.263737 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262256 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 19 12:09:46.263737 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262259 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 12:09:46.263737 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262262 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 12:09:46.263737 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262265 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 12:09:46.263737 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262268 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 12:09:46.263737 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262271 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 19 12:09:46.263737 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262274 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 19 12:09:46.263737 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262276 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 19 12:09:46.263737 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262278 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 12:09:46.263737 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262281 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 12:09:46.263737 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262285 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 12:09:46.264228 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262288 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 19 12:09:46.264228 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262291 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 19 12:09:46.264228 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262293 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 19 12:09:46.264228 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262295 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 19 12:09:46.264228 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262298 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 19 12:09:46.264228 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262301 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 19 12:09:46.264228 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262303 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 19 12:09:46.264228 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262306 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 12:09:46.264228 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262309 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 12:09:46.264228 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262311 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 12:09:46.264228 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262316 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 19 12:09:46.264228 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262319 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 12:09:46.264228 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262322 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 19 12:09:46.264228 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262325 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 12:09:46.264228 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262328 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 12:09:46.264228 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262331 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 19 12:09:46.264228 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262333 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 12:09:46.264228 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262336 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 19 12:09:46.264228 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262338 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 12:09:46.264705 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262341 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 19 12:09:46.264705 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262343 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 12:09:46.264705 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262346 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 12:09:46.264705 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262348 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 12:09:46.264705 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262351 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 12:09:46.264705 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262353 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 12:09:46.264705 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262356 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 12:09:46.264705 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262358 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 19 12:09:46.264705 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262361 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 19 12:09:46.264705 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262364 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 19 12:09:46.264705 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262367 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 12:09:46.264705 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262370 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 19 12:09:46.264705 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262374 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 19 12:09:46.264705 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262377 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 12:09:46.264705 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262379 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 19 12:09:46.264705 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262382 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 19 12:09:46.264705 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262385 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 12:09:46.264705 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262387 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 19 12:09:46.264705 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262390 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 19 12:09:46.264705 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262393 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 12:09:46.265221 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262395 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 19 12:09:46.265221 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262398 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 19 12:09:46.265221 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262401 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 12:09:46.265221 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262403 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 19 12:09:46.265221 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262405 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 12:09:46.265221 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262409 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 12:09:46.265221 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262411 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 12:09:46.265221 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262414 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 12:09:46.265221 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262417 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 19 12:09:46.265221 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262419 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 19 12:09:46.265221 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262422 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 19 12:09:46.265221 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262424 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 19 12:09:46.265221 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262427 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 19 12:09:46.265221 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262429 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 12:09:46.265221 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262432 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19 12:09:46.265221 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262434 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 19 12:09:46.265221 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262436 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 12:09:46.265221 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262439 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 19 12:09:46.265221 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262441 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 12:09:46.265221 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262444 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 12:09:46.265708 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262446 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 12:09:46.265708 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262448 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 19 12:09:46.265708 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262451 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 12:09:46.265708 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262453 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 12:09:46.265708 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262457 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 12:09:46.265708 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262459 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 19 12:09:46.265708 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262462 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 19 12:09:46.265708 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262465 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 19 12:09:46.265708 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262467 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 12:09:46.265708 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.262469 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 19 12:09:46.265708 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263834 2572 flags.go:64] FLAG: --address="0.0.0.0"
Apr 19 12:09:46.265708 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263845 2572 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 19 12:09:46.265708 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263853 2572 flags.go:64] FLAG: --anonymous-auth="true"
Apr 19 12:09:46.265708 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263858 2572 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 19 12:09:46.265708 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263864 2572 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 19 12:09:46.265708 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263867 2572 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 19 12:09:46.265708 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263871 2572 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 19 12:09:46.265708 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263876 2572 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 19 12:09:46.265708 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263879 2572 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 19 12:09:46.265708 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263882 2572 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 19 12:09:46.266243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263886 2572 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 19 12:09:46.266243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263889 2572 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 19 12:09:46.266243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263892 2572 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 19 12:09:46.266243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263895 2572 flags.go:64] FLAG: --cgroup-root=""
Apr 19 12:09:46.266243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263898 2572 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 19 12:09:46.266243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263901 2572 flags.go:64] FLAG: --client-ca-file=""
Apr 19 12:09:46.266243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263903 2572 flags.go:64] FLAG: --cloud-config=""
Apr 19 12:09:46.266243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263906 2572 flags.go:64] FLAG: --cloud-provider="external"
Apr 19 12:09:46.266243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263909 2572 flags.go:64] FLAG: --cluster-dns="[]"
Apr 19 12:09:46.266243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263913 2572 flags.go:64] FLAG: --cluster-domain=""
Apr 19 12:09:46.266243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263916 2572 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 19 12:09:46.266243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263919 2572 flags.go:64] FLAG: --config-dir=""
Apr 19 12:09:46.266243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263922 2572 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 19 12:09:46.266243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263926 2572 flags.go:64] FLAG: --container-log-max-files="5"
Apr 19 12:09:46.266243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263929 2572 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 19 12:09:46.266243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263932 2572 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 19 12:09:46.266243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263936 2572 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 19 12:09:46.266243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263939 2572 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 19 12:09:46.266243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263943 2572 flags.go:64] FLAG: --contention-profiling="false"
Apr 19 12:09:46.266243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263946 2572 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 19 12:09:46.266243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263948 2572 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 19 12:09:46.266243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263951 2572 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 19 12:09:46.266243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263954 2572 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 19 12:09:46.266243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263959 2572 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 19 12:09:46.266243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263962 2572 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 19 12:09:46.266243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263964 2572 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 19 12:09:46.266872 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263967 2572 flags.go:64] FLAG: --enable-load-reader="false"
Apr 19 12:09:46.266872 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263971 2572 flags.go:64] FLAG: --enable-server="true"
Apr 19 12:09:46.266872 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263974 2572 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 19 12:09:46.266872 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263979 2572 flags.go:64] FLAG: --event-burst="100"
Apr 19 12:09:46.266872 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263982 2572 flags.go:64] FLAG: --event-qps="50"
Apr 19 12:09:46.266872 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263985 2572 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 19 12:09:46.266872 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263988 2572 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 19 12:09:46.266872 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263991 2572 flags.go:64] FLAG: --eviction-hard=""
Apr 19 12:09:46.266872 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263995 2572 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 19 12:09:46.266872 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.263997 2572 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 19 12:09:46.266872 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264000 2572 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 19 12:09:46.266872 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264003 2572 flags.go:64] FLAG: --eviction-soft=""
Apr 19 12:09:46.266872 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264006 2572 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 19 12:09:46.266872 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264009 2572 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 19 12:09:46.266872 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264012 2572 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 19 12:09:46.266872 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264015 2572 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 19 12:09:46.266872 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264018 2572 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 19 12:09:46.266872 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264022 2572 flags.go:64] FLAG:
--fail-swap-on="true" Apr 19 12:09:46.266872 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264024 2572 flags.go:64] FLAG: --feature-gates="" Apr 19 12:09:46.266872 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264029 2572 flags.go:64] FLAG: --file-check-frequency="20s" Apr 19 12:09:46.266872 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264032 2572 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 19 12:09:46.266872 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264035 2572 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 19 12:09:46.266872 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264039 2572 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 19 12:09:46.266872 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264042 2572 flags.go:64] FLAG: --healthz-port="10248" Apr 19 12:09:46.266872 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264045 2572 flags.go:64] FLAG: --help="false" Apr 19 12:09:46.267511 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264048 2572 flags.go:64] FLAG: --hostname-override="ip-10-0-140-237.ec2.internal" Apr 19 12:09:46.267511 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264051 2572 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 19 12:09:46.267511 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264054 2572 flags.go:64] FLAG: --http-check-frequency="20s" Apr 19 12:09:46.267511 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264057 2572 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 19 12:09:46.267511 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264060 2572 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 19 12:09:46.267511 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264063 2572 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 19 12:09:46.267511 ip-10-0-140-237 kubenswrapper[2572]: I0419 
12:09:46.264066 2572 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 19 12:09:46.267511 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264069 2572 flags.go:64] FLAG: --image-service-endpoint="" Apr 19 12:09:46.267511 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264072 2572 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 19 12:09:46.267511 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264075 2572 flags.go:64] FLAG: --kube-api-burst="100" Apr 19 12:09:46.267511 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264078 2572 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 19 12:09:46.267511 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264081 2572 flags.go:64] FLAG: --kube-api-qps="50" Apr 19 12:09:46.267511 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264084 2572 flags.go:64] FLAG: --kube-reserved="" Apr 19 12:09:46.267511 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264087 2572 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 19 12:09:46.267511 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264090 2572 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 19 12:09:46.267511 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264093 2572 flags.go:64] FLAG: --kubelet-cgroups="" Apr 19 12:09:46.267511 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264096 2572 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 19 12:09:46.267511 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264098 2572 flags.go:64] FLAG: --lock-file="" Apr 19 12:09:46.267511 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264101 2572 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 19 12:09:46.267511 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264104 2572 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 19 12:09:46.267511 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264107 2572 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 19 12:09:46.267511 
ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264113 2572 flags.go:64] FLAG: --log-json-split-stream="false" Apr 19 12:09:46.267511 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264116 2572 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 19 12:09:46.268117 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264119 2572 flags.go:64] FLAG: --log-text-split-stream="false" Apr 19 12:09:46.268117 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264122 2572 flags.go:64] FLAG: --logging-format="text" Apr 19 12:09:46.268117 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264125 2572 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 19 12:09:46.268117 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264128 2572 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 19 12:09:46.268117 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264131 2572 flags.go:64] FLAG: --manifest-url="" Apr 19 12:09:46.268117 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264134 2572 flags.go:64] FLAG: --manifest-url-header="" Apr 19 12:09:46.268117 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264142 2572 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 19 12:09:46.268117 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264145 2572 flags.go:64] FLAG: --max-open-files="1000000" Apr 19 12:09:46.268117 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264149 2572 flags.go:64] FLAG: --max-pods="110" Apr 19 12:09:46.268117 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264152 2572 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 19 12:09:46.268117 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264155 2572 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 19 12:09:46.268117 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264158 2572 flags.go:64] FLAG: --memory-manager-policy="None" Apr 19 12:09:46.268117 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264161 2572 
flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 19 12:09:46.268117 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264164 2572 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 19 12:09:46.268117 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264167 2572 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 19 12:09:46.268117 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264170 2572 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 19 12:09:46.268117 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264177 2572 flags.go:64] FLAG: --node-status-max-images="50" Apr 19 12:09:46.268117 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264180 2572 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 19 12:09:46.268117 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264183 2572 flags.go:64] FLAG: --oom-score-adj="-999" Apr 19 12:09:46.268117 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264187 2572 flags.go:64] FLAG: --pod-cidr="" Apr 19 12:09:46.268117 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264190 2572 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 19 12:09:46.268117 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264195 2572 flags.go:64] FLAG: --pod-manifest-path="" Apr 19 12:09:46.268117 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264198 2572 flags.go:64] FLAG: --pod-max-pids="-1" Apr 19 12:09:46.268117 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264201 2572 flags.go:64] FLAG: --pods-per-core="0" Apr 19 12:09:46.268675 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264204 2572 flags.go:64] FLAG: --port="10250" Apr 19 12:09:46.268675 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264207 2572 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 19 12:09:46.268675 ip-10-0-140-237 kubenswrapper[2572]: 
I0419 12:09:46.264210 2572 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-076c09849b9104fd2" Apr 19 12:09:46.268675 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264213 2572 flags.go:64] FLAG: --qos-reserved="" Apr 19 12:09:46.268675 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264217 2572 flags.go:64] FLAG: --read-only-port="10255" Apr 19 12:09:46.268675 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264219 2572 flags.go:64] FLAG: --register-node="true" Apr 19 12:09:46.268675 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264222 2572 flags.go:64] FLAG: --register-schedulable="true" Apr 19 12:09:46.268675 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264225 2572 flags.go:64] FLAG: --register-with-taints="" Apr 19 12:09:46.268675 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264229 2572 flags.go:64] FLAG: --registry-burst="10" Apr 19 12:09:46.268675 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264232 2572 flags.go:64] FLAG: --registry-qps="5" Apr 19 12:09:46.268675 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264235 2572 flags.go:64] FLAG: --reserved-cpus="" Apr 19 12:09:46.268675 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264238 2572 flags.go:64] FLAG: --reserved-memory="" Apr 19 12:09:46.268675 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264242 2572 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 19 12:09:46.268675 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264245 2572 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 19 12:09:46.268675 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264248 2572 flags.go:64] FLAG: --rotate-certificates="false" Apr 19 12:09:46.268675 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264251 2572 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 19 12:09:46.268675 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264254 2572 flags.go:64] FLAG: --runonce="false" Apr 19 12:09:46.268675 ip-10-0-140-237 kubenswrapper[2572]: 
I0419 12:09:46.264257 2572 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 19 12:09:46.268675 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264260 2572 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 19 12:09:46.268675 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264263 2572 flags.go:64] FLAG: --seccomp-default="false" Apr 19 12:09:46.268675 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264266 2572 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 19 12:09:46.268675 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264269 2572 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 19 12:09:46.268675 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264272 2572 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 19 12:09:46.268675 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264275 2572 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 19 12:09:46.268675 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264278 2572 flags.go:64] FLAG: --storage-driver-password="root" Apr 19 12:09:46.268675 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264281 2572 flags.go:64] FLAG: --storage-driver-secure="false" Apr 19 12:09:46.269313 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264284 2572 flags.go:64] FLAG: --storage-driver-table="stats" Apr 19 12:09:46.269313 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264287 2572 flags.go:64] FLAG: --storage-driver-user="root" Apr 19 12:09:46.269313 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264290 2572 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 19 12:09:46.269313 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264294 2572 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 19 12:09:46.269313 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264296 2572 flags.go:64] FLAG: --system-cgroups="" Apr 19 12:09:46.269313 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264299 2572 flags.go:64] FLAG: 
--system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 19 12:09:46.269313 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264305 2572 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 19 12:09:46.269313 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264308 2572 flags.go:64] FLAG: --tls-cert-file="" Apr 19 12:09:46.269313 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264310 2572 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 19 12:09:46.269313 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264318 2572 flags.go:64] FLAG: --tls-min-version="" Apr 19 12:09:46.269313 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264321 2572 flags.go:64] FLAG: --tls-private-key-file="" Apr 19 12:09:46.269313 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264324 2572 flags.go:64] FLAG: --topology-manager-policy="none" Apr 19 12:09:46.269313 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264327 2572 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 19 12:09:46.269313 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264330 2572 flags.go:64] FLAG: --topology-manager-scope="container" Apr 19 12:09:46.269313 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264333 2572 flags.go:64] FLAG: --v="2" Apr 19 12:09:46.269313 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264337 2572 flags.go:64] FLAG: --version="false" Apr 19 12:09:46.269313 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264341 2572 flags.go:64] FLAG: --vmodule="" Apr 19 12:09:46.269313 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264346 2572 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 19 12:09:46.269313 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.264349 2572 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 19 12:09:46.269313 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264443 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 19 12:09:46.269313 ip-10-0-140-237 
kubenswrapper[2572]: W0419 12:09:46.264447 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 19 12:09:46.269313 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264451 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 19 12:09:46.269313 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264454 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 19 12:09:46.269313 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264456 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 19 12:09:46.269907 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264459 2572 feature_gate.go:328] unrecognized feature gate: Example2 Apr 19 12:09:46.269907 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264462 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 19 12:09:46.269907 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264465 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 19 12:09:46.269907 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264468 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 19 12:09:46.269907 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264470 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 19 12:09:46.269907 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264473 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 19 12:09:46.269907 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264475 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 19 12:09:46.269907 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264478 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 19 12:09:46.269907 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264481 2572 feature_gate.go:328] unrecognized feature gate: 
DNSNameResolver Apr 19 12:09:46.269907 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264483 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 19 12:09:46.269907 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264486 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 19 12:09:46.269907 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264491 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 19 12:09:46.269907 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264494 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 19 12:09:46.269907 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264497 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 19 12:09:46.269907 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264500 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 19 12:09:46.269907 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264503 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 19 12:09:46.269907 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264505 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 19 12:09:46.269907 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264509 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 19 12:09:46.269907 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264512 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 19 12:09:46.269907 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264514 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 19 12:09:46.270464 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264517 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 19 
12:09:46.270464 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264519 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 19 12:09:46.270464 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264522 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 19 12:09:46.270464 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264525 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 19 12:09:46.270464 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264527 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 19 12:09:46.270464 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264530 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 19 12:09:46.270464 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264532 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 19 12:09:46.270464 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264535 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 19 12:09:46.270464 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264540 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 19 12:09:46.270464 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264542 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 19 12:09:46.270464 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264545 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 19 12:09:46.270464 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264547 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 19 12:09:46.270464 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264550 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 19 12:09:46.270464 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264552 2572 feature_gate.go:328] unrecognized 
feature gate: SigstoreImageVerificationPKI Apr 19 12:09:46.270464 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264555 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 19 12:09:46.270464 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264558 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 19 12:09:46.270464 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264560 2572 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 19 12:09:46.270464 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264563 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 19 12:09:46.270464 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264565 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 19 12:09:46.270464 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264568 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 19 12:09:46.270980 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264570 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 19 12:09:46.270980 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264573 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 19 12:09:46.270980 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264575 2572 feature_gate.go:328] unrecognized feature gate: Example Apr 19 12:09:46.270980 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264578 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 19 12:09:46.270980 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264581 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 19 12:09:46.270980 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264583 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 19 12:09:46.270980 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264586 
2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 19 12:09:46.270980 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264588 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 19 12:09:46.270980 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264591 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 19 12:09:46.270980 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264595 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 19 12:09:46.270980 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264597 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 19 12:09:46.270980 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264600 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 19 12:09:46.270980 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264603 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 19 12:09:46.270980 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264605 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 19 12:09:46.270980 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264608 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 19 12:09:46.270980 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264610 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 19 12:09:46.270980 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264613 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 19 12:09:46.270980 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264615 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 19 12:09:46.270980 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264618 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 19 12:09:46.270980 
ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264621 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 19 12:09:46.271499 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264628 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 19 12:09:46.271499 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264630 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 19 12:09:46.271499 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264633 2572 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 19 12:09:46.271499 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264635 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 19 12:09:46.271499 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264638 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 19 12:09:46.271499 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264641 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 19 12:09:46.271499 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264643 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 19 12:09:46.271499 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264646 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 19 12:09:46.271499 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264649 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 19 12:09:46.271499 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264651 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 19 12:09:46.271499 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264655 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 19 12:09:46.271499 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264658 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 12:09:46.271499 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264661 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 12:09:46.271499 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264664 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 19 12:09:46.271499 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264667 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 12:09:46.271499 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264669 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 12:09:46.271499 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264672 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 19 12:09:46.271499 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264675 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 12:09:46.271499 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264678 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 12:09:46.271991 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264681 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 12:09:46.271991 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.264683 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 19 12:09:46.271991 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.265187 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 19 12:09:46.272211 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.272193 2572 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 19 12:09:46.272243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.272213 2572 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 19 12:09:46.272283 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272274 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 19 12:09:46.272283 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272283 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 12:09:46.272336 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272287 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 19 12:09:46.272336 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272290 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 19 12:09:46.272336 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272293 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 19 12:09:46.272336 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272296 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 12:09:46.272336 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272299 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 12:09:46.272336 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272301 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 19 12:09:46.272336 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272304 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 19 12:09:46.272336 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272307 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 12:09:46.272336 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272310 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 19 12:09:46.272336 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272312 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 19 12:09:46.272336 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272315 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 19 12:09:46.272336 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272317 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 12:09:46.272336 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272320 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 19 12:09:46.272336 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272322 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 12:09:46.272336 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272325 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 12:09:46.272336 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272327 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 19 12:09:46.272336 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272330 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 19 12:09:46.272336 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272332 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 19 12:09:46.272336 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272335 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 19 12:09:46.272336 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272337 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 12:09:46.272854 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272340 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 19 12:09:46.272854 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272344 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 19 12:09:46.272854 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272348 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 19 12:09:46.272854 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272351 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 12:09:46.272854 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272354 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 19 12:09:46.272854 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272357 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 12:09:46.272854 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272359 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 12:09:46.272854 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272362 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 19 12:09:46.272854 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272365 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 19 12:09:46.272854 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272369 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 12:09:46.272854 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272372 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 12:09:46.272854 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272375 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 19 12:09:46.272854 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272379 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 19 12:09:46.272854 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272383 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 12:09:46.272854 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272386 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 12:09:46.272854 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272388 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 12:09:46.272854 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272391 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 19 12:09:46.272854 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272393 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 12:09:46.273293 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272396 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 19 12:09:46.273293 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272399 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 12:09:46.273293 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272401 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 19 12:09:46.273293 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272404 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 12:09:46.273293 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272407 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 12:09:46.273293 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272409 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 19 12:09:46.273293 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272412 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 12:09:46.273293 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272414 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 12:09:46.273293 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272417 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 19 12:09:46.273293 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272420 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 19 12:09:46.273293 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272422 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 19 12:09:46.273293 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272424 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 19 12:09:46.273293 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272427 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 19 12:09:46.273293 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272430 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 19 12:09:46.273293 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272432 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 19 12:09:46.273293 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272435 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 12:09:46.273293 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272439 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 12:09:46.273293 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272441 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 12:09:46.273293 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272444 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 19 12:09:46.273293 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272447 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 12:09:46.273871 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272449 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 19 12:09:46.273871 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272452 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 12:09:46.273871 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272455 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 19 12:09:46.273871 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272457 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 12:09:46.273871 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272460 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 12:09:46.273871 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272463 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 12:09:46.273871 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272466 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 12:09:46.273871 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272468 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19 12:09:46.273871 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272471 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 12:09:46.273871 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272474 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 12:09:46.273871 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272477 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 19 12:09:46.273871 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272480 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 12:09:46.273871 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272482 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 12:09:46.273871 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272485 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 12:09:46.273871 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272487 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 12:09:46.273871 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272490 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 19 12:09:46.273871 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272492 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 12:09:46.273871 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272495 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 12:09:46.273871 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272497 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 12:09:46.273871 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272500 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 19 12:09:46.274357 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272502 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 12:09:46.274357 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272505 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 12:09:46.274357 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272507 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 19 12:09:46.274357 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272510 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 19 12:09:46.274357 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272512 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 12:09:46.274357 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272515 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 19 12:09:46.274357 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.272520 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 19 12:09:46.274357 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272620 2572 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 19 12:09:46.274357 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272626 2572 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 19 12:09:46.274357 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272629 2572 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 19 12:09:46.274357 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272633 2572 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 19 12:09:46.274357 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272636 2572 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 19 12:09:46.274357 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272639 2572 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 19 12:09:46.274357 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272642 2572 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 19 12:09:46.274357 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272645 2572 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 19 12:09:46.274733 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272648 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 19 12:09:46.274733 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272650 2572 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 19 12:09:46.274733 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272654 2572 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 19 12:09:46.274733 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272656 2572 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 19 12:09:46.274733 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272659 2572 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 19 12:09:46.274733 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272662 2572 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 19 12:09:46.274733 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272664 2572 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 19 12:09:46.274733 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272667 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 19 12:09:46.274733 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272669 2572 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 19 12:09:46.274733 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272672 2572 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 19 12:09:46.274733 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272675 2572 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 19 12:09:46.274733 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272677 2572 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 19 12:09:46.274733 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272680 2572 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 19 12:09:46.274733 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272682 2572 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 19 12:09:46.274733 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272685 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 19 12:09:46.274733 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272688 2572 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 19 12:09:46.274733 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272690 2572 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 19 12:09:46.274733 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272693 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 19 12:09:46.274733 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272695 2572 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 19 12:09:46.274733 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272698 2572 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 19 12:09:46.275231 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272701 2572 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 19 12:09:46.275231 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272703 2572 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 19 12:09:46.275231 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272706 2572 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 19 12:09:46.275231 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272710 2572 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 19 12:09:46.275231 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272714 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 19 12:09:46.275231 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272718 2572 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 19 12:09:46.275231 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272721 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 19 12:09:46.275231 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272724 2572 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 19 12:09:46.275231 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272726 2572 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 19 12:09:46.275231 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272729 2572 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 19 12:09:46.275231 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272731 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 19 12:09:46.275231 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272734 2572 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 19 12:09:46.275231 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272737 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 19 12:09:46.275231 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272739 2572 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 19 12:09:46.275231 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272742 2572 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 19 12:09:46.275231 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272745 2572 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 19 12:09:46.275231 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272747 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 19 12:09:46.275231 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272750 2572 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 19 12:09:46.275231 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272752 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 19 12:09:46.275694 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272755 2572 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 19 12:09:46.275694 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272784 2572 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 19 12:09:46.275694 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272787 2572 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 19 12:09:46.275694 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272790 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 19 12:09:46.275694 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272792 2572 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 19 12:09:46.275694 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272795 2572 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 19 12:09:46.275694 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272798 2572 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 19 12:09:46.275694 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272800 2572 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 19 12:09:46.275694 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272803 2572 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 19 12:09:46.275694 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272805 2572 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 19 12:09:46.275694 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272808 2572 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 19 12:09:46.275694 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272810 2572 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 19 12:09:46.275694 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272813 2572 feature_gate.go:328] unrecognized feature gate: Example
Apr 19 12:09:46.275694 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272816 2572 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 19 12:09:46.275694 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272819 2572 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 19 12:09:46.275694 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272821 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 19 12:09:46.275694 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272825 2572 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 19 12:09:46.275694 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272827 2572 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 19 12:09:46.275694 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272830 2572 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 19 12:09:46.275694 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272832 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 19 12:09:46.276183 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272835 2572 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 19 12:09:46.276183 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272838 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 19 12:09:46.276183 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272841 2572 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 19 12:09:46.276183 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272843 2572 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 19 12:09:46.276183 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272846 2572 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 19 12:09:46.276183 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272848 2572 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 19 12:09:46.276183 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272851 2572 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 19 12:09:46.276183 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272853 2572 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 19 12:09:46.276183 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272856 2572 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 19 12:09:46.276183 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272859 2572 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 19 12:09:46.276183 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272862 2572 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 19 12:09:46.276183 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272864 2572 feature_gate.go:328] unrecognized feature gate: Example2
Apr 19 12:09:46.276183 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272867 2572 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 19 12:09:46.276183 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272870 2572 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 19 12:09:46.276183 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272872 2572 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 19 12:09:46.276183 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272875 2572 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 19 12:09:46.276183 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272877 2572 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 19 12:09:46.276183 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272880 2572 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 19 12:09:46.276183 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:46.272882 2572 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 19 12:09:46.276647 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.272887 2572 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 19 12:09:46.276647 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.274319 2572 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 19 12:09:46.276647 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.276166 2572 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 19 12:09:46.277062 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.277043 2572 server.go:1019] "Starting client certificate rotation"
Apr 19 12:09:46.277178 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.277159 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 19 12:09:46.277224 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.277215 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 19 12:09:46.300175 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.300157 2572 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 19 12:09:46.302973 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.302927 2572 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 19 12:09:46.318852 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.318831 2572 log.go:25] "Validated CRI v1 runtime API"
Apr 19 12:09:46.325846 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.325830 2572 log.go:25] "Validated CRI v1 image API"
Apr 19 12:09:46.327020 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.327000 2572 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 19 12:09:46.329542 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.329522 2572 fs.go:135] Filesystem UUIDs: map[431c04fa-1222-4418-a35d-eb61a5048ad7:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 dfff33ee-c3c5-495d-9577-d202917717e8:/dev/nvme0n1p3]
Apr 19 12:09:46.329597 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.329542 2572 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 19 12:09:46.335352 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.335242 2572 manager.go:217] Machine: {Timestamp:2026-04-19 12:09:46.333220196 +0000 UTC m=+0.400392078 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3100443 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec200d79a04c6a1cd1383b249929883f SystemUUID:ec200d79-a04c-6a1c-d138-3b249929883f BootID:3d0973cb-809a-4b4a-9113-5543c99d0382 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:63:5a:8b:bd:9b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:63:5a:8b:bd:9b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:62:c3:1d:90:2e:91 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 19 12:09:46.335352 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.335347 2572 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 19 12:09:46.335470 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.335444 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 19 12:09:46.335470 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.335466 2572 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 19 12:09:46.337723 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.337698 2572 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 19 12:09:46.337875 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.337726 2572 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-140-237.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Perce
ntage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 19 12:09:46.337920 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.337885 2572 topology_manager.go:138] "Creating topology manager with none policy" Apr 19 12:09:46.337920 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.337894 2572 container_manager_linux.go:306] "Creating device plugin manager" Apr 19 12:09:46.337920 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.337912 2572 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 19 12:09:46.339482 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.339471 2572 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 19 12:09:46.340820 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.340810 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 19 12:09:46.341105 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.341095 2572 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 19 12:09:46.343492 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.343483 2572 kubelet.go:491] "Attempting to sync node with API server" Apr 19 12:09:46.343532 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.343497 2572 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 19 12:09:46.343532 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.343510 2572 file.go:69] "Watching path" 
path="/etc/kubernetes/manifests" Apr 19 12:09:46.343532 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.343522 2572 kubelet.go:397] "Adding apiserver pod source" Apr 19 12:09:46.343532 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.343530 2572 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 19 12:09:46.345183 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.345171 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 19 12:09:46.345224 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.345189 2572 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 19 12:09:46.349000 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.348978 2572 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 19 12:09:46.350942 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.350923 2572 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 19 12:09:46.352248 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.352232 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 19 12:09:46.352248 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.352252 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 19 12:09:46.352392 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.352259 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 19 12:09:46.352392 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.352264 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 19 12:09:46.352392 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.352270 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 19 
12:09:46.352392 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.352275 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 19 12:09:46.352392 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.352281 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 19 12:09:46.352392 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.352286 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 19 12:09:46.352392 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.352293 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 19 12:09:46.352392 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.352299 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 19 12:09:46.352392 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.352308 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 19 12:09:46.352392 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.352317 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 19 12:09:46.352692 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.352435 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-66rg5" Apr 19 12:09:46.353238 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.353227 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 19 12:09:46.353269 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.353239 2572 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 19 12:09:46.354215 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:46.354175 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-140-237.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" 
logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 19 12:09:46.354312 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:46.354239 2572 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 19 12:09:46.355702 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.355687 2572 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-140-237.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 19 12:09:46.357040 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.356906 2572 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 19 12:09:46.357086 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.357064 2572 server.go:1295] "Started kubelet" Apr 19 12:09:46.357176 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.357150 2572 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 19 12:09:46.357208 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.357157 2572 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 19 12:09:46.357245 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.357225 2572 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 19 12:09:46.357903 ip-10-0-140-237 systemd[1]: Started Kubernetes Kubelet. 
Apr 19 12:09:46.359196 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.359179 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-66rg5" Apr 19 12:09:46.359663 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.359651 2572 server.go:317] "Adding debug handlers to kubelet server" Apr 19 12:09:46.361852 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.361835 2572 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 19 12:09:46.366013 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:46.365994 2572 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 19 12:09:46.366288 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.366275 2572 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 19 12:09:46.366775 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.366736 2572 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 19 12:09:46.367351 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.367338 2572 factory.go:55] Registering systemd factory Apr 19 12:09:46.367411 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.367382 2572 factory.go:223] Registration of the systemd container factory successfully Apr 19 12:09:46.367592 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.367579 2572 factory.go:153] Registering CRI-O factory Apr 19 12:09:46.367592 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.367593 2572 factory.go:223] Registration of the crio container factory successfully Apr 19 12:09:46.367676 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.367661 2572 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix 
/run/containerd/containerd.sock: connect: no such file or directory Apr 19 12:09:46.367709 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.367681 2572 factory.go:103] Registering Raw factory Apr 19 12:09:46.367709 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.367697 2572 manager.go:1196] Started watching for new ooms in manager Apr 19 12:09:46.368082 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.368070 2572 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 19 12:09:46.368121 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.368070 2572 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 19 12:09:46.368121 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.368094 2572 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 19 12:09:46.368171 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.368152 2572 manager.go:319] Starting recovery of all containers Apr 19 12:09:46.368201 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.368173 2572 reconstruct.go:97] "Volume reconstruction finished" Apr 19 12:09:46.368201 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.368182 2572 reconciler.go:26] "Reconciler: start to sync state" Apr 19 12:09:46.368314 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:46.368296 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-237.ec2.internal\" not found" Apr 19 12:09:46.368465 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.368447 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 19 12:09:46.371792 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:46.371729 2572 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-140-237.ec2.internal\" not found" node="ip-10-0-140-237.ec2.internal" Apr 19 12:09:46.379429 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.379412 2572 
manager.go:324] Recovery completed Apr 19 12:09:46.383357 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.383345 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 12:09:46.385723 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.385707 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-237.ec2.internal" event="NodeHasSufficientMemory" Apr 19 12:09:46.385810 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.385735 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-237.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 12:09:46.385810 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.385746 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-237.ec2.internal" event="NodeHasSufficientPID" Apr 19 12:09:46.386206 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.386191 2572 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 19 12:09:46.386206 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.386205 2572 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 19 12:09:46.386269 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.386220 2572 state_mem.go:36] "Initialized new in-memory state store" Apr 19 12:09:46.388571 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.388558 2572 policy_none.go:49] "None policy: Start" Apr 19 12:09:46.388630 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.388574 2572 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 19 12:09:46.388630 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.388584 2572 state_mem.go:35] "Initializing new in-memory state store" Apr 19 12:09:46.419361 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.419342 2572 manager.go:341] "Starting Device Plugin manager" Apr 19 12:09:46.436529 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:46.419413 2572 manager.go:517] "Failed to read data from checkpoint" 
err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 19 12:09:46.436529 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.419427 2572 server.go:85] "Starting device plugin registration server" Apr 19 12:09:46.436529 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.419690 2572 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 19 12:09:46.436529 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.419702 2572 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 19 12:09:46.436529 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.419822 2572 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 19 12:09:46.436529 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.419896 2572 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 19 12:09:46.436529 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.419905 2572 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 19 12:09:46.436529 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:46.420394 2572 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 19 12:09:46.436529 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:46.420440 2572 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-140-237.ec2.internal\" not found" Apr 19 12:09:46.491460 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.491423 2572 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 19 12:09:46.492634 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.492615 2572 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 19 12:09:46.492746 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.492642 2572 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 19 12:09:46.492746 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.492660 2572 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 19 12:09:46.492746 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.492667 2572 kubelet.go:2451] "Starting kubelet main sync loop" Apr 19 12:09:46.492746 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:46.492707 2572 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 19 12:09:46.495366 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.495349 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 19 12:09:46.520366 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.520298 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 12:09:46.521476 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.521461 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-237.ec2.internal" event="NodeHasSufficientMemory" Apr 19 12:09:46.521579 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.521495 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-237.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 12:09:46.521579 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.521510 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-237.ec2.internal" event="NodeHasSufficientPID" Apr 19 12:09:46.521579 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.521540 2572 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-140-237.ec2.internal" Apr 19 12:09:46.530063 ip-10-0-140-237 kubenswrapper[2572]: I0419 
12:09:46.530045 2572 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-140-237.ec2.internal" Apr 19 12:09:46.530157 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:46.530073 2572 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-140-237.ec2.internal\": node \"ip-10-0-140-237.ec2.internal\" not found" Apr 19 12:09:46.540005 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:46.539983 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-237.ec2.internal\" not found" Apr 19 12:09:46.593166 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.593132 2572 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-140-237.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-237.ec2.internal"] Apr 19 12:09:46.593259 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.593227 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 12:09:46.596140 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.596125 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-237.ec2.internal" event="NodeHasSufficientMemory" Apr 19 12:09:46.596210 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.596154 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-237.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 12:09:46.596210 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.596164 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-237.ec2.internal" event="NodeHasSufficientPID" Apr 19 12:09:46.597714 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.597702 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 12:09:46.597868 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.597852 2572 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-237.ec2.internal" Apr 19 12:09:46.597927 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.597882 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 12:09:46.598412 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.598394 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-237.ec2.internal" event="NodeHasSufficientMemory" Apr 19 12:09:46.598490 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.598395 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-237.ec2.internal" event="NodeHasSufficientMemory" Apr 19 12:09:46.598490 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.598443 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-237.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 12:09:46.598490 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.598458 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-237.ec2.internal" event="NodeHasSufficientPID" Apr 19 12:09:46.598490 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.598422 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-237.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 12:09:46.598666 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.598501 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-237.ec2.internal" event="NodeHasSufficientPID" Apr 19 12:09:46.599966 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.599948 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-237.ec2.internal" Apr 19 12:09:46.600055 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.599979 2572 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 19 12:09:46.600681 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.600667 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-237.ec2.internal" event="NodeHasSufficientMemory" Apr 19 12:09:46.600769 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.600694 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-237.ec2.internal" event="NodeHasNoDiskPressure" Apr 19 12:09:46.600769 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.600709 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-237.ec2.internal" event="NodeHasSufficientPID" Apr 19 12:09:46.616323 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:46.616304 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-237.ec2.internal\" not found" node="ip-10-0-140-237.ec2.internal" Apr 19 12:09:46.620205 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:46.620189 2572 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-140-237.ec2.internal\" not found" node="ip-10-0-140-237.ec2.internal" Apr 19 12:09:46.640049 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:46.640031 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-237.ec2.internal\" not found" Apr 19 12:09:46.669830 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.669802 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/071de6a6b9846878c25eec7055bc6997-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-237.ec2.internal\" (UID: \"071de6a6b9846878c25eec7055bc6997\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-237.ec2.internal" Apr 19 12:09:46.669932 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.669832 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5c7cf47197c65590d0f81e01fc4711d8-config\") pod \"kube-apiserver-proxy-ip-10-0-140-237.ec2.internal\" (UID: \"5c7cf47197c65590d0f81e01fc4711d8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-237.ec2.internal" Apr 19 12:09:46.669932 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.669852 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/071de6a6b9846878c25eec7055bc6997-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-237.ec2.internal\" (UID: \"071de6a6b9846878c25eec7055bc6997\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-237.ec2.internal" Apr 19 12:09:46.740417 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:46.740387 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-237.ec2.internal\" not found" Apr 19 12:09:46.770918 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.770851 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5c7cf47197c65590d0f81e01fc4711d8-config\") pod \"kube-apiserver-proxy-ip-10-0-140-237.ec2.internal\" (UID: \"5c7cf47197c65590d0f81e01fc4711d8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-237.ec2.internal" Apr 19 12:09:46.770918 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.770898 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"etc-kube\" (UniqueName: \"kubernetes.io/host-path/071de6a6b9846878c25eec7055bc6997-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-237.ec2.internal\" (UID: \"071de6a6b9846878c25eec7055bc6997\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-237.ec2.internal" Apr 19 12:09:46.770918 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.770914 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/071de6a6b9846878c25eec7055bc6997-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-237.ec2.internal\" (UID: \"071de6a6b9846878c25eec7055bc6997\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-237.ec2.internal" Apr 19 12:09:46.771070 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.770952 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5c7cf47197c65590d0f81e01fc4711d8-config\") pod \"kube-apiserver-proxy-ip-10-0-140-237.ec2.internal\" (UID: \"5c7cf47197c65590d0f81e01fc4711d8\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-140-237.ec2.internal" Apr 19 12:09:46.771070 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.770966 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/071de6a6b9846878c25eec7055bc6997-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-140-237.ec2.internal\" (UID: \"071de6a6b9846878c25eec7055bc6997\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-237.ec2.internal" Apr 19 12:09:46.771070 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.770990 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/071de6a6b9846878c25eec7055bc6997-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-140-237.ec2.internal\" (UID: \"071de6a6b9846878c25eec7055bc6997\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-237.ec2.internal"
Apr 19 12:09:46.841153 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:46.841109 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-237.ec2.internal\" not found"
Apr 19 12:09:46.917869 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.917842 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-237.ec2.internal"
Apr 19 12:09:46.921643 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:46.921627 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-237.ec2.internal"
Apr 19 12:09:46.941429 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:46.941392 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-237.ec2.internal\" not found"
Apr 19 12:09:47.041963 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:47.041869 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-237.ec2.internal\" not found"
Apr 19 12:09:47.142408 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:47.142376 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-237.ec2.internal\" not found"
Apr 19 12:09:47.243050 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:47.243012 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-237.ec2.internal\" not found"
Apr 19 12:09:47.277562 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:47.277541 2572 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 19 12:09:47.278205 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:47.277693 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 19 12:09:47.278205 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:47.277700 2572 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 19 12:09:47.343448 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:47.343391 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-237.ec2.internal\" not found"
Apr 19 12:09:47.354304 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:47.354282 2572 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 19 12:09:47.362989 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:47.362963 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-18 12:04:46 +0000 UTC" deadline="2028-02-01 18:57:42.994937304 +0000 UTC"
Apr 19 12:09:47.362989 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:47.362985 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15678h47m55.63195438s"
Apr 19 12:09:47.366409 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:47.366393 2572 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 19 12:09:47.375145 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:47.375127 2572 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 19 12:09:47.394410 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:47.394376 2572 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-np7kc"
Apr 19 12:09:47.401566 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:47.401546 2572 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-np7kc"
Apr 19 12:09:47.444169 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:47.444137 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-237.ec2.internal\" not found"
Apr 19 12:09:47.497925 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:47.497896 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c7cf47197c65590d0f81e01fc4711d8.slice/crio-92d2b81c5bb8526af8dd73fc6b105b8293f2f1fdaa474182a14b9d94aad6c5f5 WatchSource:0}: Error finding container 92d2b81c5bb8526af8dd73fc6b105b8293f2f1fdaa474182a14b9d94aad6c5f5: Status 404 returned error can't find the container with id 92d2b81c5bb8526af8dd73fc6b105b8293f2f1fdaa474182a14b9d94aad6c5f5
Apr 19 12:09:47.498254 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:47.498231 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod071de6a6b9846878c25eec7055bc6997.slice/crio-e193bd52285d34915cb3f58bf758602fb470fe0e4ce47941c0064d92ecf80f35 WatchSource:0}: Error finding container e193bd52285d34915cb3f58bf758602fb470fe0e4ce47941c0064d92ecf80f35: Status 404 returned error can't find the container with id e193bd52285d34915cb3f58bf758602fb470fe0e4ce47941c0064d92ecf80f35
Apr 19 12:09:47.502657 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:47.502643 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 19 12:09:47.545050 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:47.545020 2572 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-140-237.ec2.internal\" not found"
Apr 19 12:09:47.574062 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:47.574041 2572 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 19 12:09:47.667155 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:47.667081 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-237.ec2.internal"
Apr 19 12:09:47.677388 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:47.677368 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 19 12:09:47.678312 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:47.678297 2572 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-237.ec2.internal"
Apr 19 12:09:47.688345 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:47.688330 2572 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 19 12:09:48.344975 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.344943 2572 apiserver.go:52] "Watching apiserver"
Apr 19 12:09:48.351690 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.351664 2572 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 19 12:09:48.352815 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.352792 2572 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 19 12:09:48.353186 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.353163 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/kube-apiserver-proxy-ip-10-0-140-237.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp","openshift-cluster-node-tuning-operator/tuned-ls4sq","openshift-image-registry/node-ca-mdl6z","openshift-multus/multus-cpvgj","openshift-multus/network-metrics-daemon-67dv9","openshift-network-diagnostics/network-check-target-2v66h","openshift-network-operator/iptables-alerter-qbr6x","kube-system/konnectivity-agent-wbtqf","openshift-dns/node-resolver-m897s","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-237.ec2.internal","openshift-multus/multus-additional-cni-plugins-sqxd4","openshift-ovn-kubernetes/ovnkube-node-mfzf5"]
Apr 19 12:09:48.355650 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.355629 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2v66h"
Apr 19 12:09:48.355782 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:48.355705 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2v66h" podUID="29c3436d-ad1d-4386-a9d8-894a3c87dbfc"
Apr 19 12:09:48.357880 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.357855 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp"
Apr 19 12:09:48.360033 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.360010 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ls4sq"
Apr 19 12:09:48.360876 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.360662 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 19 12:09:48.360876 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.360680 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-9s8vz\""
Apr 19 12:09:48.361106 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.360944 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 19 12:09:48.361232 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.361204 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 19 12:09:48.362227 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.362207 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mdl6z"
Apr 19 12:09:48.362787 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.362751 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 19 12:09:48.362967 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.362942 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 19 12:09:48.365157 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.362971 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-qf68d\""
Apr 19 12:09:48.365157 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.364711 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-59bnb\""
Apr 19 12:09:48.365157 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.364908 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 19 12:09:48.365157 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.365009 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 19 12:09:48.365157 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.365095 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 19 12:09:48.367080 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.367058 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cpvgj"
Apr 19 12:09:48.369120 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.369099 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-67dv9"
Apr 19 12:09:48.369214 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:48.369178 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-67dv9" podUID="2235efd3-6b65-47c7-acb5-eb9aa104beb5"
Apr 19 12:09:48.369281 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.369219 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qbr6x"
Apr 19 12:09:48.369489 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.369470 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 19 12:09:48.369697 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.369679 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 19 12:09:48.369795 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.369721 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-q8n5k\""
Apr 19 12:09:48.369795 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.369735 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 19 12:09:48.369795 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.369735 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 19 12:09:48.371441 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.371424 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-wbtqf"
Apr 19 12:09:48.371692 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.371673 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 19 12:09:48.371800 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.371783 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-t5cz6\""
Apr 19 12:09:48.371919 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.371896 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 19 12:09:48.371919 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.371914 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 19 12:09:48.373451 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.373430 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 19 12:09:48.373610 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.373589 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-66msp\""
Apr 19 12:09:48.373688 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.373649 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-m897s"
Apr 19 12:09:48.374089 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.374069 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 19 12:09:48.375827 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.375811 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 19 12:09:48.375914 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.375831 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 19 12:09:48.375914 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.375887 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-cntlb\""
Apr 19 12:09:48.377538 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.377519 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sqxd4"
Apr 19 12:09:48.377868 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.377848 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-host-run-k8s-cni-cncf-io\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj"
Apr 19 12:09:48.377971 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.377883 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/afd1b409-8b1c-4984-996b-e66960d52ffc-multus-daemon-config\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj"
Apr 19 12:09:48.377971 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.377909 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-etc-kubernetes\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj"
Apr 19 12:09:48.377971 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.377954 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-run\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq"
Apr 19 12:09:48.378107 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.377986 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jccmz\" (UniqueName: \"kubernetes.io/projected/5982f902-227e-4317-9969-a90aa85d5e40-kube-api-access-jccmz\") pod \"node-ca-mdl6z\" (UID: \"5982f902-227e-4317-9969-a90aa85d5e40\") " pod="openshift-image-registry/node-ca-mdl6z"
Apr 19 12:09:48.378107 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378012 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a069367b-8d8c-414b-b7f7-af6d4738d3ba-kubelet-dir\") pod \"aws-ebs-csi-driver-node-x62gp\" (UID: \"a069367b-8d8c-414b-b7f7-af6d4738d3ba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp"
Apr 19 12:09:48.378107 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378065 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-host-var-lib-cni-multus\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj"
Apr 19 12:09:48.378107 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378083 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-etc-sysctl-conf\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq"
Apr 19 12:09:48.378107 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378098 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5982f902-227e-4317-9969-a90aa85d5e40-serviceca\") pod \"node-ca-mdl6z\" (UID: \"5982f902-227e-4317-9969-a90aa85d5e40\") " pod="openshift-image-registry/node-ca-mdl6z"
Apr 19 12:09:48.378299 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378112 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-multus-cni-dir\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj"
Apr 19 12:09:48.378299 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378126 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-cnibin\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj"
Apr 19 12:09:48.378299 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378141 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-host-var-lib-cni-bin\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj"
Apr 19 12:09:48.378299 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378164 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-host-run-multus-certs\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj"
Apr 19 12:09:48.378299 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378211 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-etc-sysconfig\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq"
Apr 19 12:09:48.378299 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378241 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-sys\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq"
Apr 19 12:09:48.378299 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378265 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-var-lib-kubelet\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq"
Apr 19 12:09:48.378299 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378291 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5982f902-227e-4317-9969-a90aa85d5e40-host\") pod \"node-ca-mdl6z\" (UID: \"5982f902-227e-4317-9969-a90aa85d5e40\") " pod="openshift-image-registry/node-ca-mdl6z"
Apr 19 12:09:48.378659 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378314 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-host-var-lib-kubelet\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj"
Apr 19 12:09:48.378659 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378332 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2235efd3-6b65-47c7-acb5-eb9aa104beb5-metrics-certs\") pod \"network-metrics-daemon-67dv9\" (UID: \"2235efd3-6b65-47c7-acb5-eb9aa104beb5\") " pod="openshift-multus/network-metrics-daemon-67dv9"
Apr 19 12:09:48.378659 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378347 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-etc-kubernetes\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq"
Apr 19 12:09:48.378659 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378366 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ce238b24-1370-4ed8-afca-1aa7573e56b7-etc-tuned\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq"
Apr 19 12:09:48.378659 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378397 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a069367b-8d8c-414b-b7f7-af6d4738d3ba-socket-dir\") pod \"aws-ebs-csi-driver-node-x62gp\" (UID: \"a069367b-8d8c-414b-b7f7-af6d4738d3ba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp"
Apr 19 12:09:48.378659 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378445 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a069367b-8d8c-414b-b7f7-af6d4738d3ba-registration-dir\") pod \"aws-ebs-csi-driver-node-x62gp\" (UID: \"a069367b-8d8c-414b-b7f7-af6d4738d3ba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp"
Apr 19 12:09:48.378659 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378480 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-multus-conf-dir\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj"
Apr 19 12:09:48.378659 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378505 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvwvd\" (UniqueName: \"kubernetes.io/projected/afd1b409-8b1c-4984-996b-e66960d52ffc-kube-api-access-xvwvd\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj"
Apr 19 12:09:48.378659 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378528 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6799f9ef-a53a-465c-ae04-e93d43acaea2-iptables-alerter-script\") pod \"iptables-alerter-qbr6x\" (UID: \"6799f9ef-a53a-465c-ae04-e93d43acaea2\") " pod="openshift-network-operator/iptables-alerter-qbr6x"
Apr 19 12:09:48.378659 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378546 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nfvh\" (UniqueName: \"kubernetes.io/projected/6799f9ef-a53a-465c-ae04-e93d43acaea2-kube-api-access-9nfvh\") pod \"iptables-alerter-qbr6x\" (UID: \"6799f9ef-a53a-465c-ae04-e93d43acaea2\") " pod="openshift-network-operator/iptables-alerter-qbr6x"
Apr 19 12:09:48.378659 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378579 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-lib-modules\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq"
Apr 19 12:09:48.378659 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378620 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a069367b-8d8c-414b-b7f7-af6d4738d3ba-device-dir\") pod \"aws-ebs-csi-driver-node-x62gp\" (UID: \"a069367b-8d8c-414b-b7f7-af6d4738d3ba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp"
Apr 19 12:09:48.378659 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378644 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a069367b-8d8c-414b-b7f7-af6d4738d3ba-sys-fs\") pod \"aws-ebs-csi-driver-node-x62gp\" (UID: \"a069367b-8d8c-414b-b7f7-af6d4738d3ba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp"
Apr 19 12:09:48.379207 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378671 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-system-cni-dir\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj"
Apr 19 12:09:48.379207 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378696 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvlxb\" (UniqueName: \"kubernetes.io/projected/2235efd3-6b65-47c7-acb5-eb9aa104beb5-kube-api-access-cvlxb\") pod \"network-metrics-daemon-67dv9\" (UID: \"2235efd3-6b65-47c7-acb5-eb9aa104beb5\") " pod="openshift-multus/network-metrics-daemon-67dv9"
Apr 19 12:09:48.379207 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378722 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67tn8\" (UniqueName: \"kubernetes.io/projected/29c3436d-ad1d-4386-a9d8-894a3c87dbfc-kube-api-access-67tn8\") pod \"network-check-target-2v66h\" (UID: \"29c3436d-ad1d-4386-a9d8-894a3c87dbfc\") " pod="openshift-network-diagnostics/network-check-target-2v66h"
Apr 19 12:09:48.379207 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378744 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-etc-sysctl-d\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq"
Apr 19 12:09:48.379207 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378783 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-etc-systemd\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq"
Apr 19 12:09:48.379207 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378807 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-os-release\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj"
Apr 19 12:09:48.379207 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378830 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-host-run-netns\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj"
Apr 19 12:09:48.379207 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378853 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-hostroot\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj"
Apr 19 12:09:48.379207 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378888 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6799f9ef-a53a-465c-ae04-e93d43acaea2-host-slash\") pod \"iptables-alerter-qbr6x\" (UID: \"6799f9ef-a53a-465c-ae04-e93d43acaea2\") " pod="openshift-network-operator/iptables-alerter-qbr6x"
Apr 19 12:09:48.379207 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378917 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-etc-modprobe-d\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq"
Apr 19 12:09:48.379207 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378933 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce238b24-1370-4ed8-afca-1aa7573e56b7-tmp\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq"
Apr 19 12:09:48.379207 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378948 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4c8z\" (UniqueName: \"kubernetes.io/projected/ce238b24-1370-4ed8-afca-1aa7573e56b7-kube-api-access-c4c8z\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq"
Apr 19 12:09:48.379207 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.378963 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a069367b-8d8c-414b-b7f7-af6d4738d3ba-etc-selinux\") pod \"aws-ebs-csi-driver-node-x62gp\" (UID: \"a069367b-8d8c-414b-b7f7-af6d4738d3ba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp"
Apr 19 12:09:48.379207 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.379004 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-multus-socket-dir-parent\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj"
Apr 19 12:09:48.379207 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.379066 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-host\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq"
Apr 19 12:09:48.379207 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.379092 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnf7s\" (UniqueName: \"kubernetes.io/projected/a069367b-8d8c-414b-b7f7-af6d4738d3ba-kube-api-access-wnf7s\") pod \"aws-ebs-csi-driver-node-x62gp\" (UID: \"a069367b-8d8c-414b-b7f7-af6d4738d3ba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp"
Apr 19 12:09:48.379875 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.379112 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/afd1b409-8b1c-4984-996b-e66960d52ffc-cni-binary-copy\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj"
Apr 19 12:09:48.379875 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.379597 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dnr6p\""
Apr 19 12:09:48.379967 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.379950 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 19 12:09:48.380002 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.379972 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 19 12:09:48.380039 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.379951 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.383446 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.383427 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-nd9dx\""
Apr 19 12:09:48.383535 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.383453 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 19 12:09:48.383535 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.383496 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 19 12:09:48.383824 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.383636 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 19 12:09:48.384009 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.383975 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 19 12:09:48.384009 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.384001 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 19 12:09:48.385485 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.385466 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 19 12:09:48.403022 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.402965 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-18 12:04:47 +0000 UTC" deadline="2027-11-17 22:14:46.168922049 +0000 UTC" Apr 19 12:09:48.403022 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.403017 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13858h4m57.76590794s" Apr 19 12:09:48.470047 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.470018 2572 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 19 12:09:48.479929 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.479900 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-cnibin\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.480067 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.479940 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-host-var-lib-cni-bin\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.480067 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.479985 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-host-var-lib-cni-bin\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.480067 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480035 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-cnibin\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.480067 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480060 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-etc-sysconfig\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.480272 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480081 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-var-lib-kubelet\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.480272 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480109 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7944a8e8-84c1-4ed9-81a3-27e357098d48-os-release\") pod \"multus-additional-cni-plugins-sqxd4\" (UID: \"7944a8e8-84c1-4ed9-81a3-27e357098d48\") " pod="openshift-multus/multus-additional-cni-plugins-sqxd4" Apr 19 12:09:48.480272 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480148 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-host-slash\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" Apr 19 12:09:48.480272 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480150 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-etc-sysconfig\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.480272 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480165 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-host-cni-netd\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" Apr 19 12:09:48.480272 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480181 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-var-lib-kubelet\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.480272 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480190 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-host-var-lib-kubelet\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.480272 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480224 2572 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-host-var-lib-kubelet\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.480272 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480232 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2235efd3-6b65-47c7-acb5-eb9aa104beb5-metrics-certs\") pod \"network-metrics-daemon-67dv9\" (UID: \"2235efd3-6b65-47c7-acb5-eb9aa104beb5\") " pod="openshift-multus/network-metrics-daemon-67dv9" Apr 19 12:09:48.480699 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480277 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-etc-kubernetes\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.480699 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:48.480409 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:09:48.480699 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480420 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a069367b-8d8c-414b-b7f7-af6d4738d3ba-socket-dir\") pod \"aws-ebs-csi-driver-node-x62gp\" (UID: \"a069367b-8d8c-414b-b7f7-af6d4738d3ba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp" Apr 19 12:09:48.480699 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480435 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-etc-kubernetes\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.480699 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480450 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a069367b-8d8c-414b-b7f7-af6d4738d3ba-registration-dir\") pod \"aws-ebs-csi-driver-node-x62gp\" (UID: \"a069367b-8d8c-414b-b7f7-af6d4738d3ba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp" Apr 19 12:09:48.480699 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:48.480486 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2235efd3-6b65-47c7-acb5-eb9aa104beb5-metrics-certs podName:2235efd3-6b65-47c7-acb5-eb9aa104beb5 nodeName:}" failed. No retries permitted until 2026-04-19 12:09:48.980462991 +0000 UTC m=+3.047634875 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2235efd3-6b65-47c7-acb5-eb9aa104beb5-metrics-certs") pod "network-metrics-daemon-67dv9" (UID: "2235efd3-6b65-47c7-acb5-eb9aa104beb5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:09:48.480699 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480525 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7944a8e8-84c1-4ed9-81a3-27e357098d48-cnibin\") pod \"multus-additional-cni-plugins-sqxd4\" (UID: \"7944a8e8-84c1-4ed9-81a3-27e357098d48\") " pod="openshift-multus/multus-additional-cni-plugins-sqxd4" Apr 19 12:09:48.480699 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480529 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a069367b-8d8c-414b-b7f7-af6d4738d3ba-registration-dir\") pod \"aws-ebs-csi-driver-node-x62gp\" (UID: \"a069367b-8d8c-414b-b7f7-af6d4738d3ba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp" Apr 19 12:09:48.480699 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480558 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7944a8e8-84c1-4ed9-81a3-27e357098d48-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sqxd4\" (UID: \"7944a8e8-84c1-4ed9-81a3-27e357098d48\") " pod="openshift-multus/multus-additional-cni-plugins-sqxd4" Apr 19 12:09:48.480699 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480572 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a069367b-8d8c-414b-b7f7-af6d4738d3ba-socket-dir\") pod \"aws-ebs-csi-driver-node-x62gp\" (UID: \"a069367b-8d8c-414b-b7f7-af6d4738d3ba\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp" Apr 19 12:09:48.480699 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480586 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-run-systemd\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" Apr 19 12:09:48.480699 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480612 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6799f9ef-a53a-465c-ae04-e93d43acaea2-iptables-alerter-script\") pod \"iptables-alerter-qbr6x\" (UID: \"6799f9ef-a53a-465c-ae04-e93d43acaea2\") " pod="openshift-network-operator/iptables-alerter-qbr6x" Apr 19 12:09:48.480699 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480636 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a069367b-8d8c-414b-b7f7-af6d4738d3ba-device-dir\") pod \"aws-ebs-csi-driver-node-x62gp\" (UID: \"a069367b-8d8c-414b-b7f7-af6d4738d3ba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp" Apr 19 12:09:48.480699 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480661 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a069367b-8d8c-414b-b7f7-af6d4738d3ba-sys-fs\") pod \"aws-ebs-csi-driver-node-x62gp\" (UID: \"a069367b-8d8c-414b-b7f7-af6d4738d3ba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp" Apr 19 12:09:48.480699 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480686 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-var-lib-openvswitch\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" Apr 19 12:09:48.480699 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480713 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/473da951-aee0-495d-9114-913f8c86e8e0-env-overrides\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" Apr 19 12:09:48.481504 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480734 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a069367b-8d8c-414b-b7f7-af6d4738d3ba-device-dir\") pod \"aws-ebs-csi-driver-node-x62gp\" (UID: \"a069367b-8d8c-414b-b7f7-af6d4738d3ba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp" Apr 19 12:09:48.481504 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480743 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a069367b-8d8c-414b-b7f7-af6d4738d3ba-sys-fs\") pod \"aws-ebs-csi-driver-node-x62gp\" (UID: \"a069367b-8d8c-414b-b7f7-af6d4738d3ba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp" Apr 19 12:09:48.481504 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480738 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-system-cni-dir\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.481504 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480890 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-system-cni-dir\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.481504 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.481162 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/6799f9ef-a53a-465c-ae04-e93d43acaea2-iptables-alerter-script\") pod \"iptables-alerter-qbr6x\" (UID: \"6799f9ef-a53a-465c-ae04-e93d43acaea2\") " pod="openshift-network-operator/iptables-alerter-qbr6x" Apr 19 12:09:48.481504 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.480803 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a78a63a6-1b78-4253-87ed-341aaf44e16f-konnectivity-ca\") pod \"konnectivity-agent-wbtqf\" (UID: \"a78a63a6-1b78-4253-87ed-341aaf44e16f\") " pod="kube-system/konnectivity-agent-wbtqf" Apr 19 12:09:48.481504 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.481315 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4mb5\" (UniqueName: \"kubernetes.io/projected/d6bccaec-31af-4abb-8335-372cc94da827-kube-api-access-b4mb5\") pod \"node-resolver-m897s\" (UID: \"d6bccaec-31af-4abb-8335-372cc94da827\") " pod="openshift-dns/node-resolver-m897s" Apr 19 12:09:48.481504 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.481344 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-node-log\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" Apr 19 12:09:48.481504 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.481372 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-host-run-netns\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.481504 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.481409 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-hostroot\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.481504 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.481458 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-hostroot\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.481504 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.481480 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-host-run-netns\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.481504 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.481500 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce238b24-1370-4ed8-afca-1aa7573e56b7-tmp\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.482071 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.481579 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxfsh\" 
(UniqueName: \"kubernetes.io/projected/7944a8e8-84c1-4ed9-81a3-27e357098d48-kube-api-access-fxfsh\") pod \"multus-additional-cni-plugins-sqxd4\" (UID: \"7944a8e8-84c1-4ed9-81a3-27e357098d48\") " pod="openshift-multus/multus-additional-cni-plugins-sqxd4" Apr 19 12:09:48.482071 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.481613 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-host-cni-bin\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" Apr 19 12:09:48.482071 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.481642 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7944a8e8-84c1-4ed9-81a3-27e357098d48-system-cni-dir\") pod \"multus-additional-cni-plugins-sqxd4\" (UID: \"7944a8e8-84c1-4ed9-81a3-27e357098d48\") " pod="openshift-multus/multus-additional-cni-plugins-sqxd4" Apr 19 12:09:48.482071 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.481677 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-run-openvswitch\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" Apr 19 12:09:48.482071 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.481704 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/afd1b409-8b1c-4984-996b-e66960d52ffc-multus-daemon-config\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.482071 ip-10-0-140-237 kubenswrapper[2572]: 
I0419 12:09:48.481746 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-etc-kubernetes\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.482071 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.481830 2572 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 19 12:09:48.482071 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.481853 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-run\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.482071 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.481877 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-etc-kubernetes\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.482071 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.481922 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a069367b-8d8c-414b-b7f7-af6d4738d3ba-kubelet-dir\") pod \"aws-ebs-csi-driver-node-x62gp\" (UID: \"a069367b-8d8c-414b-b7f7-af6d4738d3ba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp" Apr 19 12:09:48.482071 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.481954 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" 
(UniqueName: \"kubernetes.io/secret/a78a63a6-1b78-4253-87ed-341aaf44e16f-agent-certs\") pod \"konnectivity-agent-wbtqf\" (UID: \"a78a63a6-1b78-4253-87ed-341aaf44e16f\") " pod="kube-system/konnectivity-agent-wbtqf" Apr 19 12:09:48.482071 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.481982 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-etc-sysctl-conf\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.482071 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482005 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a069367b-8d8c-414b-b7f7-af6d4738d3ba-kubelet-dir\") pod \"aws-ebs-csi-driver-node-x62gp\" (UID: \"a069367b-8d8c-414b-b7f7-af6d4738d3ba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp" Apr 19 12:09:48.482071 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482008 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5982f902-227e-4317-9969-a90aa85d5e40-serviceca\") pod \"node-ca-mdl6z\" (UID: \"5982f902-227e-4317-9969-a90aa85d5e40\") " pod="openshift-image-registry/node-ca-mdl6z" Apr 19 12:09:48.482071 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482045 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7944a8e8-84c1-4ed9-81a3-27e357098d48-cni-binary-copy\") pod \"multus-additional-cni-plugins-sqxd4\" (UID: \"7944a8e8-84c1-4ed9-81a3-27e357098d48\") " pod="openshift-multus/multus-additional-cni-plugins-sqxd4" Apr 19 12:09:48.482809 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482084 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-host-run-netns\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" Apr 19 12:09:48.482809 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482110 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-run-ovn\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" Apr 19 12:09:48.482809 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482147 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-etc-sysctl-conf\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.482809 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482154 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/473da951-aee0-495d-9114-913f8c86e8e0-ovnkube-config\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" Apr 19 12:09:48.482809 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482187 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-multus-cni-dir\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.482809 ip-10-0-140-237 kubenswrapper[2572]: I0419 
12:09:48.482194 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-run\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.482809 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482218 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-host-run-multus-certs\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.482809 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482253 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-sys\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.482809 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482256 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-host-run-multus-certs\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.482809 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482274 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/afd1b409-8b1c-4984-996b-e66960d52ffc-multus-daemon-config\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.482809 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482279 2572 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5982f902-227e-4317-9969-a90aa85d5e40-host\") pod \"node-ca-mdl6z\" (UID: \"5982f902-227e-4317-9969-a90aa85d5e40\") " pod="openshift-image-registry/node-ca-mdl6z" Apr 19 12:09:48.482809 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482315 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-etc-openvswitch\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" Apr 19 12:09:48.482809 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482321 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-sys\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.482809 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482279 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-multus-cni-dir\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.482809 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482340 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-host-run-ovn-kubernetes\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" Apr 19 12:09:48.482809 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482365 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/5982f902-227e-4317-9969-a90aa85d5e40-host\") pod \"node-ca-mdl6z\" (UID: \"5982f902-227e-4317-9969-a90aa85d5e40\") " pod="openshift-image-registry/node-ca-mdl6z" Apr 19 12:09:48.482809 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482367 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" Apr 19 12:09:48.482809 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482401 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ce238b24-1370-4ed8-afca-1aa7573e56b7-etc-tuned\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.483745 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482405 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/5982f902-227e-4317-9969-a90aa85d5e40-serviceca\") pod \"node-ca-mdl6z\" (UID: \"5982f902-227e-4317-9969-a90aa85d5e40\") " pod="openshift-image-registry/node-ca-mdl6z" Apr 19 12:09:48.483745 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482458 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-host-kubelet\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" Apr 19 12:09:48.483745 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482486 2572 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-multus-conf-dir\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.483745 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482508 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xvwvd\" (UniqueName: \"kubernetes.io/projected/afd1b409-8b1c-4984-996b-e66960d52ffc-kube-api-access-xvwvd\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.483745 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482542 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9nfvh\" (UniqueName: \"kubernetes.io/projected/6799f9ef-a53a-465c-ae04-e93d43acaea2-kube-api-access-9nfvh\") pod \"iptables-alerter-qbr6x\" (UID: \"6799f9ef-a53a-465c-ae04-e93d43acaea2\") " pod="openshift-network-operator/iptables-alerter-qbr6x" Apr 19 12:09:48.483745 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482564 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-lib-modules\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.483745 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482575 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-multus-conf-dir\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.483745 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482591 2572 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-systemd-units\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" Apr 19 12:09:48.483745 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482618 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/473da951-aee0-495d-9114-913f8c86e8e0-ovnkube-script-lib\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" Apr 19 12:09:48.483745 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482659 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvlxb\" (UniqueName: \"kubernetes.io/projected/2235efd3-6b65-47c7-acb5-eb9aa104beb5-kube-api-access-cvlxb\") pod \"network-metrics-daemon-67dv9\" (UID: \"2235efd3-6b65-47c7-acb5-eb9aa104beb5\") " pod="openshift-multus/network-metrics-daemon-67dv9" Apr 19 12:09:48.483745 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482745 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67tn8\" (UniqueName: \"kubernetes.io/projected/29c3436d-ad1d-4386-a9d8-894a3c87dbfc-kube-api-access-67tn8\") pod \"network-check-target-2v66h\" (UID: \"29c3436d-ad1d-4386-a9d8-894a3c87dbfc\") " pod="openshift-network-diagnostics/network-check-target-2v66h" Apr 19 12:09:48.483745 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482771 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-lib-modules\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " 
pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.483745 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482790 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-etc-sysctl-d\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.483745 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482816 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-etc-systemd\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.483745 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482845 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/473da951-aee0-495d-9114-913f8c86e8e0-ovn-node-metrics-cert\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" Apr 19 12:09:48.483745 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482874 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-os-release\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.483745 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482925 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-etc-systemd\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " 
pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.484432 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.483090 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6799f9ef-a53a-465c-ae04-e93d43acaea2-host-slash\") pod \"iptables-alerter-qbr6x\" (UID: \"6799f9ef-a53a-465c-ae04-e93d43acaea2\") " pod="openshift-network-operator/iptables-alerter-qbr6x" Apr 19 12:09:48.484432 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482934 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-etc-sysctl-d\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.484432 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.482985 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-os-release\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.484432 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.483122 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-etc-modprobe-d\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.484432 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.483148 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c4c8z\" (UniqueName: \"kubernetes.io/projected/ce238b24-1370-4ed8-afca-1aa7573e56b7-kube-api-access-c4c8z\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " 
pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.484432 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.483173 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a069367b-8d8c-414b-b7f7-af6d4738d3ba-etc-selinux\") pod \"aws-ebs-csi-driver-node-x62gp\" (UID: \"a069367b-8d8c-414b-b7f7-af6d4738d3ba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp" Apr 19 12:09:48.484432 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.483198 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wnf7s\" (UniqueName: \"kubernetes.io/projected/a069367b-8d8c-414b-b7f7-af6d4738d3ba-kube-api-access-wnf7s\") pod \"aws-ebs-csi-driver-node-x62gp\" (UID: \"a069367b-8d8c-414b-b7f7-af6d4738d3ba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp" Apr 19 12:09:48.484432 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.483205 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/6799f9ef-a53a-465c-ae04-e93d43acaea2-host-slash\") pod \"iptables-alerter-qbr6x\" (UID: \"6799f9ef-a53a-465c-ae04-e93d43acaea2\") " pod="openshift-network-operator/iptables-alerter-qbr6x" Apr 19 12:09:48.484432 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.483257 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7944a8e8-84c1-4ed9-81a3-27e357098d48-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sqxd4\" (UID: \"7944a8e8-84c1-4ed9-81a3-27e357098d48\") " pod="openshift-multus/multus-additional-cni-plugins-sqxd4" Apr 19 12:09:48.484432 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.483286 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-multus-socket-dir-parent\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.484432 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.483310 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-host\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.484432 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.483339 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d6bccaec-31af-4abb-8335-372cc94da827-tmp-dir\") pod \"node-resolver-m897s\" (UID: \"d6bccaec-31af-4abb-8335-372cc94da827\") " pod="openshift-dns/node-resolver-m897s" Apr 19 12:09:48.484432 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.483387 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-log-socket\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" Apr 19 12:09:48.484432 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.483399 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-multus-socket-dir-parent\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.484432 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.483414 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/afd1b409-8b1c-4984-996b-e66960d52ffc-cni-binary-copy\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.484432 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.483442 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-host-run-k8s-cni-cncf-io\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.484432 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.483469 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jccmz\" (UniqueName: \"kubernetes.io/projected/5982f902-227e-4317-9969-a90aa85d5e40-kube-api-access-jccmz\") pod \"node-ca-mdl6z\" (UID: \"5982f902-227e-4317-9969-a90aa85d5e40\") " pod="openshift-image-registry/node-ca-mdl6z" Apr 19 12:09:48.485067 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.483487 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-etc-modprobe-d\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.485067 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.483498 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7944a8e8-84c1-4ed9-81a3-27e357098d48-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sqxd4\" (UID: \"7944a8e8-84c1-4ed9-81a3-27e357098d48\") " pod="openshift-multus/multus-additional-cni-plugins-sqxd4" Apr 19 12:09:48.485067 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.483580 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a069367b-8d8c-414b-b7f7-af6d4738d3ba-etc-selinux\") pod \"aws-ebs-csi-driver-node-x62gp\" (UID: \"a069367b-8d8c-414b-b7f7-af6d4738d3ba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp" Apr 19 12:09:48.485067 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.483620 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ce238b24-1370-4ed8-afca-1aa7573e56b7-host\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.485067 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.483654 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-host-run-k8s-cni-cncf-io\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.485067 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.483691 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-host-var-lib-cni-multus\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.485067 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.483718 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d6bccaec-31af-4abb-8335-372cc94da827-hosts-file\") pod \"node-resolver-m897s\" (UID: \"d6bccaec-31af-4abb-8335-372cc94da827\") " pod="openshift-dns/node-resolver-m897s" Apr 19 12:09:48.485067 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.483745 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-lwkfs\" (UniqueName: \"kubernetes.io/projected/473da951-aee0-495d-9114-913f8c86e8e0-kube-api-access-lwkfs\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" Apr 19 12:09:48.485067 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.483804 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/afd1b409-8b1c-4984-996b-e66960d52ffc-host-var-lib-cni-multus\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.485067 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.484058 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/afd1b409-8b1c-4984-996b-e66960d52ffc-cni-binary-copy\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.485438 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.485134 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ce238b24-1370-4ed8-afca-1aa7573e56b7-tmp\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.485438 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.485158 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ce238b24-1370-4ed8-afca-1aa7573e56b7-etc-tuned\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.490049 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:48.490026 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 12:09:48.490171 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:48.490053 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 12:09:48.490171 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:48.490067 2572 projected.go:194] Error preparing data for projected volume kube-api-access-67tn8 for pod openshift-network-diagnostics/network-check-target-2v66h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:09:48.490171 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:48.490132 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29c3436d-ad1d-4386-a9d8-894a3c87dbfc-kube-api-access-67tn8 podName:29c3436d-ad1d-4386-a9d8-894a3c87dbfc nodeName:}" failed. No retries permitted until 2026-04-19 12:09:48.990114965 +0000 UTC m=+3.057286850 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-67tn8" (UniqueName: "kubernetes.io/projected/29c3436d-ad1d-4386-a9d8-894a3c87dbfc-kube-api-access-67tn8") pod "network-check-target-2v66h" (UID: "29c3436d-ad1d-4386-a9d8-894a3c87dbfc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:09:48.491736 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.491461 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nfvh\" (UniqueName: \"kubernetes.io/projected/6799f9ef-a53a-465c-ae04-e93d43acaea2-kube-api-access-9nfvh\") pod \"iptables-alerter-qbr6x\" (UID: \"6799f9ef-a53a-465c-ae04-e93d43acaea2\") " pod="openshift-network-operator/iptables-alerter-qbr6x" Apr 19 12:09:48.492158 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.492137 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvwvd\" (UniqueName: \"kubernetes.io/projected/afd1b409-8b1c-4984-996b-e66960d52ffc-kube-api-access-xvwvd\") pod \"multus-cpvgj\" (UID: \"afd1b409-8b1c-4984-996b-e66960d52ffc\") " pod="openshift-multus/multus-cpvgj" Apr 19 12:09:48.492305 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.492269 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnf7s\" (UniqueName: \"kubernetes.io/projected/a069367b-8d8c-414b-b7f7-af6d4738d3ba-kube-api-access-wnf7s\") pod \"aws-ebs-csi-driver-node-x62gp\" (UID: \"a069367b-8d8c-414b-b7f7-af6d4738d3ba\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp" Apr 19 12:09:48.492702 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.492676 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jccmz\" (UniqueName: \"kubernetes.io/projected/5982f902-227e-4317-9969-a90aa85d5e40-kube-api-access-jccmz\") pod \"node-ca-mdl6z\" (UID: \"5982f902-227e-4317-9969-a90aa85d5e40\") 
" pod="openshift-image-registry/node-ca-mdl6z" Apr 19 12:09:48.492797 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.492715 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvlxb\" (UniqueName: \"kubernetes.io/projected/2235efd3-6b65-47c7-acb5-eb9aa104beb5-kube-api-access-cvlxb\") pod \"network-metrics-daemon-67dv9\" (UID: \"2235efd3-6b65-47c7-acb5-eb9aa104beb5\") " pod="openshift-multus/network-metrics-daemon-67dv9" Apr 19 12:09:48.493551 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.493533 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4c8z\" (UniqueName: \"kubernetes.io/projected/ce238b24-1370-4ed8-afca-1aa7573e56b7-kube-api-access-c4c8z\") pod \"tuned-ls4sq\" (UID: \"ce238b24-1370-4ed8-afca-1aa7573e56b7\") " pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" Apr 19 12:09:48.499363 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.499314 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-237.ec2.internal" event={"ID":"071de6a6b9846878c25eec7055bc6997","Type":"ContainerStarted","Data":"e193bd52285d34915cb3f58bf758602fb470fe0e4ce47941c0064d92ecf80f35"} Apr 19 12:09:48.500376 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.500346 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-237.ec2.internal" event={"ID":"5c7cf47197c65590d0f81e01fc4711d8","Type":"ContainerStarted","Data":"92d2b81c5bb8526af8dd73fc6b105b8293f2f1fdaa474182a14b9d94aad6c5f5"} Apr 19 12:09:48.584455 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.584419 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-systemd-units\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" Apr 19 12:09:48.584455 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.584460 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/473da951-aee0-495d-9114-913f8c86e8e0-ovnkube-script-lib\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" Apr 19 12:09:48.584652 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.584528 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-systemd-units\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" Apr 19 12:09:48.584652 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.584556 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/473da951-aee0-495d-9114-913f8c86e8e0-ovn-node-metrics-cert\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" Apr 19 12:09:48.584726 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.584698 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7944a8e8-84c1-4ed9-81a3-27e357098d48-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sqxd4\" (UID: \"7944a8e8-84c1-4ed9-81a3-27e357098d48\") " pod="openshift-multus/multus-additional-cni-plugins-sqxd4" Apr 19 12:09:48.584783 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.584735 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d6bccaec-31af-4abb-8335-372cc94da827-tmp-dir\") pod \"node-resolver-m897s\" 
(UID: \"d6bccaec-31af-4abb-8335-372cc94da827\") " pod="openshift-dns/node-resolver-m897s" Apr 19 12:09:48.584828 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.584778 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-log-socket\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" Apr 19 12:09:48.584828 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.584809 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7944a8e8-84c1-4ed9-81a3-27e357098d48-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sqxd4\" (UID: \"7944a8e8-84c1-4ed9-81a3-27e357098d48\") " pod="openshift-multus/multus-additional-cni-plugins-sqxd4" Apr 19 12:09:48.584920 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.584837 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d6bccaec-31af-4abb-8335-372cc94da827-hosts-file\") pod \"node-resolver-m897s\" (UID: \"d6bccaec-31af-4abb-8335-372cc94da827\") " pod="openshift-dns/node-resolver-m897s" Apr 19 12:09:48.584920 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.584864 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lwkfs\" (UniqueName: \"kubernetes.io/projected/473da951-aee0-495d-9114-913f8c86e8e0-kube-api-access-lwkfs\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" Apr 19 12:09:48.584920 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.584893 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7944a8e8-84c1-4ed9-81a3-27e357098d48-os-release\") pod 
\"multus-additional-cni-plugins-sqxd4\" (UID: \"7944a8e8-84c1-4ed9-81a3-27e357098d48\") " pod="openshift-multus/multus-additional-cni-plugins-sqxd4"
Apr 19 12:09:48.584920 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.584906 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-log-socket\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.585103 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.584970 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-host-slash\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.585103 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.584918 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-host-slash\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.585103 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585011 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7944a8e8-84c1-4ed9-81a3-27e357098d48-tuning-conf-dir\") pod \"multus-additional-cni-plugins-sqxd4\" (UID: \"7944a8e8-84c1-4ed9-81a3-27e357098d48\") " pod="openshift-multus/multus-additional-cni-plugins-sqxd4"
Apr 19 12:09:48.585103 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585010 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7944a8e8-84c1-4ed9-81a3-27e357098d48-os-release\") pod \"multus-additional-cni-plugins-sqxd4\" (UID: \"7944a8e8-84c1-4ed9-81a3-27e357098d48\") " pod="openshift-multus/multus-additional-cni-plugins-sqxd4"
Apr 19 12:09:48.585103 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585014 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-host-cni-netd\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.585103 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585050 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-host-cni-netd\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.585103 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585088 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7944a8e8-84c1-4ed9-81a3-27e357098d48-cnibin\") pod \"multus-additional-cni-plugins-sqxd4\" (UID: \"7944a8e8-84c1-4ed9-81a3-27e357098d48\") " pod="openshift-multus/multus-additional-cni-plugins-sqxd4"
Apr 19 12:09:48.585103 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585099 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/d6bccaec-31af-4abb-8335-372cc94da827-hosts-file\") pod \"node-resolver-m897s\" (UID: \"d6bccaec-31af-4abb-8335-372cc94da827\") " pod="openshift-dns/node-resolver-m897s"
Apr 19 12:09:48.585398 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585120 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7944a8e8-84c1-4ed9-81a3-27e357098d48-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sqxd4\" (UID: \"7944a8e8-84c1-4ed9-81a3-27e357098d48\") " pod="openshift-multus/multus-additional-cni-plugins-sqxd4"
Apr 19 12:09:48.585398 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585160 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-run-systemd\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.585398 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585185 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7944a8e8-84c1-4ed9-81a3-27e357098d48-cnibin\") pod \"multus-additional-cni-plugins-sqxd4\" (UID: \"7944a8e8-84c1-4ed9-81a3-27e357098d48\") " pod="openshift-multus/multus-additional-cni-plugins-sqxd4"
Apr 19 12:09:48.585398 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585192 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-var-lib-openvswitch\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.585398 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585224 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/473da951-aee0-495d-9114-913f8c86e8e0-env-overrides\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.585398 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585235 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-var-lib-openvswitch\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.585398 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585254 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a78a63a6-1b78-4253-87ed-341aaf44e16f-konnectivity-ca\") pod \"konnectivity-agent-wbtqf\" (UID: \"a78a63a6-1b78-4253-87ed-341aaf44e16f\") " pod="kube-system/konnectivity-agent-wbtqf"
Apr 19 12:09:48.585398 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585273 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-run-systemd\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.585398 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585280 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b4mb5\" (UniqueName: \"kubernetes.io/projected/d6bccaec-31af-4abb-8335-372cc94da827-kube-api-access-b4mb5\") pod \"node-resolver-m897s\" (UID: \"d6bccaec-31af-4abb-8335-372cc94da827\") " pod="openshift-dns/node-resolver-m897s"
Apr 19 12:09:48.585398 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585353 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d6bccaec-31af-4abb-8335-372cc94da827-tmp-dir\") pod \"node-resolver-m897s\" (UID: \"d6bccaec-31af-4abb-8335-372cc94da827\") " pod="openshift-dns/node-resolver-m897s"
Apr 19 12:09:48.585398 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585393 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-node-log\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.585740 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585422 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fxfsh\" (UniqueName: \"kubernetes.io/projected/7944a8e8-84c1-4ed9-81a3-27e357098d48-kube-api-access-fxfsh\") pod \"multus-additional-cni-plugins-sqxd4\" (UID: \"7944a8e8-84c1-4ed9-81a3-27e357098d48\") " pod="openshift-multus/multus-additional-cni-plugins-sqxd4"
Apr 19 12:09:48.585740 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585448 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-host-cni-bin\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.585740 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585472 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7944a8e8-84c1-4ed9-81a3-27e357098d48-system-cni-dir\") pod \"multus-additional-cni-plugins-sqxd4\" (UID: \"7944a8e8-84c1-4ed9-81a3-27e357098d48\") " pod="openshift-multus/multus-additional-cni-plugins-sqxd4"
Apr 19 12:09:48.585740 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585495 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-run-openvswitch\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.585740 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585524 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a78a63a6-1b78-4253-87ed-341aaf44e16f-agent-certs\") pod \"konnectivity-agent-wbtqf\" (UID: \"a78a63a6-1b78-4253-87ed-341aaf44e16f\") " pod="kube-system/konnectivity-agent-wbtqf"
Apr 19 12:09:48.585740 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585552 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7944a8e8-84c1-4ed9-81a3-27e357098d48-cni-binary-copy\") pod \"multus-additional-cni-plugins-sqxd4\" (UID: \"7944a8e8-84c1-4ed9-81a3-27e357098d48\") " pod="openshift-multus/multus-additional-cni-plugins-sqxd4"
Apr 19 12:09:48.585740 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585576 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-host-run-netns\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.585740 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585599 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-run-ovn\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.585740 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585626 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/473da951-aee0-495d-9114-913f8c86e8e0-ovnkube-config\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.585740 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585588 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/7944a8e8-84c1-4ed9-81a3-27e357098d48-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-sqxd4\" (UID: \"7944a8e8-84c1-4ed9-81a3-27e357098d48\") " pod="openshift-multus/multus-additional-cni-plugins-sqxd4"
Apr 19 12:09:48.585740 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585659 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7944a8e8-84c1-4ed9-81a3-27e357098d48-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-sqxd4\" (UID: \"7944a8e8-84c1-4ed9-81a3-27e357098d48\") " pod="openshift-multus/multus-additional-cni-plugins-sqxd4"
Apr 19 12:09:48.585740 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585687 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-node-log\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.585740 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585687 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7944a8e8-84c1-4ed9-81a3-27e357098d48-system-cni-dir\") pod \"multus-additional-cni-plugins-sqxd4\" (UID: \"7944a8e8-84c1-4ed9-81a3-27e357098d48\") " pod="openshift-multus/multus-additional-cni-plugins-sqxd4"
Apr 19 12:09:48.585740 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585735 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-run-ovn\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.585740 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585741 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a78a63a6-1b78-4253-87ed-341aaf44e16f-konnectivity-ca\") pod \"konnectivity-agent-wbtqf\" (UID: \"a78a63a6-1b78-4253-87ed-341aaf44e16f\") " pod="kube-system/konnectivity-agent-wbtqf"
Apr 19 12:09:48.586363 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585748 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-host-cni-bin\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.586363 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585738 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-host-run-netns\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.586363 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585802 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-run-openvswitch\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.586363 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585832 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-etc-openvswitch\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.586363 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585875 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-host-run-ovn-kubernetes\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.586363 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585902 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.586363 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.585933 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-host-kubelet\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.586363 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.586009 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-host-kubelet\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.586363 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.586048 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-etc-openvswitch\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.586363 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.586081 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-host-run-ovn-kubernetes\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.586363 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.586115 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/473da951-aee0-495d-9114-913f8c86e8e0-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.586363 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.586197 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7944a8e8-84c1-4ed9-81a3-27e357098d48-cni-binary-copy\") pod \"multus-additional-cni-plugins-sqxd4\" (UID: \"7944a8e8-84c1-4ed9-81a3-27e357098d48\") " pod="openshift-multus/multus-additional-cni-plugins-sqxd4"
Apr 19 12:09:48.586363 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.586206 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/473da951-aee0-495d-9114-913f8c86e8e0-ovnkube-script-lib\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.586363 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.586244 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/473da951-aee0-495d-9114-913f8c86e8e0-env-overrides\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.586363 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.586296 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/473da951-aee0-495d-9114-913f8c86e8e0-ovnkube-config\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.588039 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.588018 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/473da951-aee0-495d-9114-913f8c86e8e0-ovn-node-metrics-cert\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.588283 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.588266 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a78a63a6-1b78-4253-87ed-341aaf44e16f-agent-certs\") pod \"konnectivity-agent-wbtqf\" (UID: \"a78a63a6-1b78-4253-87ed-341aaf44e16f\") " pod="kube-system/konnectivity-agent-wbtqf"
Apr 19 12:09:48.593145 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.593127 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4mb5\" (UniqueName: \"kubernetes.io/projected/d6bccaec-31af-4abb-8335-372cc94da827-kube-api-access-b4mb5\") pod \"node-resolver-m897s\" (UID: \"d6bccaec-31af-4abb-8335-372cc94da827\") " pod="openshift-dns/node-resolver-m897s"
Apr 19 12:09:48.593637 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.593618 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxfsh\" (UniqueName: \"kubernetes.io/projected/7944a8e8-84c1-4ed9-81a3-27e357098d48-kube-api-access-fxfsh\") pod \"multus-additional-cni-plugins-sqxd4\" (UID: \"7944a8e8-84c1-4ed9-81a3-27e357098d48\") " pod="openshift-multus/multus-additional-cni-plugins-sqxd4"
Apr 19 12:09:48.593781 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.593752 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwkfs\" (UniqueName: \"kubernetes.io/projected/473da951-aee0-495d-9114-913f8c86e8e0-kube-api-access-lwkfs\") pod \"ovnkube-node-mfzf5\" (UID: \"473da951-aee0-495d-9114-913f8c86e8e0\") " pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.671977 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.671875 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp"
Apr 19 12:09:48.679669 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.679646 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-ls4sq"
Apr 19 12:09:48.688439 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.688416 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mdl6z"
Apr 19 12:09:48.694041 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.694020 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-cpvgj"
Apr 19 12:09:48.701600 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.701579 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-qbr6x"
Apr 19 12:09:48.708179 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.708157 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-wbtqf"
Apr 19 12:09:48.713678 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.713657 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-m897s"
Apr 19 12:09:48.720541 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.720510 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-sqxd4"
Apr 19 12:09:48.725232 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.725216 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:09:48.819988 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.819957 2572 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 19 12:09:48.988299 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:48.988223 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2235efd3-6b65-47c7-acb5-eb9aa104beb5-metrics-certs\") pod \"network-metrics-daemon-67dv9\" (UID: \"2235efd3-6b65-47c7-acb5-eb9aa104beb5\") " pod="openshift-multus/network-metrics-daemon-67dv9"
Apr 19 12:09:48.988441 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:48.988340 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:09:48.988441 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:48.988402 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2235efd3-6b65-47c7-acb5-eb9aa104beb5-metrics-certs podName:2235efd3-6b65-47c7-acb5-eb9aa104beb5 nodeName:}" failed. No retries permitted until 2026-04-19 12:09:49.988385639 +0000 UTC m=+4.055557511 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2235efd3-6b65-47c7-acb5-eb9aa104beb5-metrics-certs") pod "network-metrics-daemon-67dv9" (UID: "2235efd3-6b65-47c7-acb5-eb9aa104beb5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:09:49.088605 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:49.088581 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67tn8\" (UniqueName: \"kubernetes.io/projected/29c3436d-ad1d-4386-a9d8-894a3c87dbfc-kube-api-access-67tn8\") pod \"network-check-target-2v66h\" (UID: \"29c3436d-ad1d-4386-a9d8-894a3c87dbfc\") " pod="openshift-network-diagnostics/network-check-target-2v66h"
Apr 19 12:09:49.088808 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:49.088789 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 19 12:09:49.088857 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:49.088818 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 19 12:09:49.088857 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:49.088836 2572 projected.go:194] Error preparing data for projected volume kube-api-access-67tn8 for pod openshift-network-diagnostics/network-check-target-2v66h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:09:49.088913 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:49.088902 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29c3436d-ad1d-4386-a9d8-894a3c87dbfc-kube-api-access-67tn8 podName:29c3436d-ad1d-4386-a9d8-894a3c87dbfc nodeName:}" failed. No retries permitted until 2026-04-19 12:09:50.088883325 +0000 UTC m=+4.156055207 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-67tn8" (UniqueName: "kubernetes.io/projected/29c3436d-ad1d-4386-a9d8-894a3c87dbfc-kube-api-access-67tn8") pod "network-check-target-2v66h" (UID: "29c3436d-ad1d-4386-a9d8-894a3c87dbfc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:09:49.097776 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:49.097737 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5982f902_227e_4317_9969_a90aa85d5e40.slice/crio-0358c0f7cdcce311fc7df08ebeda58c84ea7954cf2eac38ee3a57b8a1b67d240 WatchSource:0}: Error finding container 0358c0f7cdcce311fc7df08ebeda58c84ea7954cf2eac38ee3a57b8a1b67d240: Status 404 returned error can't find the container with id 0358c0f7cdcce311fc7df08ebeda58c84ea7954cf2eac38ee3a57b8a1b67d240
Apr 19 12:09:49.099066 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:49.099036 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda78a63a6_1b78_4253_87ed_341aaf44e16f.slice/crio-f6fa0eab78e8b158eedc31b638c29d063c308f4b1e53eba1a6baf28f0f95566f WatchSource:0}: Error finding container f6fa0eab78e8b158eedc31b638c29d063c308f4b1e53eba1a6baf28f0f95566f: Status 404 returned error can't find the container with id f6fa0eab78e8b158eedc31b638c29d063c308f4b1e53eba1a6baf28f0f95566f
Apr 19 12:09:49.102418 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:49.102387 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod473da951_aee0_495d_9114_913f8c86e8e0.slice/crio-ebb013d46cc18b5d7339566df231f2e1b4f703dfde490fc12c1113361404278a WatchSource:0}: Error finding container ebb013d46cc18b5d7339566df231f2e1b4f703dfde490fc12c1113361404278a: Status 404 returned error can't find the container with id ebb013d46cc18b5d7339566df231f2e1b4f703dfde490fc12c1113361404278a
Apr 19 12:09:49.104780 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:49.104742 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6799f9ef_a53a_465c_ae04_e93d43acaea2.slice/crio-47f6ac1738274fc86d33221ed2c40da9c329e97298146499fed880156c8d6a0f WatchSource:0}: Error finding container 47f6ac1738274fc86d33221ed2c40da9c329e97298146499fed880156c8d6a0f: Status 404 returned error can't find the container with id 47f6ac1738274fc86d33221ed2c40da9c329e97298146499fed880156c8d6a0f
Apr 19 12:09:49.107994 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:49.107969 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7944a8e8_84c1_4ed9_81a3_27e357098d48.slice/crio-58349ed78416b4f5a82e75d30ba939ac122771275c0e48bf77d49505f16bcb46 WatchSource:0}: Error finding container 58349ed78416b4f5a82e75d30ba939ac122771275c0e48bf77d49505f16bcb46: Status 404 returned error can't find the container with id 58349ed78416b4f5a82e75d30ba939ac122771275c0e48bf77d49505f16bcb46
Apr 19 12:09:49.108412 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:49.108298 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6bccaec_31af_4abb_8335_372cc94da827.slice/crio-cf95718b9f63fb47ca162f25d7cbec7221c17b8c0e1f372d7d2833dd7e32b74d WatchSource:0}: Error finding container cf95718b9f63fb47ca162f25d7cbec7221c17b8c0e1f372d7d2833dd7e32b74d: Status 404 returned error can't find the container with id cf95718b9f63fb47ca162f25d7cbec7221c17b8c0e1f372d7d2833dd7e32b74d
Apr 19 12:09:49.109360 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:49.109225 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafd1b409_8b1c_4984_996b_e66960d52ffc.slice/crio-1b51b285d2be4c843d0bf4e2435d0fadc77f921fd661a3a10ae41fc02ad3456b WatchSource:0}: Error finding container 1b51b285d2be4c843d0bf4e2435d0fadc77f921fd661a3a10ae41fc02ad3456b: Status 404 returned error can't find the container with id 1b51b285d2be4c843d0bf4e2435d0fadc77f921fd661a3a10ae41fc02ad3456b
Apr 19 12:09:49.110034 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:49.110010 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce238b24_1370_4ed8_afca_1aa7573e56b7.slice/crio-25dd9bfe78135be71b3444dea4a56076142bc76bc1026b8fd50d88f54d89cc51 WatchSource:0}: Error finding container 25dd9bfe78135be71b3444dea4a56076142bc76bc1026b8fd50d88f54d89cc51: Status 404 returned error can't find the container with id 25dd9bfe78135be71b3444dea4a56076142bc76bc1026b8fd50d88f54d89cc51
Apr 19 12:09:49.111042 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:09:49.111016 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda069367b_8d8c_414b_b7f7_af6d4738d3ba.slice/crio-faea1d2111dc285d025f672510115bdc5e407c2df0790b5f266d58e0148aa55c WatchSource:0}: Error finding container faea1d2111dc285d025f672510115bdc5e407c2df0790b5f266d58e0148aa55c: Status 404 returned error can't find the container with id faea1d2111dc285d025f672510115bdc5e407c2df0790b5f266d58e0148aa55c
Apr 19 12:09:49.403664 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:49.403558 2572 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-18 12:04:47 +0000 UTC" deadline="2028-01-06 00:44:43.138214298 +0000 UTC"
Apr 19 12:09:49.403664 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:49.403599 2572 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15036h34m53.734617631s"
Apr 19 12:09:49.493668 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:49.493637 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2v66h"
Apr 19 12:09:49.493854 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:49.493778 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2v66h" podUID="29c3436d-ad1d-4386-a9d8-894a3c87dbfc"
Apr 19 12:09:49.494259 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:49.494235 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-67dv9"
Apr 19 12:09:49.494361 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:49.494346 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-67dv9" podUID="2235efd3-6b65-47c7-acb5-eb9aa104beb5"
Apr 19 12:09:49.507004 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:49.506974 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sqxd4" event={"ID":"7944a8e8-84c1-4ed9-81a3-27e357098d48","Type":"ContainerStarted","Data":"58349ed78416b4f5a82e75d30ba939ac122771275c0e48bf77d49505f16bcb46"}
Apr 19 12:09:49.508929 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:49.508864 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" event={"ID":"473da951-aee0-495d-9114-913f8c86e8e0","Type":"ContainerStarted","Data":"ebb013d46cc18b5d7339566df231f2e1b4f703dfde490fc12c1113361404278a"}
Apr 19 12:09:49.512336 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:49.512290 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qbr6x" event={"ID":"6799f9ef-a53a-465c-ae04-e93d43acaea2","Type":"ContainerStarted","Data":"47f6ac1738274fc86d33221ed2c40da9c329e97298146499fed880156c8d6a0f"}
Apr 19 12:09:49.517887 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:49.517553 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wbtqf" event={"ID":"a78a63a6-1b78-4253-87ed-341aaf44e16f","Type":"ContainerStarted","Data":"f6fa0eab78e8b158eedc31b638c29d063c308f4b1e53eba1a6baf28f0f95566f"}
Apr 19 12:09:49.520932 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:49.520909 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mdl6z" event={"ID":"5982f902-227e-4317-9969-a90aa85d5e40","Type":"ContainerStarted","Data":"0358c0f7cdcce311fc7df08ebeda58c84ea7954cf2eac38ee3a57b8a1b67d240"}
Apr 19 12:09:49.524521 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:49.524497 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-237.ec2.internal" event={"ID":"5c7cf47197c65590d0f81e01fc4711d8","Type":"ContainerStarted","Data":"f8ea78834012d2f61a0dbb42ba6e902906657e6a49c2705dc902b1b8f727e8f9"}
Apr 19 12:09:49.529704 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:49.529511 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp" event={"ID":"a069367b-8d8c-414b-b7f7-af6d4738d3ba","Type":"ContainerStarted","Data":"faea1d2111dc285d025f672510115bdc5e407c2df0790b5f266d58e0148aa55c"}
Apr 19 12:09:49.532193 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:49.532002 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" event={"ID":"ce238b24-1370-4ed8-afca-1aa7573e56b7","Type":"ContainerStarted","Data":"25dd9bfe78135be71b3444dea4a56076142bc76bc1026b8fd50d88f54d89cc51"}
Apr 19 12:09:49.534167 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:49.534118 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-m897s" event={"ID":"d6bccaec-31af-4abb-8335-372cc94da827","Type":"ContainerStarted","Data":"cf95718b9f63fb47ca162f25d7cbec7221c17b8c0e1f372d7d2833dd7e32b74d"}
Apr 19 12:09:49.535715 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:49.535677 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cpvgj" event={"ID":"afd1b409-8b1c-4984-996b-e66960d52ffc","Type":"ContainerStarted","Data":"1b51b285d2be4c843d0bf4e2435d0fadc77f921fd661a3a10ae41fc02ad3456b"}
Apr 19 12:09:49.548791 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:49.548173 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-140-237.ec2.internal" podStartSLOduration=2.548157786 podStartE2EDuration="2.548157786s" podCreationTimestamp="2026-04-19 12:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:09:49.547676059 +0000 UTC m=+3.614847951" watchObservedRunningTime="2026-04-19 12:09:49.548157786 +0000 UTC m=+3.615329679"
Apr 19 12:09:49.999386 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:49.999302 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2235efd3-6b65-47c7-acb5-eb9aa104beb5-metrics-certs\") pod \"network-metrics-daemon-67dv9\" (UID: \"2235efd3-6b65-47c7-acb5-eb9aa104beb5\") " pod="openshift-multus/network-metrics-daemon-67dv9"
Apr 19 12:09:49.999557 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:49.999462 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:09:49.999557 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:49.999526 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2235efd3-6b65-47c7-acb5-eb9aa104beb5-metrics-certs podName:2235efd3-6b65-47c7-acb5-eb9aa104beb5 nodeName:}" failed. No retries permitted until 2026-04-19 12:09:51.999502631 +0000 UTC m=+6.066674502 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2235efd3-6b65-47c7-acb5-eb9aa104beb5-metrics-certs") pod "network-metrics-daemon-67dv9" (UID: "2235efd3-6b65-47c7-acb5-eb9aa104beb5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:09:50.101249 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:50.100356 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67tn8\" (UniqueName: \"kubernetes.io/projected/29c3436d-ad1d-4386-a9d8-894a3c87dbfc-kube-api-access-67tn8\") pod \"network-check-target-2v66h\" (UID: \"29c3436d-ad1d-4386-a9d8-894a3c87dbfc\") " pod="openshift-network-diagnostics/network-check-target-2v66h" Apr 19 12:09:50.101249 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:50.100550 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 12:09:50.101249 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:50.100577 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 12:09:50.101249 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:50.100590 2572 projected.go:194] Error preparing data for projected volume kube-api-access-67tn8 for pod openshift-network-diagnostics/network-check-target-2v66h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:09:50.101249 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:50.100651 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29c3436d-ad1d-4386-a9d8-894a3c87dbfc-kube-api-access-67tn8 podName:29c3436d-ad1d-4386-a9d8-894a3c87dbfc nodeName:}" failed. 
No retries permitted until 2026-04-19 12:09:52.100633071 +0000 UTC m=+6.167804943 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-67tn8" (UniqueName: "kubernetes.io/projected/29c3436d-ad1d-4386-a9d8-894a3c87dbfc-kube-api-access-67tn8") pod "network-check-target-2v66h" (UID: "29c3436d-ad1d-4386-a9d8-894a3c87dbfc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:09:50.550541 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:50.550501 2572 generic.go:358] "Generic (PLEG): container finished" podID="071de6a6b9846878c25eec7055bc6997" containerID="49ad90cf0ac05036d835b1bbb384cb5e04c600015ca8cd09f98abd66568b4642" exitCode=0 Apr 19 12:09:50.551093 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:50.550640 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-237.ec2.internal" event={"ID":"071de6a6b9846878c25eec7055bc6997","Type":"ContainerDied","Data":"49ad90cf0ac05036d835b1bbb384cb5e04c600015ca8cd09f98abd66568b4642"} Apr 19 12:09:51.493192 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:51.493161 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2v66h" Apr 19 12:09:51.493374 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:51.493161 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-67dv9" Apr 19 12:09:51.493374 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:51.493295 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-2v66h" podUID="29c3436d-ad1d-4386-a9d8-894a3c87dbfc" Apr 19 12:09:51.493490 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:51.493346 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-67dv9" podUID="2235efd3-6b65-47c7-acb5-eb9aa104beb5" Apr 19 12:09:51.564104 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:51.563332 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-237.ec2.internal" event={"ID":"071de6a6b9846878c25eec7055bc6997","Type":"ContainerStarted","Data":"d017c6b6791465aec8576235fb86954c7673ec0dab33f73bd995e683ef6f7180"} Apr 19 12:09:52.019467 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:52.019429 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2235efd3-6b65-47c7-acb5-eb9aa104beb5-metrics-certs\") pod \"network-metrics-daemon-67dv9\" (UID: \"2235efd3-6b65-47c7-acb5-eb9aa104beb5\") " pod="openshift-multus/network-metrics-daemon-67dv9" Apr 19 12:09:52.019668 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:52.019578 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:09:52.019726 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:52.019674 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2235efd3-6b65-47c7-acb5-eb9aa104beb5-metrics-certs podName:2235efd3-6b65-47c7-acb5-eb9aa104beb5 nodeName:}" failed. No retries permitted until 2026-04-19 12:09:56.019654736 +0000 UTC m=+10.086826607 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2235efd3-6b65-47c7-acb5-eb9aa104beb5-metrics-certs") pod "network-metrics-daemon-67dv9" (UID: "2235efd3-6b65-47c7-acb5-eb9aa104beb5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:09:52.037593 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:52.037538 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-140-237.ec2.internal" podStartSLOduration=5.037519169 podStartE2EDuration="5.037519169s" podCreationTimestamp="2026-04-19 12:09:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:09:51.578588426 +0000 UTC m=+5.645760319" watchObservedRunningTime="2026-04-19 12:09:52.037519169 +0000 UTC m=+6.104691061" Apr 19 12:09:52.038634 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:52.038600 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-pccgh"] Apr 19 12:09:52.041418 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:52.041399 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pccgh" Apr 19 12:09:52.041528 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:52.041474 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-pccgh" podUID="4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82" Apr 19 12:09:52.120130 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:52.120051 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67tn8\" (UniqueName: \"kubernetes.io/projected/29c3436d-ad1d-4386-a9d8-894a3c87dbfc-kube-api-access-67tn8\") pod \"network-check-target-2v66h\" (UID: \"29c3436d-ad1d-4386-a9d8-894a3c87dbfc\") " pod="openshift-network-diagnostics/network-check-target-2v66h" Apr 19 12:09:52.120130 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:52.120114 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82-kubelet-config\") pod \"global-pull-secret-syncer-pccgh\" (UID: \"4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82\") " pod="kube-system/global-pull-secret-syncer-pccgh" Apr 19 12:09:52.120359 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:52.120146 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82-dbus\") pod \"global-pull-secret-syncer-pccgh\" (UID: \"4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82\") " pod="kube-system/global-pull-secret-syncer-pccgh" Apr 19 12:09:52.120359 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:52.120202 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 12:09:52.120359 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:52.120228 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 12:09:52.120359 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:52.120240 2572 
projected.go:194] Error preparing data for projected volume kube-api-access-67tn8 for pod openshift-network-diagnostics/network-check-target-2v66h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:09:52.120359 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:52.120296 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29c3436d-ad1d-4386-a9d8-894a3c87dbfc-kube-api-access-67tn8 podName:29c3436d-ad1d-4386-a9d8-894a3c87dbfc nodeName:}" failed. No retries permitted until 2026-04-19 12:09:56.120276209 +0000 UTC m=+10.187448090 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-67tn8" (UniqueName: "kubernetes.io/projected/29c3436d-ad1d-4386-a9d8-894a3c87dbfc-kube-api-access-67tn8") pod "network-check-target-2v66h" (UID: "29c3436d-ad1d-4386-a9d8-894a3c87dbfc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:09:52.120359 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:52.120208 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82-original-pull-secret\") pod \"global-pull-secret-syncer-pccgh\" (UID: \"4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82\") " pod="kube-system/global-pull-secret-syncer-pccgh" Apr 19 12:09:52.222029 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:52.221505 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82-kubelet-config\") pod \"global-pull-secret-syncer-pccgh\" (UID: \"4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82\") " pod="kube-system/global-pull-secret-syncer-pccgh" Apr 19 12:09:52.222029 
ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:52.221553 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82-dbus\") pod \"global-pull-secret-syncer-pccgh\" (UID: \"4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82\") " pod="kube-system/global-pull-secret-syncer-pccgh" Apr 19 12:09:52.222029 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:52.221583 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82-original-pull-secret\") pod \"global-pull-secret-syncer-pccgh\" (UID: \"4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82\") " pod="kube-system/global-pull-secret-syncer-pccgh" Apr 19 12:09:52.222029 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:52.221660 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82-kubelet-config\") pod \"global-pull-secret-syncer-pccgh\" (UID: \"4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82\") " pod="kube-system/global-pull-secret-syncer-pccgh" Apr 19 12:09:52.222029 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:52.221725 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82-dbus\") pod \"global-pull-secret-syncer-pccgh\" (UID: \"4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82\") " pod="kube-system/global-pull-secret-syncer-pccgh" Apr 19 12:09:52.222029 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:52.221740 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 19 12:09:52.222029 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:52.221822 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82-original-pull-secret podName:4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82 nodeName:}" failed. No retries permitted until 2026-04-19 12:09:52.721802677 +0000 UTC m=+6.788974551 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82-original-pull-secret") pod "global-pull-secret-syncer-pccgh" (UID: "4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82") : object "kube-system"/"original-pull-secret" not registered Apr 19 12:09:52.726675 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:52.726640 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82-original-pull-secret\") pod \"global-pull-secret-syncer-pccgh\" (UID: \"4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82\") " pod="kube-system/global-pull-secret-syncer-pccgh" Apr 19 12:09:52.727140 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:52.726857 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 19 12:09:52.727140 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:52.726917 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82-original-pull-secret podName:4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82 nodeName:}" failed. No retries permitted until 2026-04-19 12:09:53.726896214 +0000 UTC m=+7.794068084 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82-original-pull-secret") pod "global-pull-secret-syncer-pccgh" (UID: "4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82") : object "kube-system"/"original-pull-secret" not registered Apr 19 12:09:53.493791 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:53.493736 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2v66h" Apr 19 12:09:53.493995 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:53.493874 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-67dv9" Apr 19 12:09:53.493995 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:53.493912 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2v66h" podUID="29c3436d-ad1d-4386-a9d8-894a3c87dbfc" Apr 19 12:09:53.493995 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:53.493983 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-67dv9" podUID="2235efd3-6b65-47c7-acb5-eb9aa104beb5" Apr 19 12:09:53.494164 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:53.494029 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-pccgh" Apr 19 12:09:53.494164 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:53.494105 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pccgh" podUID="4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82" Apr 19 12:09:53.735933 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:53.735891 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82-original-pull-secret\") pod \"global-pull-secret-syncer-pccgh\" (UID: \"4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82\") " pod="kube-system/global-pull-secret-syncer-pccgh" Apr 19 12:09:53.736405 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:53.736070 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 19 12:09:53.736405 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:53.736132 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82-original-pull-secret podName:4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82 nodeName:}" failed. No retries permitted until 2026-04-19 12:09:55.736113303 +0000 UTC m=+9.803285175 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82-original-pull-secret") pod "global-pull-secret-syncer-pccgh" (UID: "4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82") : object "kube-system"/"original-pull-secret" not registered Apr 19 12:09:55.493041 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:55.492966 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2v66h" Apr 19 12:09:55.493041 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:55.493010 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-67dv9" Apr 19 12:09:55.493553 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:55.492981 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pccgh" Apr 19 12:09:55.493553 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:55.493094 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2v66h" podUID="29c3436d-ad1d-4386-a9d8-894a3c87dbfc" Apr 19 12:09:55.493553 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:55.493179 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-pccgh" podUID="4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82" Apr 19 12:09:55.493553 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:55.493281 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-67dv9" podUID="2235efd3-6b65-47c7-acb5-eb9aa104beb5" Apr 19 12:09:55.752918 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:55.752879 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82-original-pull-secret\") pod \"global-pull-secret-syncer-pccgh\" (UID: \"4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82\") " pod="kube-system/global-pull-secret-syncer-pccgh" Apr 19 12:09:55.753079 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:55.753057 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 19 12:09:55.753170 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:55.753128 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82-original-pull-secret podName:4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82 nodeName:}" failed. No retries permitted until 2026-04-19 12:09:59.753110924 +0000 UTC m=+13.820282796 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82-original-pull-secret") pod "global-pull-secret-syncer-pccgh" (UID: "4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82") : object "kube-system"/"original-pull-secret" not registered Apr 19 12:09:56.055544 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:56.055403 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2235efd3-6b65-47c7-acb5-eb9aa104beb5-metrics-certs\") pod \"network-metrics-daemon-67dv9\" (UID: \"2235efd3-6b65-47c7-acb5-eb9aa104beb5\") " pod="openshift-multus/network-metrics-daemon-67dv9" Apr 19 12:09:56.055698 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:56.055573 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:09:56.055698 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:56.055646 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2235efd3-6b65-47c7-acb5-eb9aa104beb5-metrics-certs podName:2235efd3-6b65-47c7-acb5-eb9aa104beb5 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:04.055625958 +0000 UTC m=+18.122797854 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2235efd3-6b65-47c7-acb5-eb9aa104beb5-metrics-certs") pod "network-metrics-daemon-67dv9" (UID: "2235efd3-6b65-47c7-acb5-eb9aa104beb5") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 19 12:09:56.156877 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:56.156613 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67tn8\" (UniqueName: \"kubernetes.io/projected/29c3436d-ad1d-4386-a9d8-894a3c87dbfc-kube-api-access-67tn8\") pod \"network-check-target-2v66h\" (UID: \"29c3436d-ad1d-4386-a9d8-894a3c87dbfc\") " pod="openshift-network-diagnostics/network-check-target-2v66h" Apr 19 12:09:56.156877 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:56.156854 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 19 12:09:56.156877 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:56.156875 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 19 12:09:56.156877 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:56.156888 2572 projected.go:194] Error preparing data for projected volume kube-api-access-67tn8 for pod openshift-network-diagnostics/network-check-target-2v66h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 19 12:09:56.157187 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:56.156950 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29c3436d-ad1d-4386-a9d8-894a3c87dbfc-kube-api-access-67tn8 podName:29c3436d-ad1d-4386-a9d8-894a3c87dbfc nodeName:}" failed. 
No retries permitted until 2026-04-19 12:10:04.156931219 +0000 UTC m=+18.224103089 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-67tn8" (UniqueName: "kubernetes.io/projected/29c3436d-ad1d-4386-a9d8-894a3c87dbfc-kube-api-access-67tn8") pod "network-check-target-2v66h" (UID: "29c3436d-ad1d-4386-a9d8-894a3c87dbfc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:09:57.493363 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:57.493327 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-67dv9"
Apr 19 12:09:57.493840 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:57.493366 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pccgh"
Apr 19 12:09:57.493840 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:57.493327 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2v66h"
Apr 19 12:09:57.493840 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:57.493472 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-67dv9" podUID="2235efd3-6b65-47c7-acb5-eb9aa104beb5"
Apr 19 12:09:57.493840 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:57.493607 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2v66h" podUID="29c3436d-ad1d-4386-a9d8-894a3c87dbfc"
Apr 19 12:09:57.493840 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:57.493693 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pccgh" podUID="4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82"
Apr 19 12:09:59.493517 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:59.493477 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pccgh"
Apr 19 12:09:59.493973 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:59.493558 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-67dv9"
Apr 19 12:09:59.493973 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:59.493644 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pccgh" podUID="4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82"
Apr 19 12:09:59.493973 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:59.493664 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2v66h"
Apr 19 12:09:59.493973 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:59.493723 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-67dv9" podUID="2235efd3-6b65-47c7-acb5-eb9aa104beb5"
Apr 19 12:09:59.493973 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:59.493813 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2v66h" podUID="29c3436d-ad1d-4386-a9d8-894a3c87dbfc"
Apr 19 12:09:59.787265 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:09:59.787225 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82-original-pull-secret\") pod \"global-pull-secret-syncer-pccgh\" (UID: \"4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82\") " pod="kube-system/global-pull-secret-syncer-pccgh"
Apr 19 12:09:59.787438 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:59.787363 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 19 12:09:59.787438 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:09:59.787434 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82-original-pull-secret podName:4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:07.787415683 +0000 UTC m=+21.854587552 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82-original-pull-secret") pod "global-pull-secret-syncer-pccgh" (UID: "4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82") : object "kube-system"/"original-pull-secret" not registered
Apr 19 12:10:01.493260 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:01.493227 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pccgh"
Apr 19 12:10:01.493599 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:01.493227 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2v66h"
Apr 19 12:10:01.493599 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:01.493327 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pccgh" podUID="4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82"
Apr 19 12:10:01.493599 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:01.493392 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2v66h" podUID="29c3436d-ad1d-4386-a9d8-894a3c87dbfc"
Apr 19 12:10:01.493599 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:01.493227 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-67dv9"
Apr 19 12:10:01.493599 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:01.493500 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-67dv9" podUID="2235efd3-6b65-47c7-acb5-eb9aa104beb5"
Apr 19 12:10:03.492977 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:03.492927 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-67dv9"
Apr 19 12:10:03.492977 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:03.492958 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2v66h"
Apr 19 12:10:03.493514 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:03.492967 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pccgh"
Apr 19 12:10:03.493514 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:03.493061 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-67dv9" podUID="2235efd3-6b65-47c7-acb5-eb9aa104beb5"
Apr 19 12:10:03.493514 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:03.493131 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2v66h" podUID="29c3436d-ad1d-4386-a9d8-894a3c87dbfc"
Apr 19 12:10:03.493514 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:03.493235 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pccgh" podUID="4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82"
Apr 19 12:10:04.116641 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:04.116601 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2235efd3-6b65-47c7-acb5-eb9aa104beb5-metrics-certs\") pod \"network-metrics-daemon-67dv9\" (UID: \"2235efd3-6b65-47c7-acb5-eb9aa104beb5\") " pod="openshift-multus/network-metrics-daemon-67dv9"
Apr 19 12:10:04.116861 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:04.116725 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:10:04.116861 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:04.116808 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2235efd3-6b65-47c7-acb5-eb9aa104beb5-metrics-certs podName:2235efd3-6b65-47c7-acb5-eb9aa104beb5 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:20.116787668 +0000 UTC m=+34.183959550 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2235efd3-6b65-47c7-acb5-eb9aa104beb5-metrics-certs") pod "network-metrics-daemon-67dv9" (UID: "2235efd3-6b65-47c7-acb5-eb9aa104beb5") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 19 12:10:04.217624 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:04.217585 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67tn8\" (UniqueName: \"kubernetes.io/projected/29c3436d-ad1d-4386-a9d8-894a3c87dbfc-kube-api-access-67tn8\") pod \"network-check-target-2v66h\" (UID: \"29c3436d-ad1d-4386-a9d8-894a3c87dbfc\") " pod="openshift-network-diagnostics/network-check-target-2v66h"
Apr 19 12:10:04.217820 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:04.217778 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 19 12:10:04.217820 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:04.217801 2572 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 19 12:10:04.217820 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:04.217811 2572 projected.go:194] Error preparing data for projected volume kube-api-access-67tn8 for pod openshift-network-diagnostics/network-check-target-2v66h: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:10:04.217955 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:04.217870 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/29c3436d-ad1d-4386-a9d8-894a3c87dbfc-kube-api-access-67tn8 podName:29c3436d-ad1d-4386-a9d8-894a3c87dbfc nodeName:}" failed. No retries permitted until 2026-04-19 12:10:20.217851519 +0000 UTC m=+34.285023404 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-67tn8" (UniqueName: "kubernetes.io/projected/29c3436d-ad1d-4386-a9d8-894a3c87dbfc-kube-api-access-67tn8") pod "network-check-target-2v66h" (UID: "29c3436d-ad1d-4386-a9d8-894a3c87dbfc") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 19 12:10:05.493921 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:05.493887 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pccgh"
Apr 19 12:10:05.494393 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:05.493914 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2v66h"
Apr 19 12:10:05.494393 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:05.493894 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-67dv9"
Apr 19 12:10:05.494393 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:05.494004 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pccgh" podUID="4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82"
Apr 19 12:10:05.494393 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:05.494103 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-67dv9" podUID="2235efd3-6b65-47c7-acb5-eb9aa104beb5"
Apr 19 12:10:05.494393 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:05.494182 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2v66h" podUID="29c3436d-ad1d-4386-a9d8-894a3c87dbfc"
Apr 19 12:10:06.588156 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:06.587926 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp" event={"ID":"a069367b-8d8c-414b-b7f7-af6d4738d3ba","Type":"ContainerStarted","Data":"e0d228f84d5477beb6bda6f70f40d5440bd193bade79bbfbf6c58fa218bf46eb"}
Apr 19 12:10:06.589176 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:06.589154 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" event={"ID":"ce238b24-1370-4ed8-afca-1aa7573e56b7","Type":"ContainerStarted","Data":"6a42bc249c58e3a998aff52349ae9e5b30afd4fa69a4cddae29c369a7d1c044c"}
Apr 19 12:10:06.590423 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:06.590402 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-m897s" event={"ID":"d6bccaec-31af-4abb-8335-372cc94da827","Type":"ContainerStarted","Data":"30de6729679ebc431c8d383ddf0f722a37e426497672c7aded04fe992815239c"}
Apr 19 12:10:06.591582 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:06.591562 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-cpvgj" event={"ID":"afd1b409-8b1c-4984-996b-e66960d52ffc","Type":"ContainerStarted","Data":"3be573dc422885426f318587f1a237390cfc6b5dd73a968e0e8c29e53cf8fa74"}
Apr 19 12:10:06.593057 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:06.593032 2572 generic.go:358] "Generic (PLEG): container finished" podID="7944a8e8-84c1-4ed9-81a3-27e357098d48" containerID="5c2af02835ac80069b049be964d24244014e1c25d3cedfe6c872e9436e6822d0" exitCode=0
Apr 19 12:10:06.593140 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:06.593118 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sqxd4" event={"ID":"7944a8e8-84c1-4ed9-81a3-27e357098d48","Type":"ContainerDied","Data":"5c2af02835ac80069b049be964d24244014e1c25d3cedfe6c872e9436e6822d0"}
Apr 19 12:10:06.594677 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:06.594658 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" event={"ID":"473da951-aee0-495d-9114-913f8c86e8e0","Type":"ContainerStarted","Data":"f0d2e673a6ed31da56632b8c1e8151332ae78ac62bcd49526450df4f3c282c08"}
Apr 19 12:10:06.594742 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:06.594684 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" event={"ID":"473da951-aee0-495d-9114-913f8c86e8e0","Type":"ContainerStarted","Data":"63565b224088a8cbc211c847189e5d7bd5e7a62f0ad93c2aac6962af528c4773"}
Apr 19 12:10:06.595793 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:06.595745 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wbtqf" event={"ID":"a78a63a6-1b78-4253-87ed-341aaf44e16f","Type":"ContainerStarted","Data":"0308388ff2ea30e6bf516cbba8a827a1f1f249ab15e2303a2d83fa7184541a85"}
Apr 19 12:10:06.597739 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:06.597440 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mdl6z" event={"ID":"5982f902-227e-4317-9969-a90aa85d5e40","Type":"ContainerStarted","Data":"405b5a190b96625987b7c42b9ccb389bfe4244432acf1c469696991a123dbf7b"}
Apr 19 12:10:06.604415 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:06.604360 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-ls4sq" podStartSLOduration=3.743522201 podStartE2EDuration="20.604343057s" podCreationTimestamp="2026-04-19 12:09:46 +0000 UTC" firstStartedPulling="2026-04-19 12:09:49.112259551 +0000 UTC m=+3.179431428" lastFinishedPulling="2026-04-19 12:10:05.973080408 +0000 UTC m=+20.040252284" observedRunningTime="2026-04-19 12:10:06.603446954 +0000 UTC m=+20.670618848" watchObservedRunningTime="2026-04-19 12:10:06.604343057 +0000 UTC m=+20.671514949"
Apr 19 12:10:06.617775 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:06.617728 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-m897s" podStartSLOduration=3.75660382 podStartE2EDuration="20.617715005s" podCreationTimestamp="2026-04-19 12:09:46 +0000 UTC" firstStartedPulling="2026-04-19 12:09:49.111949159 +0000 UTC m=+3.179121028" lastFinishedPulling="2026-04-19 12:10:05.973060324 +0000 UTC m=+20.040232213" observedRunningTime="2026-04-19 12:10:06.617623405 +0000 UTC m=+20.684795297" watchObservedRunningTime="2026-04-19 12:10:06.617715005 +0000 UTC m=+20.684886897"
Apr 19 12:10:06.648638 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:06.648588 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-mdl6z" podStartSLOduration=3.776046923 podStartE2EDuration="20.648571223s" podCreationTimestamp="2026-04-19 12:09:46 +0000 UTC" firstStartedPulling="2026-04-19 12:09:49.100665095 +0000 UTC m=+3.167836978" lastFinishedPulling="2026-04-19 12:10:05.973189409 +0000 UTC m=+20.040361278" observedRunningTime="2026-04-19 12:10:06.648147992 +0000 UTC m=+20.715319882" watchObservedRunningTime="2026-04-19 12:10:06.648571223 +0000 UTC m=+20.715743142"
Apr 19 12:10:06.687493 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:06.687452 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-wbtqf" podStartSLOduration=3.815351474 podStartE2EDuration="20.687436774s" podCreationTimestamp="2026-04-19 12:09:46 +0000 UTC" firstStartedPulling="2026-04-19 12:09:49.100976305 +0000 UTC m=+3.168148176" lastFinishedPulling="2026-04-19 12:10:05.973061588 +0000 UTC m=+20.040233476" observedRunningTime="2026-04-19 12:10:06.686851795 +0000 UTC m=+20.754023686" watchObservedRunningTime="2026-04-19 12:10:06.687436774 +0000 UTC m=+20.754608656"
Apr 19 12:10:06.688167 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:06.688137 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-cpvgj" podStartSLOduration=3.670108396 podStartE2EDuration="20.688128707s" podCreationTimestamp="2026-04-19 12:09:46 +0000 UTC" firstStartedPulling="2026-04-19 12:09:49.11165317 +0000 UTC m=+3.178825038" lastFinishedPulling="2026-04-19 12:10:06.129673469 +0000 UTC m=+20.196845349" observedRunningTime="2026-04-19 12:10:06.663076426 +0000 UTC m=+20.730248316" watchObservedRunningTime="2026-04-19 12:10:06.688128707 +0000 UTC m=+20.755300603"
Apr 19 12:10:07.184172 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:07.183972 2572 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 19 12:10:07.433373 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:07.433210 2572 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-19T12:10:07.184167245Z","UUID":"e623db45-3ea8-4aa4-98f2-e2375811bddf","Handler":null,"Name":"","Endpoint":""}
Apr 19 12:10:07.435178 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:07.435148 2572 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 19 12:10:07.435178 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:07.435182 2572 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 19 12:10:07.493195 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:07.493164 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2v66h"
Apr 19 12:10:07.493367 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:07.493164 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-67dv9"
Apr 19 12:10:07.493367 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:07.493273 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2v66h" podUID="29c3436d-ad1d-4386-a9d8-894a3c87dbfc"
Apr 19 12:10:07.493497 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:07.493164 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pccgh"
Apr 19 12:10:07.493497 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:07.493406 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-67dv9" podUID="2235efd3-6b65-47c7-acb5-eb9aa104beb5"
Apr 19 12:10:07.493598 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:07.493491 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pccgh" podUID="4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82"
Apr 19 12:10:07.603187 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:07.603104 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" event={"ID":"473da951-aee0-495d-9114-913f8c86e8e0","Type":"ContainerStarted","Data":"e9a95575cba2aea5b38abc7fe47fb041a64c654da043e9e45a18ee641a6be923"}
Apr 19 12:10:07.603187 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:07.603147 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" event={"ID":"473da951-aee0-495d-9114-913f8c86e8e0","Type":"ContainerStarted","Data":"710aba92fb29e42d695d9b3d0edb76bc2a4d62c09ba25f26fe01545006a7e26a"}
Apr 19 12:10:07.603187 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:07.603161 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" event={"ID":"473da951-aee0-495d-9114-913f8c86e8e0","Type":"ContainerStarted","Data":"e67f0f35e48dbb874fbb88575bf5b70952eec2e7b548051a2b047fe7a048a8d5"}
Apr 19 12:10:07.603187 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:07.603174 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" event={"ID":"473da951-aee0-495d-9114-913f8c86e8e0","Type":"ContainerStarted","Data":"60e414a0951cd75a4fbcfdb1fc7d48159d60552baac459f9f32ba16afbdc0e71"}
Apr 19 12:10:07.604559 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:07.604530 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-qbr6x" event={"ID":"6799f9ef-a53a-465c-ae04-e93d43acaea2","Type":"ContainerStarted","Data":"cf8ec53c242f7455352306d7c60cc9033c69079db88cda92c7c0edc561338478"}
Apr 19 12:10:07.606386 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:07.606363 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp" event={"ID":"a069367b-8d8c-414b-b7f7-af6d4738d3ba","Type":"ContainerStarted","Data":"9dea91137aa1fa5405e9ff5826ae6201f7056b0b901f0545eb1a95f64ca860d3"}
Apr 19 12:10:07.618472 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:07.618423 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-qbr6x" podStartSLOduration=4.753585677 podStartE2EDuration="21.618407948s" podCreationTimestamp="2026-04-19 12:09:46 +0000 UTC" firstStartedPulling="2026-04-19 12:09:49.108552573 +0000 UTC m=+3.175724443" lastFinishedPulling="2026-04-19 12:10:05.973374834 +0000 UTC m=+20.040546714" observedRunningTime="2026-04-19 12:10:07.617560581 +0000 UTC m=+21.684732488" watchObservedRunningTime="2026-04-19 12:10:07.618407948 +0000 UTC m=+21.685579844"
Apr 19 12:10:07.845096 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:07.845056 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82-original-pull-secret\") pod \"global-pull-secret-syncer-pccgh\" (UID: \"4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82\") " pod="kube-system/global-pull-secret-syncer-pccgh"
Apr 19 12:10:07.845312 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:07.845192 2572 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 19 12:10:07.845312 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:07.845274 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82-original-pull-secret podName:4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:23.845255037 +0000 UTC m=+37.912426910 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82-original-pull-secret") pod "global-pull-secret-syncer-pccgh" (UID: "4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82") : object "kube-system"/"original-pull-secret" not registered
Apr 19 12:10:08.610455 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:08.610408 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp" event={"ID":"a069367b-8d8c-414b-b7f7-af6d4738d3ba","Type":"ContainerStarted","Data":"8e1373e0a3afe5deeb11d1ddfc83129d80571be34fae0f63a4d34cd514c0747d"}
Apr 19 12:10:08.627659 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:08.627609 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-x62gp" podStartSLOduration=3.739527677 podStartE2EDuration="22.627593245s" podCreationTimestamp="2026-04-19 12:09:46 +0000 UTC" firstStartedPulling="2026-04-19 12:09:49.112865343 +0000 UTC m=+3.180037211" lastFinishedPulling="2026-04-19 12:10:08.000930896 +0000 UTC m=+22.068102779" observedRunningTime="2026-04-19 12:10:08.6273036 +0000 UTC m=+22.694475492" watchObservedRunningTime="2026-04-19 12:10:08.627593245 +0000 UTC m=+22.694765137"
Apr 19 12:10:08.845009 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:08.844981 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-wbtqf"
Apr 19 12:10:08.845530 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:08.845513 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-wbtqf"
Apr 19 12:10:09.493639 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:09.493607 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2v66h"
Apr 19 12:10:09.493871 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:09.493611 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-67dv9"
Apr 19 12:10:09.493871 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:09.493722 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2v66h" podUID="29c3436d-ad1d-4386-a9d8-894a3c87dbfc"
Apr 19 12:10:09.493871 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:09.493611 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pccgh"
Apr 19 12:10:09.494031 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:09.493966 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pccgh" podUID="4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82"
Apr 19 12:10:09.494031 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:09.493837 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-67dv9" podUID="2235efd3-6b65-47c7-acb5-eb9aa104beb5"
Apr 19 12:10:09.616276 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:09.616216 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" event={"ID":"473da951-aee0-495d-9114-913f8c86e8e0","Type":"ContainerStarted","Data":"8aef5c7f03fb810ca99913b16db47def68c4aa8ad62e5ebdfee3c27906446e15"}
Apr 19 12:10:11.493831 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:11.493621 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pccgh"
Apr 19 12:10:11.494406 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:11.493621 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2v66h"
Apr 19 12:10:11.494406 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:11.493621 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-67dv9"
Apr 19 12:10:11.494406 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:11.493937 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pccgh" podUID="4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82"
Apr 19 12:10:11.494406 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:11.494022 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2v66h" podUID="29c3436d-ad1d-4386-a9d8-894a3c87dbfc"
Apr 19 12:10:11.494406 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:11.494098 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-67dv9" podUID="2235efd3-6b65-47c7-acb5-eb9aa104beb5"
Apr 19 12:10:11.623456 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:11.623418 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" event={"ID":"473da951-aee0-495d-9114-913f8c86e8e0","Type":"ContainerStarted","Data":"0622c886f8698bafe9de624e301a9f981908647c4eafc1e992ad8f2b8388e797"}
Apr 19 12:10:11.623775 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:11.623735 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:10:11.625242 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:11.625209 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sqxd4" event={"ID":"7944a8e8-84c1-4ed9-81a3-27e357098d48","Type":"ContainerStarted","Data":"7ba525e597c7eaea3399608ee5fba1f20e6e9e75327fe9bda20d4e90f9d72939"}
Apr 19 12:10:11.638421 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:11.638395 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:10:11.650706 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:11.650658 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5" podStartSLOduration=8.361042033 podStartE2EDuration="25.650644632s" podCreationTimestamp="2026-04-19 12:09:46 +0000 UTC" firstStartedPulling="2026-04-19 12:09:49.104335257 +0000 UTC m=+3.171507126" lastFinishedPulling="2026-04-19 12:10:06.393937843 +0000 UTC m=+20.461109725" observedRunningTime="2026-04-19 12:10:11.648958585 +0000 UTC m=+25.716130475" watchObservedRunningTime="2026-04-19 12:10:11.650644632 +0000 UTC m=+25.717816522"
Apr 19 12:10:12.633834 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:12.633785 2572 generic.go:358] "Generic (PLEG): container finished" podID="7944a8e8-84c1-4ed9-81a3-27e357098d48" containerID="7ba525e597c7eaea3399608ee5fba1f20e6e9e75327fe9bda20d4e90f9d72939" exitCode=0
Apr 19 12:10:12.634504 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:12.634460 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sqxd4" event={"ID":"7944a8e8-84c1-4ed9-81a3-27e357098d48","Type":"ContainerDied","Data":"7ba525e597c7eaea3399608ee5fba1f20e6e9e75327fe9bda20d4e90f9d72939"}
Apr 19 12:10:12.635080 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:12.635060 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:10:12.635232 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:12.635221 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:10:12.653072 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:12.653047 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:10:13.379058 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:13.378807 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pccgh"]
Apr 19 12:10:13.379218 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:13.379076 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2v66h"]
Apr 19 12:10:13.379218 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:13.379195 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2v66h"
Apr 19 12:10:13.379314 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:13.379297 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2v66h" podUID="29c3436d-ad1d-4386-a9d8-894a3c87dbfc"
Apr 19 12:10:13.379750 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:13.379692 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pccgh"
Apr 19 12:10:13.379897 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:13.379807 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pccgh" podUID="4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82"
Apr 19 12:10:13.380359 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:13.380336 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-67dv9"]
Apr 19 12:10:13.380465 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:13.380450 2572 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-multus/network-metrics-daemon-67dv9" Apr 19 12:10:13.380612 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:13.380578 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-67dv9" podUID="2235efd3-6b65-47c7-acb5-eb9aa104beb5" Apr 19 12:10:13.637399 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:13.637370 2572 generic.go:358] "Generic (PLEG): container finished" podID="7944a8e8-84c1-4ed9-81a3-27e357098d48" containerID="d003cdb8b21c92709a1caf0c7147269def08083e4cd8df35c6ce0494b5045c99" exitCode=0 Apr 19 12:10:13.637858 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:13.637461 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sqxd4" event={"ID":"7944a8e8-84c1-4ed9-81a3-27e357098d48","Type":"ContainerDied","Data":"d003cdb8b21c92709a1caf0c7147269def08083e4cd8df35c6ce0494b5045c99"} Apr 19 12:10:14.285084 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:14.285059 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-wbtqf" Apr 19 12:10:14.285216 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:14.285179 2572 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 19 12:10:14.285610 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:14.285594 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-wbtqf" Apr 19 12:10:14.493886 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:14.493807 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2v66h" Apr 19 12:10:14.494030 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:14.493894 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2v66h" podUID="29c3436d-ad1d-4386-a9d8-894a3c87dbfc" Apr 19 12:10:14.640962 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:14.640935 2572 generic.go:358] "Generic (PLEG): container finished" podID="7944a8e8-84c1-4ed9-81a3-27e357098d48" containerID="0bcd13357a324d3cdf11dd4f8325c086b2e1ba5bbbd1fbe137d26b98621cfc55" exitCode=0 Apr 19 12:10:14.641319 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:14.641011 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sqxd4" event={"ID":"7944a8e8-84c1-4ed9-81a3-27e357098d48","Type":"ContainerDied","Data":"0bcd13357a324d3cdf11dd4f8325c086b2e1ba5bbbd1fbe137d26b98621cfc55"} Apr 19 12:10:15.493471 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:15.493438 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-67dv9" Apr 19 12:10:15.493621 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:15.493438 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pccgh" Apr 19 12:10:15.493621 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:15.493566 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-67dv9" podUID="2235efd3-6b65-47c7-acb5-eb9aa104beb5" Apr 19 12:10:15.494011 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:15.493980 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pccgh" podUID="4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82" Apr 19 12:10:16.494388 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:16.494357 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2v66h" Apr 19 12:10:16.495023 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:16.494467 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-2v66h" podUID="29c3436d-ad1d-4386-a9d8-894a3c87dbfc" Apr 19 12:10:17.493551 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:17.493512 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-67dv9" Apr 19 12:10:17.493726 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:17.493564 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-pccgh" Apr 19 12:10:17.493726 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:17.493679 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-67dv9" podUID="2235efd3-6b65-47c7-acb5-eb9aa104beb5" Apr 19 12:10:17.493851 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:17.493836 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-pccgh" podUID="4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82" Apr 19 12:10:17.720104 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:17.720073 2572 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-140-237.ec2.internal" event="NodeReady" Apr 19 12:10:17.720613 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:17.720213 2572 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 19 12:10:17.759146 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:17.759102 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-447kt"] Apr 19 12:10:17.779910 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:17.779882 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-h7xrv"] Apr 19 12:10:17.780066 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:17.780050 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-447kt" Apr 19 12:10:17.782399 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:17.782365 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 19 12:10:17.782518 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:17.782480 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 19 12:10:17.782573 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:17.782535 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7hvqw\"" Apr 19 12:10:17.794215 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:17.794187 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-447kt"] Apr 19 12:10:17.794320 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:17.794221 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-h7xrv"] Apr 19 12:10:17.794387 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:17.794321 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-h7xrv" Apr 19 12:10:17.796785 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:17.796751 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 19 12:10:17.796785 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:17.796752 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zfmpv\"" Apr 19 12:10:17.797099 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:17.797059 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 19 12:10:17.797099 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:17.797071 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 19 12:10:17.922782 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:17.922721 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq4x4\" (UniqueName: \"kubernetes.io/projected/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-kube-api-access-rq4x4\") pod \"ingress-canary-h7xrv\" (UID: \"105ea9fb-4797-44a9-a30b-6d58c96ae4d6\") " pod="openshift-ingress-canary/ingress-canary-h7xrv" Apr 19 12:10:17.922958 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:17.922847 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pnwkk\" (UniqueName: \"kubernetes.io/projected/f8b2bdcd-f07f-423f-a401-b0743ef1671f-kube-api-access-pnwkk\") pod \"dns-default-447kt\" (UID: \"f8b2bdcd-f07f-423f-a401-b0743ef1671f\") " pod="openshift-dns/dns-default-447kt" Apr 19 12:10:17.922958 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:17.922884 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8b2bdcd-f07f-423f-a401-b0743ef1671f-metrics-tls\") pod \"dns-default-447kt\" (UID: \"f8b2bdcd-f07f-423f-a401-b0743ef1671f\") " pod="openshift-dns/dns-default-447kt" Apr 19 12:10:17.922958 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:17.922906 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-cert\") pod \"ingress-canary-h7xrv\" (UID: \"105ea9fb-4797-44a9-a30b-6d58c96ae4d6\") " pod="openshift-ingress-canary/ingress-canary-h7xrv" Apr 19 12:10:17.922958 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:17.922930 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8b2bdcd-f07f-423f-a401-b0743ef1671f-config-volume\") pod \"dns-default-447kt\" (UID: \"f8b2bdcd-f07f-423f-a401-b0743ef1671f\") " pod="openshift-dns/dns-default-447kt" Apr 19 12:10:17.923093 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:17.922990 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f8b2bdcd-f07f-423f-a401-b0743ef1671f-tmp-dir\") pod \"dns-default-447kt\" (UID: \"f8b2bdcd-f07f-423f-a401-b0743ef1671f\") " pod="openshift-dns/dns-default-447kt" Apr 19 12:10:18.023746 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:18.023677 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pnwkk\" (UniqueName: \"kubernetes.io/projected/f8b2bdcd-f07f-423f-a401-b0743ef1671f-kube-api-access-pnwkk\") pod \"dns-default-447kt\" (UID: \"f8b2bdcd-f07f-423f-a401-b0743ef1671f\") " pod="openshift-dns/dns-default-447kt" Apr 19 12:10:18.023746 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:18.023713 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8b2bdcd-f07f-423f-a401-b0743ef1671f-metrics-tls\") pod \"dns-default-447kt\" (UID: \"f8b2bdcd-f07f-423f-a401-b0743ef1671f\") " pod="openshift-dns/dns-default-447kt" Apr 19 12:10:18.023746 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:18.023742 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-cert\") pod \"ingress-canary-h7xrv\" (UID: \"105ea9fb-4797-44a9-a30b-6d58c96ae4d6\") " pod="openshift-ingress-canary/ingress-canary-h7xrv" Apr 19 12:10:18.023988 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:18.023795 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8b2bdcd-f07f-423f-a401-b0743ef1671f-config-volume\") pod \"dns-default-447kt\" (UID: \"f8b2bdcd-f07f-423f-a401-b0743ef1671f\") " pod="openshift-dns/dns-default-447kt" Apr 19 12:10:18.023988 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:18.023844 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f8b2bdcd-f07f-423f-a401-b0743ef1671f-tmp-dir\") pod \"dns-default-447kt\" (UID: \"f8b2bdcd-f07f-423f-a401-b0743ef1671f\") " pod="openshift-dns/dns-default-447kt" Apr 19 12:10:18.023988 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:18.023857 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 19 12:10:18.023988 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:18.023899 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rq4x4\" (UniqueName: \"kubernetes.io/projected/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-kube-api-access-rq4x4\") pod \"ingress-canary-h7xrv\" (UID: \"105ea9fb-4797-44a9-a30b-6d58c96ae4d6\") " pod="openshift-ingress-canary/ingress-canary-h7xrv" Apr 19 
12:10:18.023988 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:18.023931 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b2bdcd-f07f-423f-a401-b0743ef1671f-metrics-tls podName:f8b2bdcd-f07f-423f-a401-b0743ef1671f nodeName:}" failed. No retries permitted until 2026-04-19 12:10:18.523911206 +0000 UTC m=+32.591083087 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f8b2bdcd-f07f-423f-a401-b0743ef1671f-metrics-tls") pod "dns-default-447kt" (UID: "f8b2bdcd-f07f-423f-a401-b0743ef1671f") : secret "dns-default-metrics-tls" not found Apr 19 12:10:18.024193 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:18.023986 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 19 12:10:18.024193 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:18.024053 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-cert podName:105ea9fb-4797-44a9-a30b-6d58c96ae4d6 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:18.524035834 +0000 UTC m=+32.591207707 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-cert") pod "ingress-canary-h7xrv" (UID: "105ea9fb-4797-44a9-a30b-6d58c96ae4d6") : secret "canary-serving-cert" not found Apr 19 12:10:18.024293 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:18.024273 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/f8b2bdcd-f07f-423f-a401-b0743ef1671f-tmp-dir\") pod \"dns-default-447kt\" (UID: \"f8b2bdcd-f07f-423f-a401-b0743ef1671f\") " pod="openshift-dns/dns-default-447kt" Apr 19 12:10:18.024514 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:18.024492 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f8b2bdcd-f07f-423f-a401-b0743ef1671f-config-volume\") pod \"dns-default-447kt\" (UID: \"f8b2bdcd-f07f-423f-a401-b0743ef1671f\") " pod="openshift-dns/dns-default-447kt" Apr 19 12:10:18.033678 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:18.033653 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pnwkk\" (UniqueName: \"kubernetes.io/projected/f8b2bdcd-f07f-423f-a401-b0743ef1671f-kube-api-access-pnwkk\") pod \"dns-default-447kt\" (UID: \"f8b2bdcd-f07f-423f-a401-b0743ef1671f\") " pod="openshift-dns/dns-default-447kt" Apr 19 12:10:18.033831 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:18.033725 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq4x4\" (UniqueName: \"kubernetes.io/projected/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-kube-api-access-rq4x4\") pod \"ingress-canary-h7xrv\" (UID: \"105ea9fb-4797-44a9-a30b-6d58c96ae4d6\") " pod="openshift-ingress-canary/ingress-canary-h7xrv" Apr 19 12:10:18.493603 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:18.493520 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2v66h" Apr 19 12:10:18.496387 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:18.496360 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 19 12:10:18.496520 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:18.496467 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-d9pnm\"" Apr 19 12:10:18.496604 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:18.496365 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 19 12:10:18.528077 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:18.528045 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8b2bdcd-f07f-423f-a401-b0743ef1671f-metrics-tls\") pod \"dns-default-447kt\" (UID: \"f8b2bdcd-f07f-423f-a401-b0743ef1671f\") " pod="openshift-dns/dns-default-447kt" Apr 19 12:10:18.528244 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:18.528087 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-cert\") pod \"ingress-canary-h7xrv\" (UID: \"105ea9fb-4797-44a9-a30b-6d58c96ae4d6\") " pod="openshift-ingress-canary/ingress-canary-h7xrv" Apr 19 12:10:18.528244 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:18.528212 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 19 12:10:18.528244 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:18.528241 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 19 12:10:18.528395 ip-10-0-140-237 kubenswrapper[2572]: E0419 
12:10:18.528289 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b2bdcd-f07f-423f-a401-b0743ef1671f-metrics-tls podName:f8b2bdcd-f07f-423f-a401-b0743ef1671f nodeName:}" failed. No retries permitted until 2026-04-19 12:10:19.528273813 +0000 UTC m=+33.595445694 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f8b2bdcd-f07f-423f-a401-b0743ef1671f-metrics-tls") pod "dns-default-447kt" (UID: "f8b2bdcd-f07f-423f-a401-b0743ef1671f") : secret "dns-default-metrics-tls" not found Apr 19 12:10:18.528395 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:18.528306 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-cert podName:105ea9fb-4797-44a9-a30b-6d58c96ae4d6 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:19.528298991 +0000 UTC m=+33.595470859 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-cert") pod "ingress-canary-h7xrv" (UID: "105ea9fb-4797-44a9-a30b-6d58c96ae4d6") : secret "canary-serving-cert" not found Apr 19 12:10:19.493572 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:19.493536 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pccgh" Apr 19 12:10:19.494151 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:19.493541 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-67dv9" Apr 19 12:10:19.496431 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:19.496409 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 19 12:10:19.496551 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:19.496443 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lsl4x\"" Apr 19 12:10:19.496551 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:19.496497 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 19 12:10:19.536119 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:19.536081 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8b2bdcd-f07f-423f-a401-b0743ef1671f-metrics-tls\") pod \"dns-default-447kt\" (UID: \"f8b2bdcd-f07f-423f-a401-b0743ef1671f\") " pod="openshift-dns/dns-default-447kt" Apr 19 12:10:19.536119 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:19.536124 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-cert\") pod \"ingress-canary-h7xrv\" (UID: \"105ea9fb-4797-44a9-a30b-6d58c96ae4d6\") " pod="openshift-ingress-canary/ingress-canary-h7xrv" Apr 19 12:10:19.536352 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:19.536228 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 19 12:10:19.536352 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:19.536241 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 19 12:10:19.536352 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:19.536290 2572 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-cert podName:105ea9fb-4797-44a9-a30b-6d58c96ae4d6 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:21.536271855 +0000 UTC m=+35.603443729 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-cert") pod "ingress-canary-h7xrv" (UID: "105ea9fb-4797-44a9-a30b-6d58c96ae4d6") : secret "canary-serving-cert" not found Apr 19 12:10:19.536352 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:19.536307 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b2bdcd-f07f-423f-a401-b0743ef1671f-metrics-tls podName:f8b2bdcd-f07f-423f-a401-b0743ef1671f nodeName:}" failed. No retries permitted until 2026-04-19 12:10:21.536298994 +0000 UTC m=+35.603470921 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f8b2bdcd-f07f-423f-a401-b0743ef1671f-metrics-tls") pod "dns-default-447kt" (UID: "f8b2bdcd-f07f-423f-a401-b0743ef1671f") : secret "dns-default-metrics-tls" not found Apr 19 12:10:20.139461 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:20.139422 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2235efd3-6b65-47c7-acb5-eb9aa104beb5-metrics-certs\") pod \"network-metrics-daemon-67dv9\" (UID: \"2235efd3-6b65-47c7-acb5-eb9aa104beb5\") " pod="openshift-multus/network-metrics-daemon-67dv9" Apr 19 12:10:20.139656 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:20.139585 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 19 12:10:20.139723 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:20.139665 2572 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/2235efd3-6b65-47c7-acb5-eb9aa104beb5-metrics-certs podName:2235efd3-6b65-47c7-acb5-eb9aa104beb5 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:52.139643697 +0000 UTC m=+66.206815583 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2235efd3-6b65-47c7-acb5-eb9aa104beb5-metrics-certs") pod "network-metrics-daemon-67dv9" (UID: "2235efd3-6b65-47c7-acb5-eb9aa104beb5") : secret "metrics-daemon-secret" not found Apr 19 12:10:20.240253 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:20.240214 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67tn8\" (UniqueName: \"kubernetes.io/projected/29c3436d-ad1d-4386-a9d8-894a3c87dbfc-kube-api-access-67tn8\") pod \"network-check-target-2v66h\" (UID: \"29c3436d-ad1d-4386-a9d8-894a3c87dbfc\") " pod="openshift-network-diagnostics/network-check-target-2v66h" Apr 19 12:10:20.243523 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:20.243499 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-67tn8\" (UniqueName: \"kubernetes.io/projected/29c3436d-ad1d-4386-a9d8-894a3c87dbfc-kube-api-access-67tn8\") pod \"network-check-target-2v66h\" (UID: \"29c3436d-ad1d-4386-a9d8-894a3c87dbfc\") " pod="openshift-network-diagnostics/network-check-target-2v66h" Apr 19 12:10:20.304591 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:20.304549 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-2v66h"
Apr 19 12:10:20.675805 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:20.675774 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-2v66h"]
Apr 19 12:10:20.679509 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:10:20.679478 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29c3436d_ad1d_4386_a9d8_894a3c87dbfc.slice/crio-668e6d6cb9d1abddf4385d5f56f4ee00ee82221c68480e2d965683f7a1a53f52 WatchSource:0}: Error finding container 668e6d6cb9d1abddf4385d5f56f4ee00ee82221c68480e2d965683f7a1a53f52: Status 404 returned error can't find the container with id 668e6d6cb9d1abddf4385d5f56f4ee00ee82221c68480e2d965683f7a1a53f52
Apr 19 12:10:21.550518 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:21.550477 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8b2bdcd-f07f-423f-a401-b0743ef1671f-metrics-tls\") pod \"dns-default-447kt\" (UID: \"f8b2bdcd-f07f-423f-a401-b0743ef1671f\") " pod="openshift-dns/dns-default-447kt"
Apr 19 12:10:21.550708 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:21.550533 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-cert\") pod \"ingress-canary-h7xrv\" (UID: \"105ea9fb-4797-44a9-a30b-6d58c96ae4d6\") " pod="openshift-ingress-canary/ingress-canary-h7xrv"
Apr 19 12:10:21.550708 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:21.550665 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:10:21.550826 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:21.550726 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-cert podName:105ea9fb-4797-44a9-a30b-6d58c96ae4d6 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:25.550709314 +0000 UTC m=+39.617881202 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-cert") pod "ingress-canary-h7xrv" (UID: "105ea9fb-4797-44a9-a30b-6d58c96ae4d6") : secret "canary-serving-cert" not found
Apr 19 12:10:21.550826 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:21.550662 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:10:21.550918 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:21.550832 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b2bdcd-f07f-423f-a401-b0743ef1671f-metrics-tls podName:f8b2bdcd-f07f-423f-a401-b0743ef1671f nodeName:}" failed. No retries permitted until 2026-04-19 12:10:25.550813979 +0000 UTC m=+39.617985849 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f8b2bdcd-f07f-423f-a401-b0743ef1671f-metrics-tls") pod "dns-default-447kt" (UID: "f8b2bdcd-f07f-423f-a401-b0743ef1671f") : secret "dns-default-metrics-tls" not found
Apr 19 12:10:21.657952 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:21.657920 2572 generic.go:358] "Generic (PLEG): container finished" podID="7944a8e8-84c1-4ed9-81a3-27e357098d48" containerID="f25219e2810eedb29f948e4ab1e84c789a9fa5a6bda490ac8f9120c1852ad038" exitCode=0
Apr 19 12:10:21.658128 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:21.658004 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sqxd4" event={"ID":"7944a8e8-84c1-4ed9-81a3-27e357098d48","Type":"ContainerDied","Data":"f25219e2810eedb29f948e4ab1e84c789a9fa5a6bda490ac8f9120c1852ad038"}
Apr 19 12:10:21.659367 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:21.659115 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2v66h" event={"ID":"29c3436d-ad1d-4386-a9d8-894a3c87dbfc","Type":"ContainerStarted","Data":"668e6d6cb9d1abddf4385d5f56f4ee00ee82221c68480e2d965683f7a1a53f52"}
Apr 19 12:10:22.663914 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:22.663720 2572 generic.go:358] "Generic (PLEG): container finished" podID="7944a8e8-84c1-4ed9-81a3-27e357098d48" containerID="3c2ca08818a699e0165a3334f995ecd563cadeed192542d3bd6a369d346f0dc0" exitCode=0
Apr 19 12:10:22.663914 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:22.663799 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sqxd4" event={"ID":"7944a8e8-84c1-4ed9-81a3-27e357098d48","Type":"ContainerDied","Data":"3c2ca08818a699e0165a3334f995ecd563cadeed192542d3bd6a369d346f0dc0"}
Apr 19 12:10:23.670012 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:23.669978 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-sqxd4" event={"ID":"7944a8e8-84c1-4ed9-81a3-27e357098d48","Type":"ContainerStarted","Data":"e40f9d8df3ec77f53bd540e4dd1aeb923c155c510972d21f0224a94baba7f4c8"}
Apr 19 12:10:23.692302 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:23.692247 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-sqxd4" podStartSLOduration=5.968163849 podStartE2EDuration="37.69222947s" podCreationTimestamp="2026-04-19 12:09:46 +0000 UTC" firstStartedPulling="2026-04-19 12:09:49.1100382 +0000 UTC m=+3.177210069" lastFinishedPulling="2026-04-19 12:10:20.834103818 +0000 UTC m=+34.901275690" observedRunningTime="2026-04-19 12:10:23.691091356 +0000 UTC m=+37.758263250" watchObservedRunningTime="2026-04-19 12:10:23.69222947 +0000 UTC m=+37.759401364"
Apr 19 12:10:23.868597 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:23.868564 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82-original-pull-secret\") pod \"global-pull-secret-syncer-pccgh\" (UID: \"4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82\") " pod="kube-system/global-pull-secret-syncer-pccgh"
Apr 19 12:10:23.872796 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:23.872746 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82-original-pull-secret\") pod \"global-pull-secret-syncer-pccgh\" (UID: \"4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82\") " pod="kube-system/global-pull-secret-syncer-pccgh"
Apr 19 12:10:24.004981 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:24.004939 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-pccgh"
Apr 19 12:10:24.735965 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:24.735934 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-pccgh"]
Apr 19 12:10:24.739440 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:10:24.739413 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bbc95aa_4b56_4ddc_ab2c_7b9abba9aa82.slice/crio-bdcf733f16c3936f3971eee5a4230e090375bd207c9d18b2ff24e89a774d7ab1 WatchSource:0}: Error finding container bdcf733f16c3936f3971eee5a4230e090375bd207c9d18b2ff24e89a774d7ab1: Status 404 returned error can't find the container with id bdcf733f16c3936f3971eee5a4230e090375bd207c9d18b2ff24e89a774d7ab1
Apr 19 12:10:25.584361 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:25.584122 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8b2bdcd-f07f-423f-a401-b0743ef1671f-metrics-tls\") pod \"dns-default-447kt\" (UID: \"f8b2bdcd-f07f-423f-a401-b0743ef1671f\") " pod="openshift-dns/dns-default-447kt"
Apr 19 12:10:25.584574 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:25.584393 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-cert\") pod \"ingress-canary-h7xrv\" (UID: \"105ea9fb-4797-44a9-a30b-6d58c96ae4d6\") " pod="openshift-ingress-canary/ingress-canary-h7xrv"
Apr 19 12:10:25.584574 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:25.584251 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:10:25.584574 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:25.584513 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b2bdcd-f07f-423f-a401-b0743ef1671f-metrics-tls podName:f8b2bdcd-f07f-423f-a401-b0743ef1671f nodeName:}" failed. No retries permitted until 2026-04-19 12:10:33.584493505 +0000 UTC m=+47.651665377 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f8b2bdcd-f07f-423f-a401-b0743ef1671f-metrics-tls") pod "dns-default-447kt" (UID: "f8b2bdcd-f07f-423f-a401-b0743ef1671f") : secret "dns-default-metrics-tls" not found
Apr 19 12:10:25.584574 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:25.584518 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:10:25.584574 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:25.584567 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-cert podName:105ea9fb-4797-44a9-a30b-6d58c96ae4d6 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:33.58455148 +0000 UTC m=+47.651723354 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-cert") pod "ingress-canary-h7xrv" (UID: "105ea9fb-4797-44a9-a30b-6d58c96ae4d6") : secret "canary-serving-cert" not found
Apr 19 12:10:25.676395 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:25.676349 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-2v66h" event={"ID":"29c3436d-ad1d-4386-a9d8-894a3c87dbfc","Type":"ContainerStarted","Data":"d5bff38041d669e852cb84d5be7049b6f7308639d98ed891207d2828eb67c1b8"}
Apr 19 12:10:25.676575 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:25.676454 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-2v66h"
Apr 19 12:10:25.677699 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:25.677668 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pccgh" event={"ID":"4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82","Type":"ContainerStarted","Data":"bdcf733f16c3936f3971eee5a4230e090375bd207c9d18b2ff24e89a774d7ab1"}
Apr 19 12:10:25.690628 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:25.690550 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-2v66h" podStartSLOduration=35.398702588 podStartE2EDuration="39.690534119s" podCreationTimestamp="2026-04-19 12:09:46 +0000 UTC" firstStartedPulling="2026-04-19 12:10:20.684536451 +0000 UTC m=+34.751708319" lastFinishedPulling="2026-04-19 12:10:24.97636798 +0000 UTC m=+39.043539850" observedRunningTime="2026-04-19 12:10:25.690078125 +0000 UTC m=+39.757250020" watchObservedRunningTime="2026-04-19 12:10:25.690534119 +0000 UTC m=+39.757706012"
Apr 19 12:10:29.686118 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:29.686076 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-pccgh" event={"ID":"4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82","Type":"ContainerStarted","Data":"cf59ccbab9bd4083e3263ee70e7760666606905bb27faedb38cb77c57ad16cd5"}
Apr 19 12:10:33.641719 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:33.641677 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-cert\") pod \"ingress-canary-h7xrv\" (UID: \"105ea9fb-4797-44a9-a30b-6d58c96ae4d6\") " pod="openshift-ingress-canary/ingress-canary-h7xrv"
Apr 19 12:10:33.642088 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:33.641801 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8b2bdcd-f07f-423f-a401-b0743ef1671f-metrics-tls\") pod \"dns-default-447kt\" (UID: \"f8b2bdcd-f07f-423f-a401-b0743ef1671f\") " pod="openshift-dns/dns-default-447kt"
Apr 19 12:10:33.642088 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:33.641857 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:10:33.642088 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:33.641878 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:10:33.642088 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:33.641928 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-cert podName:105ea9fb-4797-44a9-a30b-6d58c96ae4d6 nodeName:}" failed. No retries permitted until 2026-04-19 12:10:49.641913134 +0000 UTC m=+63.709085007 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-cert") pod "ingress-canary-h7xrv" (UID: "105ea9fb-4797-44a9-a30b-6d58c96ae4d6") : secret "canary-serving-cert" not found
Apr 19 12:10:33.642088 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:33.641944 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b2bdcd-f07f-423f-a401-b0743ef1671f-metrics-tls podName:f8b2bdcd-f07f-423f-a401-b0743ef1671f nodeName:}" failed. No retries permitted until 2026-04-19 12:10:49.641938876 +0000 UTC m=+63.709110745 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f8b2bdcd-f07f-423f-a401-b0743ef1671f-metrics-tls") pod "dns-default-447kt" (UID: "f8b2bdcd-f07f-423f-a401-b0743ef1671f") : secret "dns-default-metrics-tls" not found
Apr 19 12:10:44.654935 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:44.654899 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-mfzf5"
Apr 19 12:10:44.680651 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:44.680605 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-pccgh" podStartSLOduration=48.658303766 podStartE2EDuration="52.680592588s" podCreationTimestamp="2026-04-19 12:09:52 +0000 UTC" firstStartedPulling="2026-04-19 12:10:24.741501228 +0000 UTC m=+38.808673101" lastFinishedPulling="2026-04-19 12:10:28.76379004 +0000 UTC m=+42.830961923" observedRunningTime="2026-04-19 12:10:29.699574197 +0000 UTC m=+43.766746091" watchObservedRunningTime="2026-04-19 12:10:44.680592588 +0000 UTC m=+58.747764478"
Apr 19 12:10:49.649316 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:49.649256 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8b2bdcd-f07f-423f-a401-b0743ef1671f-metrics-tls\") pod \"dns-default-447kt\" (UID: \"f8b2bdcd-f07f-423f-a401-b0743ef1671f\") " pod="openshift-dns/dns-default-447kt"
Apr 19 12:10:49.649316 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:49.649318 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-cert\") pod \"ingress-canary-h7xrv\" (UID: \"105ea9fb-4797-44a9-a30b-6d58c96ae4d6\") " pod="openshift-ingress-canary/ingress-canary-h7xrv"
Apr 19 12:10:49.649789 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:49.649398 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 19 12:10:49.649789 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:49.649459 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 19 12:10:49.649789 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:49.649474 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b2bdcd-f07f-423f-a401-b0743ef1671f-metrics-tls podName:f8b2bdcd-f07f-423f-a401-b0743ef1671f nodeName:}" failed. No retries permitted until 2026-04-19 12:11:21.649458589 +0000 UTC m=+95.716630462 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f8b2bdcd-f07f-423f-a401-b0743ef1671f-metrics-tls") pod "dns-default-447kt" (UID: "f8b2bdcd-f07f-423f-a401-b0743ef1671f") : secret "dns-default-metrics-tls" not found
Apr 19 12:10:49.649789 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:49.649505 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-cert podName:105ea9fb-4797-44a9-a30b-6d58c96ae4d6 nodeName:}" failed. No retries permitted until 2026-04-19 12:11:21.649493026 +0000 UTC m=+95.716664898 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-cert") pod "ingress-canary-h7xrv" (UID: "105ea9fb-4797-44a9-a30b-6d58c96ae4d6") : secret "canary-serving-cert" not found
Apr 19 12:10:52.164665 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:52.164620 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2235efd3-6b65-47c7-acb5-eb9aa104beb5-metrics-certs\") pod \"network-metrics-daemon-67dv9\" (UID: \"2235efd3-6b65-47c7-acb5-eb9aa104beb5\") " pod="openshift-multus/network-metrics-daemon-67dv9"
Apr 19 12:10:52.165084 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:52.164775 2572 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 19 12:10:52.165084 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:10:52.164838 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2235efd3-6b65-47c7-acb5-eb9aa104beb5-metrics-certs podName:2235efd3-6b65-47c7-acb5-eb9aa104beb5 nodeName:}" failed. No retries permitted until 2026-04-19 12:11:56.164822696 +0000 UTC m=+130.231994565 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2235efd3-6b65-47c7-acb5-eb9aa104beb5-metrics-certs") pod "network-metrics-daemon-67dv9" (UID: "2235efd3-6b65-47c7-acb5-eb9aa104beb5") : secret "metrics-daemon-secret" not found
Apr 19 12:10:56.681369 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:10:56.681340 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-2v66h"
Apr 19 12:11:01.514749 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.514714 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdfd7cb57-2gk2d"]
Apr 19 12:11:01.547610 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.547582 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdfd7cb57-2gk2d"]
Apr 19 12:11:01.547610 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.547611 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh"]
Apr 19 12:11:01.547865 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.547743 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdfd7cb57-2gk2d"
Apr 19 12:11:01.550317 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.550290 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 19 12:11:01.550448 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.550299 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 19 12:11:01.550448 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.550432 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 19 12:11:01.550590 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.550573 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 19 12:11:01.571718 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.571689 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh"]
Apr 19 12:11:01.571845 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.571728 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh"
Apr 19 12:11:01.574452 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.574430 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 19 12:11:01.574554 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.574517 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 19 12:11:01.574554 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.574539 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 19 12:11:01.574779 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.574751 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 19 12:11:01.627495 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.627465 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3dc2764a-df76-47cf-b032-3b6372c30c04-tmp\") pod \"klusterlet-addon-workmgr-6cdfd7cb57-2gk2d\" (UID: \"3dc2764a-df76-47cf-b032-3b6372c30c04\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdfd7cb57-2gk2d"
Apr 19 12:11:01.627646 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.627514 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2nbg\" (UniqueName: \"kubernetes.io/projected/3dc2764a-df76-47cf-b032-3b6372c30c04-kube-api-access-b2nbg\") pod \"klusterlet-addon-workmgr-6cdfd7cb57-2gk2d\" (UID: \"3dc2764a-df76-47cf-b032-3b6372c30c04\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdfd7cb57-2gk2d"
Apr 19 12:11:01.627646 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.627548 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/44839d59-49b3-4c2d-a204-f600cf1975bb-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-859549b7f4-k2mkh\" (UID: \"44839d59-49b3-4c2d-a204-f600cf1975bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh"
Apr 19 12:11:01.627646 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.627610 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/44839d59-49b3-4c2d-a204-f600cf1975bb-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-859549b7f4-k2mkh\" (UID: \"44839d59-49b3-4c2d-a204-f600cf1975bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh"
Apr 19 12:11:01.627646 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.627637 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/44839d59-49b3-4c2d-a204-f600cf1975bb-ca\") pod \"cluster-proxy-proxy-agent-859549b7f4-k2mkh\" (UID: \"44839d59-49b3-4c2d-a204-f600cf1975bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh"
Apr 19 12:11:01.627801 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.627714 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q94mv\" (UniqueName: \"kubernetes.io/projected/44839d59-49b3-4c2d-a204-f600cf1975bb-kube-api-access-q94mv\") pod \"cluster-proxy-proxy-agent-859549b7f4-k2mkh\" (UID: \"44839d59-49b3-4c2d-a204-f600cf1975bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh"
Apr 19 12:11:01.627801 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.627755 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/44839d59-49b3-4c2d-a204-f600cf1975bb-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-859549b7f4-k2mkh\" (UID: \"44839d59-49b3-4c2d-a204-f600cf1975bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh"
Apr 19 12:11:01.627862 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.627825 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3dc2764a-df76-47cf-b032-3b6372c30c04-klusterlet-config\") pod \"klusterlet-addon-workmgr-6cdfd7cb57-2gk2d\" (UID: \"3dc2764a-df76-47cf-b032-3b6372c30c04\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdfd7cb57-2gk2d"
Apr 19 12:11:01.627862 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.627853 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/44839d59-49b3-4c2d-a204-f600cf1975bb-hub\") pod \"cluster-proxy-proxy-agent-859549b7f4-k2mkh\" (UID: \"44839d59-49b3-4c2d-a204-f600cf1975bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh"
Apr 19 12:11:01.729182 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.729140 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3dc2764a-df76-47cf-b032-3b6372c30c04-tmp\") pod \"klusterlet-addon-workmgr-6cdfd7cb57-2gk2d\" (UID: \"3dc2764a-df76-47cf-b032-3b6372c30c04\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdfd7cb57-2gk2d"
Apr 19 12:11:01.729182 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.729181 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b2nbg\" (UniqueName: \"kubernetes.io/projected/3dc2764a-df76-47cf-b032-3b6372c30c04-kube-api-access-b2nbg\") pod \"klusterlet-addon-workmgr-6cdfd7cb57-2gk2d\" (UID: \"3dc2764a-df76-47cf-b032-3b6372c30c04\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdfd7cb57-2gk2d"
Apr 19 12:11:01.729391 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.729204 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/44839d59-49b3-4c2d-a204-f600cf1975bb-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-859549b7f4-k2mkh\" (UID: \"44839d59-49b3-4c2d-a204-f600cf1975bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh"
Apr 19 12:11:01.729391 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.729253 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/44839d59-49b3-4c2d-a204-f600cf1975bb-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-859549b7f4-k2mkh\" (UID: \"44839d59-49b3-4c2d-a204-f600cf1975bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh"
Apr 19 12:11:01.729391 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.729279 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/44839d59-49b3-4c2d-a204-f600cf1975bb-ca\") pod \"cluster-proxy-proxy-agent-859549b7f4-k2mkh\" (UID: \"44839d59-49b3-4c2d-a204-f600cf1975bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh"
Apr 19 12:11:01.729391 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.729322 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q94mv\" (UniqueName: \"kubernetes.io/projected/44839d59-49b3-4c2d-a204-f600cf1975bb-kube-api-access-q94mv\") pod \"cluster-proxy-proxy-agent-859549b7f4-k2mkh\" (UID: \"44839d59-49b3-4c2d-a204-f600cf1975bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh"
Apr 19 12:11:01.729391 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.729361 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/44839d59-49b3-4c2d-a204-f600cf1975bb-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-859549b7f4-k2mkh\" (UID: \"44839d59-49b3-4c2d-a204-f600cf1975bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh"
Apr 19 12:11:01.729391 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.729391 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3dc2764a-df76-47cf-b032-3b6372c30c04-klusterlet-config\") pod \"klusterlet-addon-workmgr-6cdfd7cb57-2gk2d\" (UID: \"3dc2764a-df76-47cf-b032-3b6372c30c04\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdfd7cb57-2gk2d"
Apr 19 12:11:01.729974 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.729421 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/44839d59-49b3-4c2d-a204-f600cf1975bb-hub\") pod \"cluster-proxy-proxy-agent-859549b7f4-k2mkh\" (UID: \"44839d59-49b3-4c2d-a204-f600cf1975bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh"
Apr 19 12:11:01.730055 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.730026 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/44839d59-49b3-4c2d-a204-f600cf1975bb-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-859549b7f4-k2mkh\" (UID: \"44839d59-49b3-4c2d-a204-f600cf1975bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh"
Apr 19 12:11:01.732901 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.732868 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/3dc2764a-df76-47cf-b032-3b6372c30c04-klusterlet-config\") pod \"klusterlet-addon-workmgr-6cdfd7cb57-2gk2d\" (UID: \"3dc2764a-df76-47cf-b032-3b6372c30c04\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdfd7cb57-2gk2d"
Apr 19 12:11:01.733032 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.733010 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/44839d59-49b3-4c2d-a204-f600cf1975bb-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-859549b7f4-k2mkh\" (UID: \"44839d59-49b3-4c2d-a204-f600cf1975bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh"
Apr 19 12:11:01.733085 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.733068 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/44839d59-49b3-4c2d-a204-f600cf1975bb-ca\") pod \"cluster-proxy-proxy-agent-859549b7f4-k2mkh\" (UID: \"44839d59-49b3-4c2d-a204-f600cf1975bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh"
Apr 19 12:11:01.733122 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.733099 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/44839d59-49b3-4c2d-a204-f600cf1975bb-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-859549b7f4-k2mkh\" (UID: \"44839d59-49b3-4c2d-a204-f600cf1975bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh"
Apr 19 12:11:01.733122 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.733099 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/44839d59-49b3-4c2d-a204-f600cf1975bb-hub\") pod \"cluster-proxy-proxy-agent-859549b7f4-k2mkh\" (UID: \"44839d59-49b3-4c2d-a204-f600cf1975bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh"
Apr 19 12:11:01.736515 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.736489 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2nbg\" (UniqueName: \"kubernetes.io/projected/3dc2764a-df76-47cf-b032-3b6372c30c04-kube-api-access-b2nbg\") pod \"klusterlet-addon-workmgr-6cdfd7cb57-2gk2d\" (UID: \"3dc2764a-df76-47cf-b032-3b6372c30c04\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdfd7cb57-2gk2d"
Apr 19 12:11:01.736796 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.736752 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q94mv\" (UniqueName: \"kubernetes.io/projected/44839d59-49b3-4c2d-a204-f600cf1975bb-kube-api-access-q94mv\") pod \"cluster-proxy-proxy-agent-859549b7f4-k2mkh\" (UID: \"44839d59-49b3-4c2d-a204-f600cf1975bb\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh"
Apr 19 12:11:01.740359 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.740340 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/3dc2764a-df76-47cf-b032-3b6372c30c04-tmp\") pod \"klusterlet-addon-workmgr-6cdfd7cb57-2gk2d\" (UID: \"3dc2764a-df76-47cf-b032-3b6372c30c04\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdfd7cb57-2gk2d"
Apr 19 12:11:01.856945 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.856860 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdfd7cb57-2gk2d"
Apr 19 12:11:01.890639 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.890602 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh"
Apr 19 12:11:01.977795 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:01.977748 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdfd7cb57-2gk2d"]
Apr 19 12:11:01.981857 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:11:01.981825 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dc2764a_df76_47cf_b032_3b6372c30c04.slice/crio-53a56a126dd84b6b029ff00247db4da1827ba6f35c481f97fa8ea5f31d0d6395 WatchSource:0}: Error finding container 53a56a126dd84b6b029ff00247db4da1827ba6f35c481f97fa8ea5f31d0d6395: Status 404 returned error can't find the container with id 53a56a126dd84b6b029ff00247db4da1827ba6f35c481f97fa8ea5f31d0d6395
Apr 19 12:11:02.021709 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:02.020595 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh"]
Apr 19 12:11:02.025448 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:11:02.025419 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod44839d59_49b3_4c2d_a204_f600cf1975bb.slice/crio-de8af02a2776419c84395d0d5b193a9ea348afb6d644858f60458de6455ecdc6 WatchSource:0}: Error finding container de8af02a2776419c84395d0d5b193a9ea348afb6d644858f60458de6455ecdc6: Status 404 returned error can't find the container with id de8af02a2776419c84395d0d5b193a9ea348afb6d644858f60458de6455ecdc6
Apr 19 12:11:02.750163 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:02.750016 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh" event={"ID":"44839d59-49b3-4c2d-a204-f600cf1975bb","Type":"ContainerStarted","Data":"de8af02a2776419c84395d0d5b193a9ea348afb6d644858f60458de6455ecdc6"}
Apr 19 12:11:02.752512 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:02.752470 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdfd7cb57-2gk2d" event={"ID":"3dc2764a-df76-47cf-b032-3b6372c30c04","Type":"ContainerStarted","Data":"53a56a126dd84b6b029ff00247db4da1827ba6f35c481f97fa8ea5f31d0d6395"}
Apr 19 12:11:06.763643 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:06.763608 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh" event={"ID":"44839d59-49b3-4c2d-a204-f600cf1975bb","Type":"ContainerStarted","Data":"2bcc65c517864c0d9e9b17e2e306ca1587ada2982ddc03593675f780f2e35917"}
Apr 19 12:11:06.764786 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:06.764751 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdfd7cb57-2gk2d" event={"ID":"3dc2764a-df76-47cf-b032-3b6372c30c04","Type":"ContainerStarted","Data":"d30472f88c51b90141f80a8408ad2a8c17ca4b511e7f60bbc3913b2a3869ad4a"}
Apr 19 12:11:06.765055 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:06.765032 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdfd7cb57-2gk2d"
Apr 19 12:11:06.766637 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:06.766620 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdfd7cb57-2gk2d"
Apr 19 12:11:06.779930 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:06.779885 2572
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-6cdfd7cb57-2gk2d" podStartSLOduration=1.454170723 podStartE2EDuration="5.779873756s" podCreationTimestamp="2026-04-19 12:11:01 +0000 UTC" firstStartedPulling="2026-04-19 12:11:01.983662957 +0000 UTC m=+76.050834825" lastFinishedPulling="2026-04-19 12:11:06.309365985 +0000 UTC m=+80.376537858" observedRunningTime="2026-04-19 12:11:06.778723873 +0000 UTC m=+80.845895761" watchObservedRunningTime="2026-04-19 12:11:06.779873756 +0000 UTC m=+80.847045646" Apr 19 12:11:08.771164 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:08.771125 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh" event={"ID":"44839d59-49b3-4c2d-a204-f600cf1975bb","Type":"ContainerStarted","Data":"bc41bb2560021ea8e8d6eaba7888489c218f8831b3f2c77f91f78b8c4eb926ac"} Apr 19 12:11:08.771534 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:08.771171 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh" event={"ID":"44839d59-49b3-4c2d-a204-f600cf1975bb","Type":"ContainerStarted","Data":"7f540f63b472220163a193de85e451b1946d2574395d8a5ed492f653a1e7168b"} Apr 19 12:11:08.789484 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:08.789440 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh" podStartSLOduration=1.54880573 podStartE2EDuration="7.789425719s" podCreationTimestamp="2026-04-19 12:11:01 +0000 UTC" firstStartedPulling="2026-04-19 12:11:02.027112568 +0000 UTC m=+76.094284437" lastFinishedPulling="2026-04-19 12:11:08.267732557 +0000 UTC m=+82.334904426" observedRunningTime="2026-04-19 12:11:08.787933737 +0000 UTC m=+82.855105641" watchObservedRunningTime="2026-04-19 12:11:08.789425719 +0000 UTC 
m=+82.856597607" Apr 19 12:11:21.672611 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:21.672579 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8b2bdcd-f07f-423f-a401-b0743ef1671f-metrics-tls\") pod \"dns-default-447kt\" (UID: \"f8b2bdcd-f07f-423f-a401-b0743ef1671f\") " pod="openshift-dns/dns-default-447kt" Apr 19 12:11:21.672611 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:21.672614 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-cert\") pod \"ingress-canary-h7xrv\" (UID: \"105ea9fb-4797-44a9-a30b-6d58c96ae4d6\") " pod="openshift-ingress-canary/ingress-canary-h7xrv" Apr 19 12:11:21.673005 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:11:21.672716 2572 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 19 12:11:21.673005 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:11:21.672785 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-cert podName:105ea9fb-4797-44a9-a30b-6d58c96ae4d6 nodeName:}" failed. No retries permitted until 2026-04-19 12:12:25.672753566 +0000 UTC m=+159.739925436 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-cert") pod "ingress-canary-h7xrv" (UID: "105ea9fb-4797-44a9-a30b-6d58c96ae4d6") : secret "canary-serving-cert" not found Apr 19 12:11:21.673005 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:11:21.672718 2572 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 19 12:11:21.673005 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:11:21.672842 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8b2bdcd-f07f-423f-a401-b0743ef1671f-metrics-tls podName:f8b2bdcd-f07f-423f-a401-b0743ef1671f nodeName:}" failed. No retries permitted until 2026-04-19 12:12:25.67283248 +0000 UTC m=+159.740004353 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/f8b2bdcd-f07f-423f-a401-b0743ef1671f-metrics-tls") pod "dns-default-447kt" (UID: "f8b2bdcd-f07f-423f-a401-b0743ef1671f") : secret "dns-default-metrics-tls" not found Apr 19 12:11:26.359654 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:26.359620 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-m897s_d6bccaec-31af-4abb-8335-372cc94da827/dns-node-resolver/0.log" Apr 19 12:11:26.759467 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:26.759444 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mdl6z_5982f902-227e-4317-9969-a90aa85d5e40/node-ca/0.log" Apr 19 12:11:54.542403 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.542367 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-56f9f776b7-q295c"] Apr 19 12:11:54.548340 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.548319 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:11:54.550609 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.550591 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 19 12:11:54.550711 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.550700 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-rh992\"" Apr 19 12:11:54.551815 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.551795 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 19 12:11:54.551915 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.551890 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 19 12:11:54.554284 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.554258 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-6s87n"] Apr 19 12:11:54.556403 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.556384 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 19 12:11:54.557175 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.557158 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6s87n" Apr 19 12:11:54.557584 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.557567 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-56f9f776b7-q295c"] Apr 19 12:11:54.560025 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.560008 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 19 12:11:54.560195 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.560179 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-k9qgl\"" Apr 19 12:11:54.560452 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.560438 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 19 12:11:54.560523 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.560450 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 19 12:11:54.561295 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.561273 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 19 12:11:54.570441 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.570421 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6s87n"] Apr 19 12:11:54.685712 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.685688 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/42a63549-4ec6-4c51-bb2e-54c4888fadad-data-volume\") pod \"insights-runtime-extractor-6s87n\" (UID: \"42a63549-4ec6-4c51-bb2e-54c4888fadad\") " 
pod="openshift-insights/insights-runtime-extractor-6s87n" Apr 19 12:11:54.685840 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.685721 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/42a63549-4ec6-4c51-bb2e-54c4888fadad-crio-socket\") pod \"insights-runtime-extractor-6s87n\" (UID: \"42a63549-4ec6-4c51-bb2e-54c4888fadad\") " pod="openshift-insights/insights-runtime-extractor-6s87n" Apr 19 12:11:54.685840 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.685754 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8a059cfb-bede-4767-80e4-ef7061d18b06-trusted-ca\") pod \"image-registry-56f9f776b7-q295c\" (UID: \"8a059cfb-bede-4767-80e4-ef7061d18b06\") " pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:11:54.685840 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.685813 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8a059cfb-bede-4767-80e4-ef7061d18b06-installation-pull-secrets\") pod \"image-registry-56f9f776b7-q295c\" (UID: \"8a059cfb-bede-4767-80e4-ef7061d18b06\") " pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:11:54.686004 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.685840 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjzlb\" (UniqueName: \"kubernetes.io/projected/42a63549-4ec6-4c51-bb2e-54c4888fadad-kube-api-access-mjzlb\") pod \"insights-runtime-extractor-6s87n\" (UID: \"42a63549-4ec6-4c51-bb2e-54c4888fadad\") " pod="openshift-insights/insights-runtime-extractor-6s87n" Apr 19 12:11:54.686004 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.685865 2572 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8a059cfb-bede-4767-80e4-ef7061d18b06-image-registry-private-configuration\") pod \"image-registry-56f9f776b7-q295c\" (UID: \"8a059cfb-bede-4767-80e4-ef7061d18b06\") " pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:11:54.686004 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.685903 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8a059cfb-bede-4767-80e4-ef7061d18b06-registry-certificates\") pod \"image-registry-56f9f776b7-q295c\" (UID: \"8a059cfb-bede-4767-80e4-ef7061d18b06\") " pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:11:54.686004 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.685964 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8a059cfb-bede-4767-80e4-ef7061d18b06-ca-trust-extracted\") pod \"image-registry-56f9f776b7-q295c\" (UID: \"8a059cfb-bede-4767-80e4-ef7061d18b06\") " pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:11:54.686004 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.685994 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8a059cfb-bede-4767-80e4-ef7061d18b06-bound-sa-token\") pod \"image-registry-56f9f776b7-q295c\" (UID: \"8a059cfb-bede-4767-80e4-ef7061d18b06\") " pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:11:54.686243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.686051 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/42a63549-4ec6-4c51-bb2e-54c4888fadad-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6s87n\" (UID: \"42a63549-4ec6-4c51-bb2e-54c4888fadad\") " pod="openshift-insights/insights-runtime-extractor-6s87n" Apr 19 12:11:54.686243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.686094 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8a059cfb-bede-4767-80e4-ef7061d18b06-registry-tls\") pod \"image-registry-56f9f776b7-q295c\" (UID: \"8a059cfb-bede-4767-80e4-ef7061d18b06\") " pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:11:54.686243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.686120 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/42a63549-4ec6-4c51-bb2e-54c4888fadad-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6s87n\" (UID: \"42a63549-4ec6-4c51-bb2e-54c4888fadad\") " pod="openshift-insights/insights-runtime-extractor-6s87n" Apr 19 12:11:54.686243 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.686172 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qxdx9\" (UniqueName: \"kubernetes.io/projected/8a059cfb-bede-4767-80e4-ef7061d18b06-kube-api-access-qxdx9\") pod \"image-registry-56f9f776b7-q295c\" (UID: \"8a059cfb-bede-4767-80e4-ef7061d18b06\") " pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:11:54.786539 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.786517 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8a059cfb-bede-4767-80e4-ef7061d18b06-registry-certificates\") pod \"image-registry-56f9f776b7-q295c\" (UID: \"8a059cfb-bede-4767-80e4-ef7061d18b06\") " 
pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:11:54.786644 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.786548 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8a059cfb-bede-4767-80e4-ef7061d18b06-ca-trust-extracted\") pod \"image-registry-56f9f776b7-q295c\" (UID: \"8a059cfb-bede-4767-80e4-ef7061d18b06\") " pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:11:54.786644 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.786568 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8a059cfb-bede-4767-80e4-ef7061d18b06-bound-sa-token\") pod \"image-registry-56f9f776b7-q295c\" (UID: \"8a059cfb-bede-4767-80e4-ef7061d18b06\") " pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:11:54.786644 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.786591 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/42a63549-4ec6-4c51-bb2e-54c4888fadad-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6s87n\" (UID: \"42a63549-4ec6-4c51-bb2e-54c4888fadad\") " pod="openshift-insights/insights-runtime-extractor-6s87n" Apr 19 12:11:54.786644 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.786617 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8a059cfb-bede-4767-80e4-ef7061d18b06-registry-tls\") pod \"image-registry-56f9f776b7-q295c\" (UID: \"8a059cfb-bede-4767-80e4-ef7061d18b06\") " pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:11:54.786644 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.786634 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/42a63549-4ec6-4c51-bb2e-54c4888fadad-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6s87n\" (UID: \"42a63549-4ec6-4c51-bb2e-54c4888fadad\") " pod="openshift-insights/insights-runtime-extractor-6s87n" Apr 19 12:11:54.786936 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.786671 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qxdx9\" (UniqueName: \"kubernetes.io/projected/8a059cfb-bede-4767-80e4-ef7061d18b06-kube-api-access-qxdx9\") pod \"image-registry-56f9f776b7-q295c\" (UID: \"8a059cfb-bede-4767-80e4-ef7061d18b06\") " pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:11:54.786936 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.786703 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/42a63549-4ec6-4c51-bb2e-54c4888fadad-data-volume\") pod \"insights-runtime-extractor-6s87n\" (UID: \"42a63549-4ec6-4c51-bb2e-54c4888fadad\") " pod="openshift-insights/insights-runtime-extractor-6s87n" Apr 19 12:11:54.786936 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.786729 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/42a63549-4ec6-4c51-bb2e-54c4888fadad-crio-socket\") pod \"insights-runtime-extractor-6s87n\" (UID: \"42a63549-4ec6-4c51-bb2e-54c4888fadad\") " pod="openshift-insights/insights-runtime-extractor-6s87n" Apr 19 12:11:54.786936 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.786778 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8a059cfb-bede-4767-80e4-ef7061d18b06-trusted-ca\") pod \"image-registry-56f9f776b7-q295c\" (UID: \"8a059cfb-bede-4767-80e4-ef7061d18b06\") " pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:11:54.786936 ip-10-0-140-237 
kubenswrapper[2572]: I0419 12:11:54.786817 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8a059cfb-bede-4767-80e4-ef7061d18b06-installation-pull-secrets\") pod \"image-registry-56f9f776b7-q295c\" (UID: \"8a059cfb-bede-4767-80e4-ef7061d18b06\") " pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:11:54.786936 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.786842 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mjzlb\" (UniqueName: \"kubernetes.io/projected/42a63549-4ec6-4c51-bb2e-54c4888fadad-kube-api-access-mjzlb\") pod \"insights-runtime-extractor-6s87n\" (UID: \"42a63549-4ec6-4c51-bb2e-54c4888fadad\") " pod="openshift-insights/insights-runtime-extractor-6s87n" Apr 19 12:11:54.786936 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.786868 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8a059cfb-bede-4767-80e4-ef7061d18b06-image-registry-private-configuration\") pod \"image-registry-56f9f776b7-q295c\" (UID: \"8a059cfb-bede-4767-80e4-ef7061d18b06\") " pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:11:54.786936 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.786893 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/42a63549-4ec6-4c51-bb2e-54c4888fadad-crio-socket\") pod \"insights-runtime-extractor-6s87n\" (UID: \"42a63549-4ec6-4c51-bb2e-54c4888fadad\") " pod="openshift-insights/insights-runtime-extractor-6s87n" Apr 19 12:11:54.787350 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.787039 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/8a059cfb-bede-4767-80e4-ef7061d18b06-ca-trust-extracted\") pod \"image-registry-56f9f776b7-q295c\" (UID: \"8a059cfb-bede-4767-80e4-ef7061d18b06\") " pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:11:54.787413 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.787390 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/42a63549-4ec6-4c51-bb2e-54c4888fadad-data-volume\") pod \"insights-runtime-extractor-6s87n\" (UID: \"42a63549-4ec6-4c51-bb2e-54c4888fadad\") " pod="openshift-insights/insights-runtime-extractor-6s87n" Apr 19 12:11:54.787628 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.787597 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8a059cfb-bede-4767-80e4-ef7061d18b06-registry-certificates\") pod \"image-registry-56f9f776b7-q295c\" (UID: \"8a059cfb-bede-4767-80e4-ef7061d18b06\") " pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:11:54.787750 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.787729 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/42a63549-4ec6-4c51-bb2e-54c4888fadad-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6s87n\" (UID: \"42a63549-4ec6-4c51-bb2e-54c4888fadad\") " pod="openshift-insights/insights-runtime-extractor-6s87n" Apr 19 12:11:54.788029 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.788008 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8a059cfb-bede-4767-80e4-ef7061d18b06-trusted-ca\") pod \"image-registry-56f9f776b7-q295c\" (UID: \"8a059cfb-bede-4767-80e4-ef7061d18b06\") " pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:11:54.789654 ip-10-0-140-237 kubenswrapper[2572]: I0419 
12:11:54.789629 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/42a63549-4ec6-4c51-bb2e-54c4888fadad-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6s87n\" (UID: \"42a63549-4ec6-4c51-bb2e-54c4888fadad\") " pod="openshift-insights/insights-runtime-extractor-6s87n" Apr 19 12:11:54.789725 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.789648 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8a059cfb-bede-4767-80e4-ef7061d18b06-installation-pull-secrets\") pod \"image-registry-56f9f776b7-q295c\" (UID: \"8a059cfb-bede-4767-80e4-ef7061d18b06\") " pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:11:54.789856 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.789835 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8a059cfb-bede-4767-80e4-ef7061d18b06-image-registry-private-configuration\") pod \"image-registry-56f9f776b7-q295c\" (UID: \"8a059cfb-bede-4767-80e4-ef7061d18b06\") " pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:11:54.789901 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.789841 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8a059cfb-bede-4767-80e4-ef7061d18b06-registry-tls\") pod \"image-registry-56f9f776b7-q295c\" (UID: \"8a059cfb-bede-4767-80e4-ef7061d18b06\") " pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:11:54.795438 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.795385 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjzlb\" (UniqueName: \"kubernetes.io/projected/42a63549-4ec6-4c51-bb2e-54c4888fadad-kube-api-access-mjzlb\") pod 
\"insights-runtime-extractor-6s87n\" (UID: \"42a63549-4ec6-4c51-bb2e-54c4888fadad\") " pod="openshift-insights/insights-runtime-extractor-6s87n" Apr 19 12:11:54.795527 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.795457 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qxdx9\" (UniqueName: \"kubernetes.io/projected/8a059cfb-bede-4767-80e4-ef7061d18b06-kube-api-access-qxdx9\") pod \"image-registry-56f9f776b7-q295c\" (UID: \"8a059cfb-bede-4767-80e4-ef7061d18b06\") " pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:11:54.795527 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.795507 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8a059cfb-bede-4767-80e4-ef7061d18b06-bound-sa-token\") pod \"image-registry-56f9f776b7-q295c\" (UID: \"8a059cfb-bede-4767-80e4-ef7061d18b06\") " pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:11:54.858224 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.858202 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:11:54.865875 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.865853 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6s87n" Apr 19 12:11:54.983034 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.982978 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-56f9f776b7-q295c"] Apr 19 12:11:54.985610 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:11:54.985584 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a059cfb_bede_4767_80e4_ef7061d18b06.slice/crio-39405818700c3ca2cedb24755945f5bd80bb73566b1ea4c7a65f8625f441eadd WatchSource:0}: Error finding container 39405818700c3ca2cedb24755945f5bd80bb73566b1ea4c7a65f8625f441eadd: Status 404 returned error can't find the container with id 39405818700c3ca2cedb24755945f5bd80bb73566b1ea4c7a65f8625f441eadd Apr 19 12:11:54.997570 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:54.997549 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6s87n"] Apr 19 12:11:55.000000 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:11:54.999976 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42a63549_4ec6_4c51_bb2e_54c4888fadad.slice/crio-0117f5b84103a13e90735b513e876a25bd40cd3c361d3e8c012d82b8fca4e7ba WatchSource:0}: Error finding container 0117f5b84103a13e90735b513e876a25bd40cd3c361d3e8c012d82b8fca4e7ba: Status 404 returned error can't find the container with id 0117f5b84103a13e90735b513e876a25bd40cd3c361d3e8c012d82b8fca4e7ba Apr 19 12:11:55.884534 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:55.884509 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-56f9f776b7-q295c" event={"ID":"8a059cfb-bede-4767-80e4-ef7061d18b06","Type":"ContainerStarted","Data":"7ecdc5979fb63360201e9b51ecd014e121aeb0b19a30f0715263f96b841c9f68"} Apr 19 12:11:55.884944 ip-10-0-140-237 
kubenswrapper[2572]: I0419 12:11:55.884542 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-56f9f776b7-q295c" event={"ID":"8a059cfb-bede-4767-80e4-ef7061d18b06","Type":"ContainerStarted","Data":"39405818700c3ca2cedb24755945f5bd80bb73566b1ea4c7a65f8625f441eadd"}
Apr 19 12:11:55.884944 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:55.884649 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-56f9f776b7-q295c"
Apr 19 12:11:55.886093 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:55.886075 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6s87n" event={"ID":"42a63549-4ec6-4c51-bb2e-54c4888fadad","Type":"ContainerStarted","Data":"c45e8558144188b5ada813958f6d312c38a19b0606e22146180c6123031232e4"}
Apr 19 12:11:55.886157 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:55.886101 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6s87n" event={"ID":"42a63549-4ec6-4c51-bb2e-54c4888fadad","Type":"ContainerStarted","Data":"82231318fc858c5ffe2eb89113c257a235ccc9368b56e1f5ea265aa69309806c"}
Apr 19 12:11:55.886157 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:55.886114 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6s87n" event={"ID":"42a63549-4ec6-4c51-bb2e-54c4888fadad","Type":"ContainerStarted","Data":"0117f5b84103a13e90735b513e876a25bd40cd3c361d3e8c012d82b8fca4e7ba"}
Apr 19 12:11:55.905424 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:55.905386 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-56f9f776b7-q295c" podStartSLOduration=1.905377103 podStartE2EDuration="1.905377103s" podCreationTimestamp="2026-04-19 12:11:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:11:55.90398614 +0000 UTC m=+129.971158030" watchObservedRunningTime="2026-04-19 12:11:55.905377103 +0000 UTC m=+129.972548995"
Apr 19 12:11:56.197966 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:56.197928 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2235efd3-6b65-47c7-acb5-eb9aa104beb5-metrics-certs\") pod \"network-metrics-daemon-67dv9\" (UID: \"2235efd3-6b65-47c7-acb5-eb9aa104beb5\") " pod="openshift-multus/network-metrics-daemon-67dv9"
Apr 19 12:11:56.200141 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:56.200119 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2235efd3-6b65-47c7-acb5-eb9aa104beb5-metrics-certs\") pod \"network-metrics-daemon-67dv9\" (UID: \"2235efd3-6b65-47c7-acb5-eb9aa104beb5\") " pod="openshift-multus/network-metrics-daemon-67dv9"
Apr 19 12:11:56.412403 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:56.412378 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lsl4x\""
Apr 19 12:11:56.420445 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:56.420424 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-67dv9"
Apr 19 12:11:56.532136 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:56.532108 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-67dv9"]
Apr 19 12:11:56.536253 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:11:56.536227 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2235efd3_6b65_47c7_acb5_eb9aa104beb5.slice/crio-1e719c183ad3b7fe5d727b08e7fda0ed8be0d86da11cb2408cdb7ea6e245d21a WatchSource:0}: Error finding container 1e719c183ad3b7fe5d727b08e7fda0ed8be0d86da11cb2408cdb7ea6e245d21a: Status 404 returned error can't find the container with id 1e719c183ad3b7fe5d727b08e7fda0ed8be0d86da11cb2408cdb7ea6e245d21a
Apr 19 12:11:56.889776 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:56.889726 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-67dv9" event={"ID":"2235efd3-6b65-47c7-acb5-eb9aa104beb5","Type":"ContainerStarted","Data":"1e719c183ad3b7fe5d727b08e7fda0ed8be0d86da11cb2408cdb7ea6e245d21a"}
Apr 19 12:11:57.893496 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:57.893408 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-67dv9" event={"ID":"2235efd3-6b65-47c7-acb5-eb9aa104beb5","Type":"ContainerStarted","Data":"99848be2b8931a4fb68caf9dc18b6d868640e5cfdb25d4751c20c7188e0e5618"}
Apr 19 12:11:58.898218 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:58.898180 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6s87n" event={"ID":"42a63549-4ec6-4c51-bb2e-54c4888fadad","Type":"ContainerStarted","Data":"130c4929a4d166dd6b5807beb10eaa147c81e5686b0f915aade677249c47b6f9"}
Apr 19 12:11:58.899562 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:58.899527 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-67dv9" event={"ID":"2235efd3-6b65-47c7-acb5-eb9aa104beb5","Type":"ContainerStarted","Data":"6a7b419260b5c5a1eaf8c746d2964ab3e55b9bd845971ab5516ab5ad05165f88"}
Apr 19 12:11:58.915454 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:58.915414 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-6s87n" podStartSLOduration=1.908674539 podStartE2EDuration="4.915400823s" podCreationTimestamp="2026-04-19 12:11:54 +0000 UTC" firstStartedPulling="2026-04-19 12:11:55.046240761 +0000 UTC m=+129.113412638" lastFinishedPulling="2026-04-19 12:11:58.052967038 +0000 UTC m=+132.120138922" observedRunningTime="2026-04-19 12:11:58.913488968 +0000 UTC m=+132.980660871" watchObservedRunningTime="2026-04-19 12:11:58.915400823 +0000 UTC m=+132.982572714"
Apr 19 12:11:58.926673 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:11:58.926629 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-67dv9" podStartSLOduration=131.919087205 podStartE2EDuration="2m12.926617309s" podCreationTimestamp="2026-04-19 12:09:46 +0000 UTC" firstStartedPulling="2026-04-19 12:11:56.538071084 +0000 UTC m=+130.605242954" lastFinishedPulling="2026-04-19 12:11:57.54560118 +0000 UTC m=+131.612773058" observedRunningTime="2026-04-19 12:11:58.926451393 +0000 UTC m=+132.993623286" watchObservedRunningTime="2026-04-19 12:11:58.926617309 +0000 UTC m=+132.993789198"
Apr 19 12:12:01.961134 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:01.961095 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-p8pqx"]
Apr 19 12:12:01.965664 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:01.965648 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:01.967927 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:01.967882 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 19 12:12:01.968155 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:01.968133 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 19 12:12:01.968262 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:01.968247 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 19 12:12:01.968313 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:01.968263 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 19 12:12:01.969415 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:01.969398 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-lgltw\""
Apr 19 12:12:01.969611 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:01.969542 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 19 12:12:01.969611 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:01.969458 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 19 12:12:02.037016 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.036989 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8f904272-e411-4995-aacd-579497cea66e-root\") pod \"node-exporter-p8pqx\" (UID: \"8f904272-e411-4995-aacd-579497cea66e\") " pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.037016 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.037019 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8f904272-e411-4995-aacd-579497cea66e-node-exporter-wtmp\") pod \"node-exporter-p8pqx\" (UID: \"8f904272-e411-4995-aacd-579497cea66e\") " pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.037157 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.037041 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8f904272-e411-4995-aacd-579497cea66e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-p8pqx\" (UID: \"8f904272-e411-4995-aacd-579497cea66e\") " pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.037157 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.037088 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrr9r\" (UniqueName: \"kubernetes.io/projected/8f904272-e411-4995-aacd-579497cea66e-kube-api-access-wrr9r\") pod \"node-exporter-p8pqx\" (UID: \"8f904272-e411-4995-aacd-579497cea66e\") " pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.037157 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.037122 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8f904272-e411-4995-aacd-579497cea66e-sys\") pod \"node-exporter-p8pqx\" (UID: \"8f904272-e411-4995-aacd-579497cea66e\") " pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.037157 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.037153 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8f904272-e411-4995-aacd-579497cea66e-node-exporter-tls\") pod \"node-exporter-p8pqx\" (UID: \"8f904272-e411-4995-aacd-579497cea66e\") " pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.037278 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.037168 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8f904272-e411-4995-aacd-579497cea66e-node-exporter-accelerators-collector-config\") pod \"node-exporter-p8pqx\" (UID: \"8f904272-e411-4995-aacd-579497cea66e\") " pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.037278 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.037196 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8f904272-e411-4995-aacd-579497cea66e-metrics-client-ca\") pod \"node-exporter-p8pqx\" (UID: \"8f904272-e411-4995-aacd-579497cea66e\") " pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.037278 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.037215 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8f904272-e411-4995-aacd-579497cea66e-node-exporter-textfile\") pod \"node-exporter-p8pqx\" (UID: \"8f904272-e411-4995-aacd-579497cea66e\") " pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.138436 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.138401 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8f904272-e411-4995-aacd-579497cea66e-sys\") pod \"node-exporter-p8pqx\" (UID: \"8f904272-e411-4995-aacd-579497cea66e\") " pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.138583 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.138450 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8f904272-e411-4995-aacd-579497cea66e-node-exporter-tls\") pod \"node-exporter-p8pqx\" (UID: \"8f904272-e411-4995-aacd-579497cea66e\") " pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.138583 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.138469 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8f904272-e411-4995-aacd-579497cea66e-node-exporter-accelerators-collector-config\") pod \"node-exporter-p8pqx\" (UID: \"8f904272-e411-4995-aacd-579497cea66e\") " pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.138583 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.138497 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8f904272-e411-4995-aacd-579497cea66e-metrics-client-ca\") pod \"node-exporter-p8pqx\" (UID: \"8f904272-e411-4995-aacd-579497cea66e\") " pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.138583 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.138520 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8f904272-e411-4995-aacd-579497cea66e-node-exporter-textfile\") pod \"node-exporter-p8pqx\" (UID: \"8f904272-e411-4995-aacd-579497cea66e\") " pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.138583 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.138518 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8f904272-e411-4995-aacd-579497cea66e-sys\") pod \"node-exporter-p8pqx\" (UID: \"8f904272-e411-4995-aacd-579497cea66e\") " pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.138583 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.138562 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8f904272-e411-4995-aacd-579497cea66e-root\") pod \"node-exporter-p8pqx\" (UID: \"8f904272-e411-4995-aacd-579497cea66e\") " pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.138583 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.138583 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8f904272-e411-4995-aacd-579497cea66e-node-exporter-wtmp\") pod \"node-exporter-p8pqx\" (UID: \"8f904272-e411-4995-aacd-579497cea66e\") " pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.138933 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.138631 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/8f904272-e411-4995-aacd-579497cea66e-root\") pod \"node-exporter-p8pqx\" (UID: \"8f904272-e411-4995-aacd-579497cea66e\") " pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.138933 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.138671 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8f904272-e411-4995-aacd-579497cea66e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-p8pqx\" (UID: \"8f904272-e411-4995-aacd-579497cea66e\") " pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.138933 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.138696 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/8f904272-e411-4995-aacd-579497cea66e-node-exporter-wtmp\") pod \"node-exporter-p8pqx\" (UID: \"8f904272-e411-4995-aacd-579497cea66e\") " pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.138933 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.138712 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wrr9r\" (UniqueName: \"kubernetes.io/projected/8f904272-e411-4995-aacd-579497cea66e-kube-api-access-wrr9r\") pod \"node-exporter-p8pqx\" (UID: \"8f904272-e411-4995-aacd-579497cea66e\") " pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.139088 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.138924 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/8f904272-e411-4995-aacd-579497cea66e-node-exporter-textfile\") pod \"node-exporter-p8pqx\" (UID: \"8f904272-e411-4995-aacd-579497cea66e\") " pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.139192 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.139176 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/8f904272-e411-4995-aacd-579497cea66e-node-exporter-accelerators-collector-config\") pod \"node-exporter-p8pqx\" (UID: \"8f904272-e411-4995-aacd-579497cea66e\") " pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.139706 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.139686 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/8f904272-e411-4995-aacd-579497cea66e-metrics-client-ca\") pod \"node-exporter-p8pqx\" (UID: \"8f904272-e411-4995-aacd-579497cea66e\") " pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.140895 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.140878 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/8f904272-e411-4995-aacd-579497cea66e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-p8pqx\" (UID: \"8f904272-e411-4995-aacd-579497cea66e\") " pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.140975 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.140960 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/8f904272-e411-4995-aacd-579497cea66e-node-exporter-tls\") pod \"node-exporter-p8pqx\" (UID: \"8f904272-e411-4995-aacd-579497cea66e\") " pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.150271 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.150249 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrr9r\" (UniqueName: \"kubernetes.io/projected/8f904272-e411-4995-aacd-579497cea66e-kube-api-access-wrr9r\") pod \"node-exporter-p8pqx\" (UID: \"8f904272-e411-4995-aacd-579497cea66e\") " pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.274641 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.274612 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-p8pqx"
Apr 19 12:12:02.282573 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:12:02.282542 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f904272_e411_4995_aacd_579497cea66e.slice/crio-b78a0a96d8d6e48a36244a1925fe544940f9b82a3e39a5c8bf47cd0eee3f3c98 WatchSource:0}: Error finding container b78a0a96d8d6e48a36244a1925fe544940f9b82a3e39a5c8bf47cd0eee3f3c98: Status 404 returned error can't find the container with id b78a0a96d8d6e48a36244a1925fe544940f9b82a3e39a5c8bf47cd0eee3f3c98
Apr 19 12:12:02.911487 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:02.911455 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p8pqx" event={"ID":"8f904272-e411-4995-aacd-579497cea66e","Type":"ContainerStarted","Data":"b78a0a96d8d6e48a36244a1925fe544940f9b82a3e39a5c8bf47cd0eee3f3c98"}
Apr 19 12:12:03.918647 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:03.918612 2572 generic.go:358] "Generic (PLEG): container finished" podID="8f904272-e411-4995-aacd-579497cea66e" containerID="342276d476ccb3f1350d41a66119bbe422bddb95dc3b200e8e3f5fce425b99f9" exitCode=0
Apr 19 12:12:03.919037 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:03.918692 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p8pqx" event={"ID":"8f904272-e411-4995-aacd-579497cea66e","Type":"ContainerDied","Data":"342276d476ccb3f1350d41a66119bbe422bddb95dc3b200e8e3f5fce425b99f9"}
Apr 19 12:12:04.923298 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:04.923263 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p8pqx" event={"ID":"8f904272-e411-4995-aacd-579497cea66e","Type":"ContainerStarted","Data":"0aeab7a58ec77226c5198c9bcd25d865300eb96600f88ea90046abb044358dda"}
Apr 19 12:12:04.923298 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:04.923298 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-p8pqx" event={"ID":"8f904272-e411-4995-aacd-579497cea66e","Type":"ContainerStarted","Data":"8a91d181c325a027535e3a42fd03458fd8239a46b6fc783aec053dc5212563c1"}
Apr 19 12:12:04.941081 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:04.941034 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-p8pqx" podStartSLOduration=3.011823463 podStartE2EDuration="3.941020774s" podCreationTimestamp="2026-04-19 12:12:01 +0000 UTC" firstStartedPulling="2026-04-19 12:12:02.284358729 +0000 UTC m=+136.351530597" lastFinishedPulling="2026-04-19 12:12:03.213556035 +0000 UTC m=+137.280727908" observedRunningTime="2026-04-19 12:12:04.939433107 +0000 UTC m=+139.006605025" watchObservedRunningTime="2026-04-19 12:12:04.941020774 +0000 UTC m=+139.008192665"
Apr 19 12:12:14.862809 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:14.862772 2572 patch_prober.go:28] interesting pod/image-registry-56f9f776b7-q295c container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 19 12:12:14.863153 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:14.862821 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-56f9f776b7-q295c" podUID="8a059cfb-bede-4767-80e4-ef7061d18b06" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 19 12:12:16.894471 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:16.894438 2572 patch_prober.go:28] interesting pod/image-registry-56f9f776b7-q295c container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 19 12:12:16.894857 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:16.894489 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-56f9f776b7-q295c" podUID="8a059cfb-bede-4767-80e4-ef7061d18b06" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 19 12:12:20.792446 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:12:20.792406 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-447kt" podUID="f8b2bdcd-f07f-423f-a401-b0743ef1671f"
Apr 19 12:12:20.803714 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:12:20.803686 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-h7xrv" podUID="105ea9fb-4797-44a9-a30b-6d58c96ae4d6"
Apr 19 12:12:20.964448 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:20.964419 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-447kt"
Apr 19 12:12:22.857656 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:22.857626 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-848554b7d7-5rtfd"]
Apr 19 12:12:22.860722 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:22.860707 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-848554b7d7-5rtfd"
Apr 19 12:12:22.865087 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:22.864672 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-jfhvt\""
Apr 19 12:12:22.865087 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:22.864704 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\""
Apr 19 12:12:22.865087 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:22.864822 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\""
Apr 19 12:12:22.865087 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:22.864827 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 19 12:12:22.867743 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:22.865809 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\""
Apr 19 12:12:22.867743 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:22.866529 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\""
Apr 19 12:12:22.867743 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:22.866551 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\""
Apr 19 12:12:22.867743 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:22.866848 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 19 12:12:22.872583 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:22.872560 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-848554b7d7-5rtfd"]
Apr 19 12:12:22.874608 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:22.874588 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\""
Apr 19 12:12:22.988636 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:22.988616 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce63536c-616d-4276-b3a3-cfe318037bd0-service-ca\") pod \"console-848554b7d7-5rtfd\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " pod="openshift-console/console-848554b7d7-5rtfd"
Apr 19 12:12:22.988722 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:22.988641 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ce63536c-616d-4276-b3a3-cfe318037bd0-console-oauth-config\") pod \"console-848554b7d7-5rtfd\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " pod="openshift-console/console-848554b7d7-5rtfd"
Apr 19 12:12:22.988722 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:22.988663 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ce63536c-616d-4276-b3a3-cfe318037bd0-oauth-serving-cert\") pod \"console-848554b7d7-5rtfd\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " pod="openshift-console/console-848554b7d7-5rtfd"
Apr 19 12:12:22.988722 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:22.988705 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce63536c-616d-4276-b3a3-cfe318037bd0-console-serving-cert\") pod \"console-848554b7d7-5rtfd\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " pod="openshift-console/console-848554b7d7-5rtfd"
Apr 19 12:12:22.988843 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:22.988739 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce63536c-616d-4276-b3a3-cfe318037bd0-trusted-ca-bundle\") pod \"console-848554b7d7-5rtfd\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " pod="openshift-console/console-848554b7d7-5rtfd"
Apr 19 12:12:22.988843 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:22.988817 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqlbl\" (UniqueName: \"kubernetes.io/projected/ce63536c-616d-4276-b3a3-cfe318037bd0-kube-api-access-hqlbl\") pod \"console-848554b7d7-5rtfd\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " pod="openshift-console/console-848554b7d7-5rtfd"
Apr 19 12:12:22.988905 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:22.988844 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ce63536c-616d-4276-b3a3-cfe318037bd0-console-config\") pod \"console-848554b7d7-5rtfd\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " pod="openshift-console/console-848554b7d7-5rtfd"
Apr 19 12:12:23.089764 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:23.089740 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce63536c-616d-4276-b3a3-cfe318037bd0-trusted-ca-bundle\") pod \"console-848554b7d7-5rtfd\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " pod="openshift-console/console-848554b7d7-5rtfd"
Apr 19 12:12:23.089917 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:23.089900 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqlbl\" (UniqueName: \"kubernetes.io/projected/ce63536c-616d-4276-b3a3-cfe318037bd0-kube-api-access-hqlbl\") pod \"console-848554b7d7-5rtfd\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " pod="openshift-console/console-848554b7d7-5rtfd"
Apr 19 12:12:23.089988 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:23.089933 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ce63536c-616d-4276-b3a3-cfe318037bd0-console-config\") pod \"console-848554b7d7-5rtfd\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " pod="openshift-console/console-848554b7d7-5rtfd"
Apr 19 12:12:23.089988 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:23.089965 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce63536c-616d-4276-b3a3-cfe318037bd0-service-ca\") pod \"console-848554b7d7-5rtfd\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " pod="openshift-console/console-848554b7d7-5rtfd"
Apr 19 12:12:23.090095 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:23.089993 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ce63536c-616d-4276-b3a3-cfe318037bd0-console-oauth-config\") pod \"console-848554b7d7-5rtfd\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " pod="openshift-console/console-848554b7d7-5rtfd"
Apr 19 12:12:23.090095 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:23.090030 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ce63536c-616d-4276-b3a3-cfe318037bd0-oauth-serving-cert\") pod \"console-848554b7d7-5rtfd\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " pod="openshift-console/console-848554b7d7-5rtfd"
Apr 19 12:12:23.090095 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:23.090066 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce63536c-616d-4276-b3a3-cfe318037bd0-console-serving-cert\") pod \"console-848554b7d7-5rtfd\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " pod="openshift-console/console-848554b7d7-5rtfd"
Apr 19 12:12:23.090715 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:23.090687 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce63536c-616d-4276-b3a3-cfe318037bd0-trusted-ca-bundle\") pod \"console-848554b7d7-5rtfd\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " pod="openshift-console/console-848554b7d7-5rtfd"
Apr 19 12:12:23.090850 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:23.090698 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ce63536c-616d-4276-b3a3-cfe318037bd0-oauth-serving-cert\") pod \"console-848554b7d7-5rtfd\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " pod="openshift-console/console-848554b7d7-5rtfd"
Apr 19 12:12:23.090850 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:23.090698 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce63536c-616d-4276-b3a3-cfe318037bd0-service-ca\") pod \"console-848554b7d7-5rtfd\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " pod="openshift-console/console-848554b7d7-5rtfd"
Apr 19 12:12:23.091264 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:23.091236 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ce63536c-616d-4276-b3a3-cfe318037bd0-console-config\") pod \"console-848554b7d7-5rtfd\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " pod="openshift-console/console-848554b7d7-5rtfd"
Apr 19 12:12:23.092743 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:23.092717 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce63536c-616d-4276-b3a3-cfe318037bd0-console-serving-cert\") pod \"console-848554b7d7-5rtfd\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " pod="openshift-console/console-848554b7d7-5rtfd"
Apr 19 12:12:23.092854 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:23.092833 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ce63536c-616d-4276-b3a3-cfe318037bd0-console-oauth-config\") pod \"console-848554b7d7-5rtfd\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " pod="openshift-console/console-848554b7d7-5rtfd"
Apr 19 12:12:23.096939 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:23.096909 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqlbl\" (UniqueName: \"kubernetes.io/projected/ce63536c-616d-4276-b3a3-cfe318037bd0-kube-api-access-hqlbl\") pod \"console-848554b7d7-5rtfd\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " pod="openshift-console/console-848554b7d7-5rtfd"
Apr 19 12:12:23.174852 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:23.174797 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-848554b7d7-5rtfd"
Apr 19 12:12:23.283307 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:23.283240 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-848554b7d7-5rtfd"]
Apr 19 12:12:23.285675 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:12:23.285643 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podce63536c_616d_4276_b3a3_cfe318037bd0.slice/crio-97e2c278129757fd25d5d7f035313e8ddd16cb6f96651fd662ed83c5c57b4a17 WatchSource:0}: Error finding container 97e2c278129757fd25d5d7f035313e8ddd16cb6f96651fd662ed83c5c57b4a17: Status 404 returned error can't find the container with id 97e2c278129757fd25d5d7f035313e8ddd16cb6f96651fd662ed83c5c57b4a17
Apr 19 12:12:23.973982 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:23.973941 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-848554b7d7-5rtfd" event={"ID":"ce63536c-616d-4276-b3a3-cfe318037bd0","Type":"ContainerStarted","Data":"97e2c278129757fd25d5d7f035313e8ddd16cb6f96651fd662ed83c5c57b4a17"}
Apr 19 12:12:24.864369 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:24.864338 2572 patch_prober.go:28] interesting pod/image-registry-56f9f776b7-q295c container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]}
Apr 19 12:12:24.864562 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:24.864384 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-56f9f776b7-q295c" podUID="8a059cfb-bede-4767-80e4-ef7061d18b06" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 19 12:12:25.710781 ip-10-0-140-237 kubenswrapper[2572]: I0419 
12:12:25.710722 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8b2bdcd-f07f-423f-a401-b0743ef1671f-metrics-tls\") pod \"dns-default-447kt\" (UID: \"f8b2bdcd-f07f-423f-a401-b0743ef1671f\") " pod="openshift-dns/dns-default-447kt" Apr 19 12:12:25.710781 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:25.710775 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-cert\") pod \"ingress-canary-h7xrv\" (UID: \"105ea9fb-4797-44a9-a30b-6d58c96ae4d6\") " pod="openshift-ingress-canary/ingress-canary-h7xrv" Apr 19 12:12:25.713353 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:25.713325 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8b2bdcd-f07f-423f-a401-b0743ef1671f-metrics-tls\") pod \"dns-default-447kt\" (UID: \"f8b2bdcd-f07f-423f-a401-b0743ef1671f\") " pod="openshift-dns/dns-default-447kt" Apr 19 12:12:25.713778 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:25.713737 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/105ea9fb-4797-44a9-a30b-6d58c96ae4d6-cert\") pod \"ingress-canary-h7xrv\" (UID: \"105ea9fb-4797-44a9-a30b-6d58c96ae4d6\") " pod="openshift-ingress-canary/ingress-canary-h7xrv" Apr 19 12:12:25.768441 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:25.768409 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-7hvqw\"" Apr 19 12:12:25.776474 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:25.776448 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-447kt" Apr 19 12:12:26.085533 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:26.085507 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-447kt"] Apr 19 12:12:26.088017 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:12:26.087993 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8b2bdcd_f07f_423f_a401_b0743ef1671f.slice/crio-e501415643d9a352a6270dfc1d9b34fdf59cf9d4863395718baf1c6b04c199d3 WatchSource:0}: Error finding container e501415643d9a352a6270dfc1d9b34fdf59cf9d4863395718baf1c6b04c199d3: Status 404 returned error can't find the container with id e501415643d9a352a6270dfc1d9b34fdf59cf9d4863395718baf1c6b04c199d3 Apr 19 12:12:26.893743 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:26.893703 2572 patch_prober.go:28] interesting pod/image-registry-56f9f776b7-q295c container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 19 12:12:26.894147 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:26.893790 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-56f9f776b7-q295c" podUID="8a059cfb-bede-4767-80e4-ef7061d18b06" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 19 12:12:26.982815 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:26.982780 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-848554b7d7-5rtfd" event={"ID":"ce63536c-616d-4276-b3a3-cfe318037bd0","Type":"ContainerStarted","Data":"cd650c67c40e89e5f60e328109cf71c84d6536548a5bf147a3a8d473c0cd6209"} Apr 19 12:12:26.983948 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:26.983921 2572 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-447kt" event={"ID":"f8b2bdcd-f07f-423f-a401-b0743ef1671f","Type":"ContainerStarted","Data":"e501415643d9a352a6270dfc1d9b34fdf59cf9d4863395718baf1c6b04c199d3"} Apr 19 12:12:27.000008 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:26.999961 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-848554b7d7-5rtfd" podStartSLOduration=2.2529847800000002 podStartE2EDuration="4.999949617s" podCreationTimestamp="2026-04-19 12:12:22 +0000 UTC" firstStartedPulling="2026-04-19 12:12:23.287650592 +0000 UTC m=+157.354822462" lastFinishedPulling="2026-04-19 12:12:26.034615422 +0000 UTC m=+160.101787299" observedRunningTime="2026-04-19 12:12:26.997712541 +0000 UTC m=+161.064884432" watchObservedRunningTime="2026-04-19 12:12:26.999949617 +0000 UTC m=+161.067121508" Apr 19 12:12:27.988401 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:27.988364 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-447kt" event={"ID":"f8b2bdcd-f07f-423f-a401-b0743ef1671f","Type":"ContainerStarted","Data":"b424b88464291bea0d0b3133cb40a9a9e770c871e6dd1bc919d39f6e5324725e"} Apr 19 12:12:27.988401 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:27.988400 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-447kt" event={"ID":"f8b2bdcd-f07f-423f-a401-b0743ef1671f","Type":"ContainerStarted","Data":"c71d232a652820612d4787819e73a6c9db4d714ffbcd74d89416d64f35e90788"} Apr 19 12:12:27.988902 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:27.988623 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-447kt" Apr 19 12:12:28.004478 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:28.004434 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-447kt" podStartSLOduration=129.802611366 
podStartE2EDuration="2m11.004423157s" podCreationTimestamp="2026-04-19 12:10:17 +0000 UTC" firstStartedPulling="2026-04-19 12:12:26.089778291 +0000 UTC m=+160.156950160" lastFinishedPulling="2026-04-19 12:12:27.291590073 +0000 UTC m=+161.358761951" observedRunningTime="2026-04-19 12:12:28.004228684 +0000 UTC m=+162.071400577" watchObservedRunningTime="2026-04-19 12:12:28.004423157 +0000 UTC m=+162.071595048" Apr 19 12:12:31.891975 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:31.891939 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh" podUID="44839d59-49b3-4c2d-a204-f600cf1975bb" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 19 12:12:33.175012 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:33.174973 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-848554b7d7-5rtfd" Apr 19 12:12:33.175544 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:33.175046 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-848554b7d7-5rtfd" Apr 19 12:12:33.176362 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:33.176336 2572 patch_prober.go:28] interesting pod/console-848554b7d7-5rtfd container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.133.0.12:8443/health\": dial tcp 10.133.0.12:8443: connect: connection refused" start-of-body= Apr 19 12:12:33.176446 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:33.176393 2572 prober.go:120] "Probe failed" probeType="Startup" pod="openshift-console/console-848554b7d7-5rtfd" podUID="ce63536c-616d-4276-b3a3-cfe318037bd0" containerName="console" probeResult="failure" output="Get \"https://10.133.0.12:8443/health\": dial tcp 10.133.0.12:8443: connect: connection refused" Apr 19 12:12:34.493095 ip-10-0-140-237 kubenswrapper[2572]: I0419 
12:12:34.493059 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-h7xrv" Apr 19 12:12:34.496275 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:34.496257 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-zfmpv\"" Apr 19 12:12:34.504253 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:34.504236 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-h7xrv" Apr 19 12:12:34.615329 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:34.615308 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-h7xrv"] Apr 19 12:12:34.617621 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:12:34.617585 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod105ea9fb_4797_44a9_a30b_6d58c96ae4d6.slice/crio-98db888ebf9233cf910627361faedbea64b4811696badba5f605f4ae95dece8c WatchSource:0}: Error finding container 98db888ebf9233cf910627361faedbea64b4811696badba5f605f4ae95dece8c: Status 404 returned error can't find the container with id 98db888ebf9233cf910627361faedbea64b4811696badba5f605f4ae95dece8c Apr 19 12:12:34.862094 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:34.862068 2572 patch_prober.go:28] interesting pod/image-registry-56f9f776b7-q295c container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 19 12:12:34.862217 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:34.862121 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-56f9f776b7-q295c" podUID="8a059cfb-bede-4767-80e4-ef7061d18b06" 
containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 19 12:12:34.862217 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:34.862159 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:12:34.862576 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:34.862543 2572 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="registry" containerStatusID={"Type":"cri-o","ID":"7ecdc5979fb63360201e9b51ecd014e121aeb0b19a30f0715263f96b841c9f68"} pod="openshift-image-registry/image-registry-56f9f776b7-q295c" containerMessage="Container registry failed liveness probe, will be restarted" Apr 19 12:12:34.865849 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:34.865826 2572 patch_prober.go:28] interesting pod/image-registry-56f9f776b7-q295c container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 19 12:12:34.865958 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:34.865861 2572 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-56f9f776b7-q295c" podUID="8a059cfb-bede-4767-80e4-ef7061d18b06" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 19 12:12:35.007680 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:35.007643 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-h7xrv" event={"ID":"105ea9fb-4797-44a9-a30b-6d58c96ae4d6","Type":"ContainerStarted","Data":"98db888ebf9233cf910627361faedbea64b4811696badba5f605f4ae95dece8c"} Apr 19 12:12:37.013638 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:37.013603 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-canary/ingress-canary-h7xrv" event={"ID":"105ea9fb-4797-44a9-a30b-6d58c96ae4d6","Type":"ContainerStarted","Data":"b678fe72ad40196f69fafc86c4af77c6f18cdaa47489b425bd65b65ccfb91204"} Apr 19 12:12:37.028515 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:37.028473 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-h7xrv" podStartSLOduration=138.6076107 podStartE2EDuration="2m20.028460793s" podCreationTimestamp="2026-04-19 12:10:17 +0000 UTC" firstStartedPulling="2026-04-19 12:12:34.619355241 +0000 UTC m=+168.686527125" lastFinishedPulling="2026-04-19 12:12:36.040205338 +0000 UTC m=+170.107377218" observedRunningTime="2026-04-19 12:12:37.027243352 +0000 UTC m=+171.094415244" watchObservedRunningTime="2026-04-19 12:12:37.028460793 +0000 UTC m=+171.095632684" Apr 19 12:12:37.994422 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:37.994389 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-447kt" Apr 19 12:12:41.891830 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:41.891784 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh" podUID="44839d59-49b3-4c2d-a204-f600cf1975bb" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 19 12:12:43.178907 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:43.178876 2572 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-848554b7d7-5rtfd" Apr 19 12:12:43.182690 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:43.182666 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-848554b7d7-5rtfd" Apr 19 12:12:44.866193 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:44.866166 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:12:50.190973 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:50.190945 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-447kt_f8b2bdcd-f07f-423f-a401-b0743ef1671f/dns/0.log" Apr 19 12:12:50.391966 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:50.391938 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-447kt_f8b2bdcd-f07f-423f-a401-b0743ef1671f/kube-rbac-proxy/0.log" Apr 19 12:12:51.791323 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:51.791289 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-m897s_d6bccaec-31af-4abb-8335-372cc94da827/dns-node-resolver/0.log" Apr 19 12:12:51.891963 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:51.891927 2572 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh" podUID="44839d59-49b3-4c2d-a204-f600cf1975bb" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 19 12:12:51.892102 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:51.891996 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh" Apr 19 12:12:51.892423 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:51.892406 2572 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"bc41bb2560021ea8e8d6eaba7888489c218f8831b3f2c77f91f78b8c4eb926ac"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 19 12:12:51.892467 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:51.892442 2572 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh" podUID="44839d59-49b3-4c2d-a204-f600cf1975bb" containerName="service-proxy" containerID="cri-o://bc41bb2560021ea8e8d6eaba7888489c218f8831b3f2c77f91f78b8c4eb926ac" gracePeriod=30 Apr 19 12:12:52.051480 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:52.051444 2572 generic.go:358] "Generic (PLEG): container finished" podID="44839d59-49b3-4c2d-a204-f600cf1975bb" containerID="bc41bb2560021ea8e8d6eaba7888489c218f8831b3f2c77f91f78b8c4eb926ac" exitCode=2 Apr 19 12:12:52.051628 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:52.051506 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh" event={"ID":"44839d59-49b3-4c2d-a204-f600cf1975bb","Type":"ContainerDied","Data":"bc41bb2560021ea8e8d6eaba7888489c218f8831b3f2c77f91f78b8c4eb926ac"} Apr 19 12:12:52.391645 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:52.391560 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-h7xrv_105ea9fb-4797-44a9-a30b-6d58c96ae4d6/serve-healthcheck-canary/0.log" Apr 19 12:12:53.056027 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:53.055993 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-859549b7f4-k2mkh" event={"ID":"44839d59-49b3-4c2d-a204-f600cf1975bb","Type":"ContainerStarted","Data":"c8b9a39d7353537e4d4a9dcc66d6845a17669a1b889af1d61c5c1722c74c8d6f"} Apr 19 12:12:59.881442 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:12:59.881393 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-56f9f776b7-q295c" podUID="8a059cfb-bede-4767-80e4-ef7061d18b06" containerName="registry" containerID="cri-o://7ecdc5979fb63360201e9b51ecd014e121aeb0b19a30f0715263f96b841c9f68" gracePeriod=30 Apr 19 12:13:01.080404 ip-10-0-140-237 kubenswrapper[2572]: 
I0419 12:13:01.080379 2572 generic.go:358] "Generic (PLEG): container finished" podID="8a059cfb-bede-4767-80e4-ef7061d18b06" containerID="7ecdc5979fb63360201e9b51ecd014e121aeb0b19a30f0715263f96b841c9f68" exitCode=0 Apr 19 12:13:01.080716 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:13:01.080459 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-56f9f776b7-q295c" event={"ID":"8a059cfb-bede-4767-80e4-ef7061d18b06","Type":"ContainerDied","Data":"7ecdc5979fb63360201e9b51ecd014e121aeb0b19a30f0715263f96b841c9f68"} Apr 19 12:13:01.080716 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:13:01.080497 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-56f9f776b7-q295c" event={"ID":"8a059cfb-bede-4767-80e4-ef7061d18b06","Type":"ContainerStarted","Data":"ea4677d8885cf94ea1107bd37c83f39b52483e66e4807c7eceacd56295b4ca3c"} Apr 19 12:13:01.080716 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:13:01.080529 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:13:22.087707 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:13:22.087680 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-56f9f776b7-q295c" Apr 19 12:13:47.849151 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:13:47.849121 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-848554b7d7-5rtfd"] Apr 19 12:14:12.867456 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:12.867349 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-848554b7d7-5rtfd" podUID="ce63536c-616d-4276-b3a3-cfe318037bd0" containerName="console" containerID="cri-o://cd650c67c40e89e5f60e328109cf71c84d6536548a5bf147a3a8d473c0cd6209" gracePeriod=15 Apr 19 12:14:13.099580 ip-10-0-140-237 kubenswrapper[2572]: I0419 
12:14:13.099560 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-848554b7d7-5rtfd_ce63536c-616d-4276-b3a3-cfe318037bd0/console/0.log" Apr 19 12:14:13.099687 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.099630 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-848554b7d7-5rtfd" Apr 19 12:14:13.244630 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.244547 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ce63536c-616d-4276-b3a3-cfe318037bd0-console-oauth-config\") pod \"ce63536c-616d-4276-b3a3-cfe318037bd0\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " Apr 19 12:14:13.244630 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.244608 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ce63536c-616d-4276-b3a3-cfe318037bd0-console-config\") pod \"ce63536c-616d-4276-b3a3-cfe318037bd0\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " Apr 19 12:14:13.244630 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.244639 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ce63536c-616d-4276-b3a3-cfe318037bd0-oauth-serving-cert\") pod \"ce63536c-616d-4276-b3a3-cfe318037bd0\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " Apr 19 12:14:13.244919 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.244656 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce63536c-616d-4276-b3a3-cfe318037bd0-console-serving-cert\") pod \"ce63536c-616d-4276-b3a3-cfe318037bd0\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " Apr 19 12:14:13.244919 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.244682 2572 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce63536c-616d-4276-b3a3-cfe318037bd0-service-ca\") pod \"ce63536c-616d-4276-b3a3-cfe318037bd0\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " Apr 19 12:14:13.244919 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.244696 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce63536c-616d-4276-b3a3-cfe318037bd0-trusted-ca-bundle\") pod \"ce63536c-616d-4276-b3a3-cfe318037bd0\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " Apr 19 12:14:13.244919 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.244729 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hqlbl\" (UniqueName: \"kubernetes.io/projected/ce63536c-616d-4276-b3a3-cfe318037bd0-kube-api-access-hqlbl\") pod \"ce63536c-616d-4276-b3a3-cfe318037bd0\" (UID: \"ce63536c-616d-4276-b3a3-cfe318037bd0\") " Apr 19 12:14:13.245114 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.245097 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce63536c-616d-4276-b3a3-cfe318037bd0-console-config" (OuterVolumeSpecName: "console-config") pod "ce63536c-616d-4276-b3a3-cfe318037bd0" (UID: "ce63536c-616d-4276-b3a3-cfe318037bd0"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:14:13.245171 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.245108 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce63536c-616d-4276-b3a3-cfe318037bd0-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "ce63536c-616d-4276-b3a3-cfe318037bd0" (UID: "ce63536c-616d-4276-b3a3-cfe318037bd0"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:14:13.245229 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.245206 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce63536c-616d-4276-b3a3-cfe318037bd0-service-ca" (OuterVolumeSpecName: "service-ca") pod "ce63536c-616d-4276-b3a3-cfe318037bd0" (UID: "ce63536c-616d-4276-b3a3-cfe318037bd0"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:14:13.245229 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.245213 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ce63536c-616d-4276-b3a3-cfe318037bd0-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "ce63536c-616d-4276-b3a3-cfe318037bd0" (UID: "ce63536c-616d-4276-b3a3-cfe318037bd0"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 19 12:14:13.246887 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.246855 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce63536c-616d-4276-b3a3-cfe318037bd0-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "ce63536c-616d-4276-b3a3-cfe318037bd0" (UID: "ce63536c-616d-4276-b3a3-cfe318037bd0"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:14:13.246985 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.246927 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ce63536c-616d-4276-b3a3-cfe318037bd0-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "ce63536c-616d-4276-b3a3-cfe318037bd0" (UID: "ce63536c-616d-4276-b3a3-cfe318037bd0"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 19 12:14:13.247072 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.247052 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ce63536c-616d-4276-b3a3-cfe318037bd0-kube-api-access-hqlbl" (OuterVolumeSpecName: "kube-api-access-hqlbl") pod "ce63536c-616d-4276-b3a3-cfe318037bd0" (UID: "ce63536c-616d-4276-b3a3-cfe318037bd0"). InnerVolumeSpecName "kube-api-access-hqlbl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:14:13.270177 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.270153 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-848554b7d7-5rtfd_ce63536c-616d-4276-b3a3-cfe318037bd0/console/0.log" Apr 19 12:14:13.270280 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.270208 2572 generic.go:358] "Generic (PLEG): container finished" podID="ce63536c-616d-4276-b3a3-cfe318037bd0" containerID="cd650c67c40e89e5f60e328109cf71c84d6536548a5bf147a3a8d473c0cd6209" exitCode=2 Apr 19 12:14:13.270280 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.270262 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-848554b7d7-5rtfd" Apr 19 12:14:13.270389 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.270275 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-848554b7d7-5rtfd" event={"ID":"ce63536c-616d-4276-b3a3-cfe318037bd0","Type":"ContainerDied","Data":"cd650c67c40e89e5f60e328109cf71c84d6536548a5bf147a3a8d473c0cd6209"} Apr 19 12:14:13.270389 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.270310 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-848554b7d7-5rtfd" event={"ID":"ce63536c-616d-4276-b3a3-cfe318037bd0","Type":"ContainerDied","Data":"97e2c278129757fd25d5d7f035313e8ddd16cb6f96651fd662ed83c5c57b4a17"} Apr 19 12:14:13.270389 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.270331 2572 scope.go:117] "RemoveContainer" containerID="cd650c67c40e89e5f60e328109cf71c84d6536548a5bf147a3a8d473c0cd6209" Apr 19 12:14:13.278378 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.278362 2572 scope.go:117] "RemoveContainer" containerID="cd650c67c40e89e5f60e328109cf71c84d6536548a5bf147a3a8d473c0cd6209" Apr 19 12:14:13.278648 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:14:13.278630 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd650c67c40e89e5f60e328109cf71c84d6536548a5bf147a3a8d473c0cd6209\": container with ID starting with cd650c67c40e89e5f60e328109cf71c84d6536548a5bf147a3a8d473c0cd6209 not found: ID does not exist" containerID="cd650c67c40e89e5f60e328109cf71c84d6536548a5bf147a3a8d473c0cd6209" Apr 19 12:14:13.278694 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.278657 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd650c67c40e89e5f60e328109cf71c84d6536548a5bf147a3a8d473c0cd6209"} err="failed to get container status \"cd650c67c40e89e5f60e328109cf71c84d6536548a5bf147a3a8d473c0cd6209\": rpc error: code = 
NotFound desc = could not find container \"cd650c67c40e89e5f60e328109cf71c84d6536548a5bf147a3a8d473c0cd6209\": container with ID starting with cd650c67c40e89e5f60e328109cf71c84d6536548a5bf147a3a8d473c0cd6209 not found: ID does not exist" Apr 19 12:14:13.289643 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.289621 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-848554b7d7-5rtfd"] Apr 19 12:14:13.292644 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.292625 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-848554b7d7-5rtfd"] Apr 19 12:14:13.345473 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.345451 2572 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ce63536c-616d-4276-b3a3-cfe318037bd0-console-config\") on node \"ip-10-0-140-237.ec2.internal\" DevicePath \"\"" Apr 19 12:14:13.345473 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.345472 2572 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ce63536c-616d-4276-b3a3-cfe318037bd0-oauth-serving-cert\") on node \"ip-10-0-140-237.ec2.internal\" DevicePath \"\"" Apr 19 12:14:13.345592 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.345482 2572 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ce63536c-616d-4276-b3a3-cfe318037bd0-console-serving-cert\") on node \"ip-10-0-140-237.ec2.internal\" DevicePath \"\"" Apr 19 12:14:13.345592 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.345492 2572 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ce63536c-616d-4276-b3a3-cfe318037bd0-service-ca\") on node \"ip-10-0-140-237.ec2.internal\" DevicePath \"\"" Apr 19 12:14:13.345592 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.345502 2572 reconciler_common.go:299] "Volume detached 
for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ce63536c-616d-4276-b3a3-cfe318037bd0-trusted-ca-bundle\") on node \"ip-10-0-140-237.ec2.internal\" DevicePath \"\"" Apr 19 12:14:13.345592 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.345510 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hqlbl\" (UniqueName: \"kubernetes.io/projected/ce63536c-616d-4276-b3a3-cfe318037bd0-kube-api-access-hqlbl\") on node \"ip-10-0-140-237.ec2.internal\" DevicePath \"\"" Apr 19 12:14:13.345592 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:13.345519 2572 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ce63536c-616d-4276-b3a3-cfe318037bd0-console-oauth-config\") on node \"ip-10-0-140-237.ec2.internal\" DevicePath \"\"" Apr 19 12:14:14.497111 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:14.497075 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ce63536c-616d-4276-b3a3-cfe318037bd0" path="/var/lib/kubelet/pods/ce63536c-616d-4276-b3a3-cfe318037bd0/volumes" Apr 19 12:14:46.382672 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:14:46.382645 2572 kubelet.go:1628] "Image garbage collection succeeded" Apr 19 12:15:15.067805 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:15.067747 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-phh8k"] Apr 19 12:15:15.073066 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:15.068036 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ce63536c-616d-4276-b3a3-cfe318037bd0" containerName="console" Apr 19 12:15:15.073066 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:15.068048 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="ce63536c-616d-4276-b3a3-cfe318037bd0" containerName="console" Apr 19 12:15:15.073066 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:15.068100 2572 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="ce63536c-616d-4276-b3a3-cfe318037bd0" containerName="console" Apr 19 12:15:15.073066 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:15.071966 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-phh8k" Apr 19 12:15:15.075929 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:15.075899 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 19 12:15:15.076875 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:15.076853 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 19 12:15:15.077008 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:15.076874 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-nqdqg\"" Apr 19 12:15:15.079298 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:15.079242 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-phh8k"] Apr 19 12:15:15.151603 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:15.151575 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsmjk\" (UniqueName: \"kubernetes.io/projected/13084195-a6bd-4050-a80b-9ca310e40251-kube-api-access-fsmjk\") pod \"cert-manager-cainjector-8966b78d4-phh8k\" (UID: \"13084195-a6bd-4050-a80b-9ca310e40251\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-phh8k" Apr 19 12:15:15.151706 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:15.151614 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/13084195-a6bd-4050-a80b-9ca310e40251-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-phh8k\" (UID: \"13084195-a6bd-4050-a80b-9ca310e40251\") " 
pod="cert-manager/cert-manager-cainjector-8966b78d4-phh8k" Apr 19 12:15:15.252911 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:15.252879 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsmjk\" (UniqueName: \"kubernetes.io/projected/13084195-a6bd-4050-a80b-9ca310e40251-kube-api-access-fsmjk\") pod \"cert-manager-cainjector-8966b78d4-phh8k\" (UID: \"13084195-a6bd-4050-a80b-9ca310e40251\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-phh8k" Apr 19 12:15:15.253008 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:15.252921 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/13084195-a6bd-4050-a80b-9ca310e40251-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-phh8k\" (UID: \"13084195-a6bd-4050-a80b-9ca310e40251\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-phh8k" Apr 19 12:15:15.260811 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:15.260782 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsmjk\" (UniqueName: \"kubernetes.io/projected/13084195-a6bd-4050-a80b-9ca310e40251-kube-api-access-fsmjk\") pod \"cert-manager-cainjector-8966b78d4-phh8k\" (UID: \"13084195-a6bd-4050-a80b-9ca310e40251\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-phh8k" Apr 19 12:15:15.260899 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:15.260813 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/13084195-a6bd-4050-a80b-9ca310e40251-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-phh8k\" (UID: \"13084195-a6bd-4050-a80b-9ca310e40251\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-phh8k" Apr 19 12:15:15.383729 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:15.383676 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-phh8k" Apr 19 12:15:15.493447 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:15.493416 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-phh8k"] Apr 19 12:15:15.497475 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:15:15.497451 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13084195_a6bd_4050_a80b_9ca310e40251.slice/crio-bdd41362769b47bc1c8e5022eed54927c174c991375abdb0fb17bebb63245d25 WatchSource:0}: Error finding container bdd41362769b47bc1c8e5022eed54927c174c991375abdb0fb17bebb63245d25: Status 404 returned error can't find the container with id bdd41362769b47bc1c8e5022eed54927c174c991375abdb0fb17bebb63245d25 Apr 19 12:15:15.499280 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:15.499265 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 19 12:15:16.433429 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:16.433391 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-phh8k" event={"ID":"13084195-a6bd-4050-a80b-9ca310e40251","Type":"ContainerStarted","Data":"bdd41362769b47bc1c8e5022eed54927c174c991375abdb0fb17bebb63245d25"} Apr 19 12:15:19.444385 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:19.444295 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-phh8k" event={"ID":"13084195-a6bd-4050-a80b-9ca310e40251","Type":"ContainerStarted","Data":"8cdc1a54063f3b92b24abd299279cc70c9b450f5c38f9241cad6e28f20dd5e25"} Apr 19 12:15:19.460862 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:19.460811 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-phh8k" podStartSLOduration=0.836584754 podStartE2EDuration="4.460797403s" 
podCreationTimestamp="2026-04-19 12:15:15 +0000 UTC" firstStartedPulling="2026-04-19 12:15:15.499387904 +0000 UTC m=+329.566559772" lastFinishedPulling="2026-04-19 12:15:19.123600537 +0000 UTC m=+333.190772421" observedRunningTime="2026-04-19 12:15:19.460144137 +0000 UTC m=+333.527316028" watchObservedRunningTime="2026-04-19 12:15:19.460797403 +0000 UTC m=+333.527969293" Apr 19 12:15:38.349139 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:38.349063 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-676bcb86f4-67clv"] Apr 19 12:15:38.352280 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:38.352263 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-67clv" Apr 19 12:15:38.354906 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:38.354883 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\"" Apr 19 12:15:38.355190 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:38.355173 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\"" Apr 19 12:15:38.355242 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:38.355202 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-66xtk\"" Apr 19 12:15:38.355242 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:38.355231 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\"" Apr 19 12:15:38.355330 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:38.355241 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\"" Apr 19 12:15:38.367865 ip-10-0-140-237 kubenswrapper[2572]: I0419 
12:15:38.367837 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-676bcb86f4-67clv"] Apr 19 12:15:38.508476 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:38.508442 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/15457b9f-fff7-48f5-ab7d-7e79bb792e93-webhook-cert\") pod \"opendatahub-operator-controller-manager-676bcb86f4-67clv\" (UID: \"15457b9f-fff7-48f5-ab7d-7e79bb792e93\") " pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-67clv" Apr 19 12:15:38.508620 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:38.508485 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsmnd\" (UniqueName: \"kubernetes.io/projected/15457b9f-fff7-48f5-ab7d-7e79bb792e93-kube-api-access-lsmnd\") pod \"opendatahub-operator-controller-manager-676bcb86f4-67clv\" (UID: \"15457b9f-fff7-48f5-ab7d-7e79bb792e93\") " pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-67clv" Apr 19 12:15:38.508620 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:38.508516 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/15457b9f-fff7-48f5-ab7d-7e79bb792e93-apiservice-cert\") pod \"opendatahub-operator-controller-manager-676bcb86f4-67clv\" (UID: \"15457b9f-fff7-48f5-ab7d-7e79bb792e93\") " pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-67clv" Apr 19 12:15:38.609510 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:38.609429 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/15457b9f-fff7-48f5-ab7d-7e79bb792e93-webhook-cert\") pod \"opendatahub-operator-controller-manager-676bcb86f4-67clv\" (UID: \"15457b9f-fff7-48f5-ab7d-7e79bb792e93\") " 
pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-67clv" Apr 19 12:15:38.609510 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:38.609476 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lsmnd\" (UniqueName: \"kubernetes.io/projected/15457b9f-fff7-48f5-ab7d-7e79bb792e93-kube-api-access-lsmnd\") pod \"opendatahub-operator-controller-manager-676bcb86f4-67clv\" (UID: \"15457b9f-fff7-48f5-ab7d-7e79bb792e93\") " pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-67clv" Apr 19 12:15:38.609510 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:38.609508 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/15457b9f-fff7-48f5-ab7d-7e79bb792e93-apiservice-cert\") pod \"opendatahub-operator-controller-manager-676bcb86f4-67clv\" (UID: \"15457b9f-fff7-48f5-ab7d-7e79bb792e93\") " pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-67clv" Apr 19 12:15:38.611901 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:38.611877 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/15457b9f-fff7-48f5-ab7d-7e79bb792e93-webhook-cert\") pod \"opendatahub-operator-controller-manager-676bcb86f4-67clv\" (UID: \"15457b9f-fff7-48f5-ab7d-7e79bb792e93\") " pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-67clv" Apr 19 12:15:38.612025 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:38.612008 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/15457b9f-fff7-48f5-ab7d-7e79bb792e93-apiservice-cert\") pod \"opendatahub-operator-controller-manager-676bcb86f4-67clv\" (UID: \"15457b9f-fff7-48f5-ab7d-7e79bb792e93\") " pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-67clv" Apr 19 12:15:38.624880 ip-10-0-140-237 kubenswrapper[2572]: I0419 
12:15:38.624854 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsmnd\" (UniqueName: \"kubernetes.io/projected/15457b9f-fff7-48f5-ab7d-7e79bb792e93-kube-api-access-lsmnd\") pod \"opendatahub-operator-controller-manager-676bcb86f4-67clv\" (UID: \"15457b9f-fff7-48f5-ab7d-7e79bb792e93\") " pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-67clv" Apr 19 12:15:38.661657 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:38.661623 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-67clv" Apr 19 12:15:38.785497 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:38.785461 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-676bcb86f4-67clv"] Apr 19 12:15:38.788817 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:15:38.788795 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15457b9f_fff7_48f5_ab7d_7e79bb792e93.slice/crio-6dc3c162cd64683c4ded827f65d69649d1f7fc0b68bb963d9943595f020c2a14 WatchSource:0}: Error finding container 6dc3c162cd64683c4ded827f65d69649d1f7fc0b68bb963d9943595f020c2a14: Status 404 returned error can't find the container with id 6dc3c162cd64683c4ded827f65d69649d1f7fc0b68bb963d9943595f020c2a14 Apr 19 12:15:39.491890 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:39.491851 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-67clv" event={"ID":"15457b9f-fff7-48f5-ab7d-7e79bb792e93","Type":"ContainerStarted","Data":"6dc3c162cd64683c4ded827f65d69649d1f7fc0b68bb963d9943595f020c2a14"} Apr 19 12:15:42.500919 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:42.500816 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-67clv" 
event={"ID":"15457b9f-fff7-48f5-ab7d-7e79bb792e93","Type":"ContainerStarted","Data":"39d78e566910daffc9de70546aa280bf46a969ce0d17282dbd38c8fa95734bba"} Apr 19 12:15:42.501365 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:42.500963 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-67clv" Apr 19 12:15:42.521357 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:42.521311 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-67clv" podStartSLOduration=1.7176229840000001 podStartE2EDuration="4.521300311s" podCreationTimestamp="2026-04-19 12:15:38 +0000 UTC" firstStartedPulling="2026-04-19 12:15:38.790622442 +0000 UTC m=+352.857794314" lastFinishedPulling="2026-04-19 12:15:41.594299768 +0000 UTC m=+355.661471641" observedRunningTime="2026-04-19 12:15:42.519736894 +0000 UTC m=+356.586908786" watchObservedRunningTime="2026-04-19 12:15:42.521300311 +0000 UTC m=+356.588472202" Apr 19 12:15:53.506050 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:53.506016 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-676bcb86f4-67clv" Apr 19 12:15:55.935463 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:55.935425 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-58497579d8-6w699"] Apr 19 12:15:55.937647 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:55.937624 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-58497579d8-6w699" Apr 19 12:15:55.941247 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:55.941226 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-9qgz6\"" Apr 19 12:15:55.941247 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:55.941237 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 19 12:15:55.941413 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:55.941276 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\"" Apr 19 12:15:55.941573 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:55.941554 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\"" Apr 19 12:15:55.941680 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:55.941559 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 19 12:15:55.947375 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:55.947357 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-58497579d8-6w699"] Apr 19 12:15:56.027154 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:56.027128 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/00f08ae1-8940-40d9-b49e-4c6be9ca6c92-tls-certs\") pod \"kube-auth-proxy-58497579d8-6w699\" (UID: \"00f08ae1-8940-40d9-b49e-4c6be9ca6c92\") " pod="openshift-ingress/kube-auth-proxy-58497579d8-6w699" Apr 19 12:15:56.027295 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:56.027163 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/00f08ae1-8940-40d9-b49e-4c6be9ca6c92-tmp\") pod \"kube-auth-proxy-58497579d8-6w699\" (UID: \"00f08ae1-8940-40d9-b49e-4c6be9ca6c92\") " pod="openshift-ingress/kube-auth-proxy-58497579d8-6w699" Apr 19 12:15:56.027295 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:56.027193 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2bvw\" (UniqueName: \"kubernetes.io/projected/00f08ae1-8940-40d9-b49e-4c6be9ca6c92-kube-api-access-j2bvw\") pod \"kube-auth-proxy-58497579d8-6w699\" (UID: \"00f08ae1-8940-40d9-b49e-4c6be9ca6c92\") " pod="openshift-ingress/kube-auth-proxy-58497579d8-6w699" Apr 19 12:15:56.127834 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:56.127802 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/00f08ae1-8940-40d9-b49e-4c6be9ca6c92-tls-certs\") pod \"kube-auth-proxy-58497579d8-6w699\" (UID: \"00f08ae1-8940-40d9-b49e-4c6be9ca6c92\") " pod="openshift-ingress/kube-auth-proxy-58497579d8-6w699" Apr 19 12:15:56.127982 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:56.127841 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/00f08ae1-8940-40d9-b49e-4c6be9ca6c92-tmp\") pod \"kube-auth-proxy-58497579d8-6w699\" (UID: \"00f08ae1-8940-40d9-b49e-4c6be9ca6c92\") " pod="openshift-ingress/kube-auth-proxy-58497579d8-6w699" Apr 19 12:15:56.127982 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:56.127888 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2bvw\" (UniqueName: \"kubernetes.io/projected/00f08ae1-8940-40d9-b49e-4c6be9ca6c92-kube-api-access-j2bvw\") pod \"kube-auth-proxy-58497579d8-6w699\" (UID: \"00f08ae1-8940-40d9-b49e-4c6be9ca6c92\") " pod="openshift-ingress/kube-auth-proxy-58497579d8-6w699" Apr 19 12:15:56.130107 ip-10-0-140-237 kubenswrapper[2572]: I0419 
12:15:56.130079 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/00f08ae1-8940-40d9-b49e-4c6be9ca6c92-tmp\") pod \"kube-auth-proxy-58497579d8-6w699\" (UID: \"00f08ae1-8940-40d9-b49e-4c6be9ca6c92\") " pod="openshift-ingress/kube-auth-proxy-58497579d8-6w699" Apr 19 12:15:56.130337 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:56.130319 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/00f08ae1-8940-40d9-b49e-4c6be9ca6c92-tls-certs\") pod \"kube-auth-proxy-58497579d8-6w699\" (UID: \"00f08ae1-8940-40d9-b49e-4c6be9ca6c92\") " pod="openshift-ingress/kube-auth-proxy-58497579d8-6w699" Apr 19 12:15:56.138749 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:56.138724 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2bvw\" (UniqueName: \"kubernetes.io/projected/00f08ae1-8940-40d9-b49e-4c6be9ca6c92-kube-api-access-j2bvw\") pod \"kube-auth-proxy-58497579d8-6w699\" (UID: \"00f08ae1-8940-40d9-b49e-4c6be9ca6c92\") " pod="openshift-ingress/kube-auth-proxy-58497579d8-6w699" Apr 19 12:15:56.249397 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:56.249373 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/kube-auth-proxy-58497579d8-6w699" Apr 19 12:15:56.367380 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:56.367348 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-58497579d8-6w699"] Apr 19 12:15:56.370619 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:15:56.370592 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00f08ae1_8940_40d9_b49e_4c6be9ca6c92.slice/crio-9661c82459d827497c0310ae2157e113914ad72e3fd186eae6bf5dad0e63bbf3 WatchSource:0}: Error finding container 9661c82459d827497c0310ae2157e113914ad72e3fd186eae6bf5dad0e63bbf3: Status 404 returned error can't find the container with id 9661c82459d827497c0310ae2157e113914ad72e3fd186eae6bf5dad0e63bbf3 Apr 19 12:15:56.537028 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:56.536946 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-58497579d8-6w699" event={"ID":"00f08ae1-8940-40d9-b49e-4c6be9ca6c92","Type":"ContainerStarted","Data":"9661c82459d827497c0310ae2157e113914ad72e3fd186eae6bf5dad0e63bbf3"} Apr 19 12:15:59.174625 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:59.174584 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-v9pnf"] Apr 19 12:15:59.177426 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:59.177399 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-v9pnf" Apr 19 12:15:59.179754 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:59.179720 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-5svlc\"" Apr 19 12:15:59.179913 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:59.179855 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\"" Apr 19 12:15:59.184053 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:59.184032 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-v9pnf"] Apr 19 12:15:59.360800 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:59.360728 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sv7z\" (UniqueName: \"kubernetes.io/projected/43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047-kube-api-access-7sv7z\") pod \"odh-model-controller-858dbf95b8-v9pnf\" (UID: \"43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047\") " pod="opendatahub/odh-model-controller-858dbf95b8-v9pnf" Apr 19 12:15:59.361030 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:59.360834 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047-cert\") pod \"odh-model-controller-858dbf95b8-v9pnf\" (UID: \"43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047\") " pod="opendatahub/odh-model-controller-858dbf95b8-v9pnf" Apr 19 12:15:59.462163 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:59.462069 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047-cert\") pod \"odh-model-controller-858dbf95b8-v9pnf\" (UID: \"43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047\") " pod="opendatahub/odh-model-controller-858dbf95b8-v9pnf" 
Apr 19 12:15:59.462328 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:59.462171 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7sv7z\" (UniqueName: \"kubernetes.io/projected/43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047-kube-api-access-7sv7z\") pod \"odh-model-controller-858dbf95b8-v9pnf\" (UID: \"43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047\") " pod="opendatahub/odh-model-controller-858dbf95b8-v9pnf" Apr 19 12:15:59.462328 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:15:59.462261 2572 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 19 12:15:59.462438 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:15:59.462336 2572 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047-cert podName:43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047 nodeName:}" failed. No retries permitted until 2026-04-19 12:15:59.962312939 +0000 UTC m=+374.029484810 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047-cert") pod "odh-model-controller-858dbf95b8-v9pnf" (UID: "43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047") : secret "odh-model-controller-webhook-cert" not found Apr 19 12:15:59.477149 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:59.477119 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sv7z\" (UniqueName: \"kubernetes.io/projected/43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047-kube-api-access-7sv7z\") pod \"odh-model-controller-858dbf95b8-v9pnf\" (UID: \"43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047\") " pod="opendatahub/odh-model-controller-858dbf95b8-v9pnf" Apr 19 12:15:59.864623 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:59.864589 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-688fc496d-xs6fd"] Apr 19 12:15:59.866436 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:59.866418 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-688fc496d-xs6fd" Apr 19 12:15:59.869304 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:59.869283 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 19 12:15:59.869433 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:59.869286 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 19 12:15:59.869624 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:59.869608 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 19 12:15:59.870458 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:59.870443 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 19 12:15:59.870544 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:59.870472 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 19 12:15:59.870544 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:59.870529 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-vtqjc\"" Apr 19 12:15:59.880812 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:59.880790 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-688fc496d-xs6fd"] Apr 19 12:15:59.966054 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:59.966025 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047-cert\") pod \"odh-model-controller-858dbf95b8-v9pnf\" (UID: \"43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047\") " 
pod="opendatahub/odh-model-controller-858dbf95b8-v9pnf" Apr 19 12:15:59.966211 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:59.966063 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee6e6443-453b-4c15-804a-8ce3d9fdd1e4-cert\") pod \"lws-controller-manager-688fc496d-xs6fd\" (UID: \"ee6e6443-453b-4c15-804a-8ce3d9fdd1e4\") " pod="openshift-lws-operator/lws-controller-manager-688fc496d-xs6fd" Apr 19 12:15:59.966211 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:59.966087 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/ee6e6443-453b-4c15-804a-8ce3d9fdd1e4-manager-config\") pod \"lws-controller-manager-688fc496d-xs6fd\" (UID: \"ee6e6443-453b-4c15-804a-8ce3d9fdd1e4\") " pod="openshift-lws-operator/lws-controller-manager-688fc496d-xs6fd" Apr 19 12:15:59.966211 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:59.966153 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee6e6443-453b-4c15-804a-8ce3d9fdd1e4-metrics-cert\") pod \"lws-controller-manager-688fc496d-xs6fd\" (UID: \"ee6e6443-453b-4c15-804a-8ce3d9fdd1e4\") " pod="openshift-lws-operator/lws-controller-manager-688fc496d-xs6fd" Apr 19 12:15:59.966211 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:59.966168 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5dgv\" (UniqueName: \"kubernetes.io/projected/ee6e6443-453b-4c15-804a-8ce3d9fdd1e4-kube-api-access-r5dgv\") pod \"lws-controller-manager-688fc496d-xs6fd\" (UID: \"ee6e6443-453b-4c15-804a-8ce3d9fdd1e4\") " pod="openshift-lws-operator/lws-controller-manager-688fc496d-xs6fd" Apr 19 12:15:59.968421 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:15:59.968398 2572 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047-cert\") pod \"odh-model-controller-858dbf95b8-v9pnf\" (UID: \"43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047\") " pod="opendatahub/odh-model-controller-858dbf95b8-v9pnf" Apr 19 12:16:00.067288 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:00.067261 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee6e6443-453b-4c15-804a-8ce3d9fdd1e4-cert\") pod \"lws-controller-manager-688fc496d-xs6fd\" (UID: \"ee6e6443-453b-4c15-804a-8ce3d9fdd1e4\") " pod="openshift-lws-operator/lws-controller-manager-688fc496d-xs6fd" Apr 19 12:16:00.067403 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:00.067296 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/ee6e6443-453b-4c15-804a-8ce3d9fdd1e4-manager-config\") pod \"lws-controller-manager-688fc496d-xs6fd\" (UID: \"ee6e6443-453b-4c15-804a-8ce3d9fdd1e4\") " pod="openshift-lws-operator/lws-controller-manager-688fc496d-xs6fd" Apr 19 12:16:00.067403 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:00.067343 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee6e6443-453b-4c15-804a-8ce3d9fdd1e4-metrics-cert\") pod \"lws-controller-manager-688fc496d-xs6fd\" (UID: \"ee6e6443-453b-4c15-804a-8ce3d9fdd1e4\") " pod="openshift-lws-operator/lws-controller-manager-688fc496d-xs6fd" Apr 19 12:16:00.067403 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:00.067369 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r5dgv\" (UniqueName: \"kubernetes.io/projected/ee6e6443-453b-4c15-804a-8ce3d9fdd1e4-kube-api-access-r5dgv\") pod \"lws-controller-manager-688fc496d-xs6fd\" (UID: \"ee6e6443-453b-4c15-804a-8ce3d9fdd1e4\") " 
pod="openshift-lws-operator/lws-controller-manager-688fc496d-xs6fd" Apr 19 12:16:00.068063 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:00.068043 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/ee6e6443-453b-4c15-804a-8ce3d9fdd1e4-manager-config\") pod \"lws-controller-manager-688fc496d-xs6fd\" (UID: \"ee6e6443-453b-4c15-804a-8ce3d9fdd1e4\") " pod="openshift-lws-operator/lws-controller-manager-688fc496d-xs6fd" Apr 19 12:16:00.069622 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:00.069600 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ee6e6443-453b-4c15-804a-8ce3d9fdd1e4-cert\") pod \"lws-controller-manager-688fc496d-xs6fd\" (UID: \"ee6e6443-453b-4c15-804a-8ce3d9fdd1e4\") " pod="openshift-lws-operator/lws-controller-manager-688fc496d-xs6fd" Apr 19 12:16:00.069718 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:00.069651 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ee6e6443-453b-4c15-804a-8ce3d9fdd1e4-metrics-cert\") pod \"lws-controller-manager-688fc496d-xs6fd\" (UID: \"ee6e6443-453b-4c15-804a-8ce3d9fdd1e4\") " pod="openshift-lws-operator/lws-controller-manager-688fc496d-xs6fd" Apr 19 12:16:00.074732 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:00.074700 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5dgv\" (UniqueName: \"kubernetes.io/projected/ee6e6443-453b-4c15-804a-8ce3d9fdd1e4-kube-api-access-r5dgv\") pod \"lws-controller-manager-688fc496d-xs6fd\" (UID: \"ee6e6443-453b-4c15-804a-8ce3d9fdd1e4\") " pod="openshift-lws-operator/lws-controller-manager-688fc496d-xs6fd" Apr 19 12:16:00.090498 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:00.090480 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-v9pnf" Apr 19 12:16:00.176351 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:00.176319 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-688fc496d-xs6fd" Apr 19 12:16:00.211064 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:00.210998 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-v9pnf"] Apr 19 12:16:00.213887 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:16:00.213845 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43b7f7a5_0f1e_4cb2_96ba_8fe84e6af047.slice/crio-170777ffc7e931f4c67093dab695f6ee785ab50baf83694d928289023530b06d WatchSource:0}: Error finding container 170777ffc7e931f4c67093dab695f6ee785ab50baf83694d928289023530b06d: Status 404 returned error can't find the container with id 170777ffc7e931f4c67093dab695f6ee785ab50baf83694d928289023530b06d Apr 19 12:16:00.292404 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:00.292339 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-688fc496d-xs6fd"] Apr 19 12:16:00.295568 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:16:00.295533 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee6e6443_453b_4c15_804a_8ce3d9fdd1e4.slice/crio-6430bdfe526852db23badd1b614575a4bfd4d5b93258312965d4d42c2be1e8d5 WatchSource:0}: Error finding container 6430bdfe526852db23badd1b614575a4bfd4d5b93258312965d4d42c2be1e8d5: Status 404 returned error can't find the container with id 6430bdfe526852db23badd1b614575a4bfd4d5b93258312965d4d42c2be1e8d5 Apr 19 12:16:00.549750 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:00.549696 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="opendatahub/odh-model-controller-858dbf95b8-v9pnf" event={"ID":"43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047","Type":"ContainerStarted","Data":"170777ffc7e931f4c67093dab695f6ee785ab50baf83694d928289023530b06d"} Apr 19 12:16:00.551059 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:00.551025 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-58497579d8-6w699" event={"ID":"00f08ae1-8940-40d9-b49e-4c6be9ca6c92","Type":"ContainerStarted","Data":"360113a461034d1e86f7f0acdffc370e22d03e12f125d252b3b3130920b0b334"} Apr 19 12:16:00.552148 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:00.552120 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-688fc496d-xs6fd" event={"ID":"ee6e6443-453b-4c15-804a-8ce3d9fdd1e4","Type":"ContainerStarted","Data":"6430bdfe526852db23badd1b614575a4bfd4d5b93258312965d4d42c2be1e8d5"} Apr 19 12:16:00.566824 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:00.566779 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-58497579d8-6w699" podStartSLOduration=2.281148463 podStartE2EDuration="5.566751484s" podCreationTimestamp="2026-04-19 12:15:55 +0000 UTC" firstStartedPulling="2026-04-19 12:15:56.372150216 +0000 UTC m=+370.439322086" lastFinishedPulling="2026-04-19 12:15:59.657753238 +0000 UTC m=+373.724925107" observedRunningTime="2026-04-19 12:16:00.566272763 +0000 UTC m=+374.633444665" watchObservedRunningTime="2026-04-19 12:16:00.566751484 +0000 UTC m=+374.633923374" Apr 19 12:16:04.566631 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:04.566597 2572 generic.go:358] "Generic (PLEG): container finished" podID="43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047" containerID="4b38dd7da682c21aef6ce88b1fd75bb5436a2ae58e840a893c1c4cb74d6ca0ec" exitCode=1 Apr 19 12:16:04.567067 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:04.566689 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="opendatahub/odh-model-controller-858dbf95b8-v9pnf" event={"ID":"43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047","Type":"ContainerDied","Data":"4b38dd7da682c21aef6ce88b1fd75bb5436a2ae58e840a893c1c4cb74d6ca0ec"} Apr 19 12:16:04.567067 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:04.566890 2572 scope.go:117] "RemoveContainer" containerID="4b38dd7da682c21aef6ce88b1fd75bb5436a2ae58e840a893c1c4cb74d6ca0ec" Apr 19 12:16:04.568066 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:04.568047 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-688fc496d-xs6fd" event={"ID":"ee6e6443-453b-4c15-804a-8ce3d9fdd1e4","Type":"ContainerStarted","Data":"d444e61c381f01b9c2f4b188b8eae65e20a7c66359f46264e59d10928d31966e"} Apr 19 12:16:04.568187 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:04.568172 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-688fc496d-xs6fd" Apr 19 12:16:04.608957 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:04.608913 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-688fc496d-xs6fd" podStartSLOduration=1.979701206 podStartE2EDuration="5.608902913s" podCreationTimestamp="2026-04-19 12:15:59 +0000 UTC" firstStartedPulling="2026-04-19 12:16:00.298071559 +0000 UTC m=+374.365243432" lastFinishedPulling="2026-04-19 12:16:03.927273266 +0000 UTC m=+377.994445139" observedRunningTime="2026-04-19 12:16:04.608361258 +0000 UTC m=+378.675533152" watchObservedRunningTime="2026-04-19 12:16:04.608902913 +0000 UTC m=+378.676074806" Apr 19 12:16:05.572911 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:05.572877 2572 generic.go:358] "Generic (PLEG): container finished" podID="43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047" containerID="cc6ee67ade14252d7d06db95c61ec49b4286f0b22f9bfe7e072c411bfcee02ac" exitCode=1 Apr 19 12:16:05.573390 ip-10-0-140-237 kubenswrapper[2572]: I0419 
12:16:05.572963 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-v9pnf" event={"ID":"43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047","Type":"ContainerDied","Data":"cc6ee67ade14252d7d06db95c61ec49b4286f0b22f9bfe7e072c411bfcee02ac"} Apr 19 12:16:05.573390 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:05.572995 2572 scope.go:117] "RemoveContainer" containerID="4b38dd7da682c21aef6ce88b1fd75bb5436a2ae58e840a893c1c4cb74d6ca0ec" Apr 19 12:16:05.573390 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:05.573215 2572 scope.go:117] "RemoveContainer" containerID="cc6ee67ade14252d7d06db95c61ec49b4286f0b22f9bfe7e072c411bfcee02ac" Apr 19 12:16:05.573531 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:16:05.573434 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-v9pnf_opendatahub(43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047)\"" pod="opendatahub/odh-model-controller-858dbf95b8-v9pnf" podUID="43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047" Apr 19 12:16:06.577371 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:06.577333 2572 scope.go:117] "RemoveContainer" containerID="cc6ee67ade14252d7d06db95c61ec49b4286f0b22f9bfe7e072c411bfcee02ac" Apr 19 12:16:06.577725 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:16:06.577495 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-v9pnf_opendatahub(43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047)\"" pod="opendatahub/odh-model-controller-858dbf95b8-v9pnf" podUID="43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047" Apr 19 12:16:10.091339 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:10.091301 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="opendatahub/odh-model-controller-858dbf95b8-v9pnf" Apr 19 12:16:10.091712 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:10.091700 2572 scope.go:117] "RemoveContainer" containerID="cc6ee67ade14252d7d06db95c61ec49b4286f0b22f9bfe7e072c411bfcee02ac" Apr 19 12:16:10.091914 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:16:10.091895 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-v9pnf_opendatahub(43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047)\"" pod="opendatahub/odh-model-controller-858dbf95b8-v9pnf" podUID="43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047" Apr 19 12:16:15.575554 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:15.575509 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-688fc496d-xs6fd" Apr 19 12:16:20.090975 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:20.090931 2572 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-v9pnf" Apr 19 12:16:20.091368 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:20.091303 2572 scope.go:117] "RemoveContainer" containerID="cc6ee67ade14252d7d06db95c61ec49b4286f0b22f9bfe7e072c411bfcee02ac" Apr 19 12:16:20.615476 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:20.615442 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-v9pnf" event={"ID":"43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047","Type":"ContainerStarted","Data":"85a5ab957f38dd813713d40b8c930a38eb06b7b7e92ebc53ce1eb3b2b0fe6911"} Apr 19 12:16:20.615650 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:20.615560 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-v9pnf" Apr 19 12:16:20.630554 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:20.630510 2572 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-v9pnf" podStartSLOduration=1.356594922 podStartE2EDuration="21.63049713s" podCreationTimestamp="2026-04-19 12:15:59 +0000 UTC" firstStartedPulling="2026-04-19 12:16:00.215408776 +0000 UTC m=+374.282580663" lastFinishedPulling="2026-04-19 12:16:20.489310999 +0000 UTC m=+394.556482871" observedRunningTime="2026-04-19 12:16:20.63023681 +0000 UTC m=+394.697408722" watchObservedRunningTime="2026-04-19 12:16:20.63049713 +0000 UTC m=+394.697669022" Apr 19 12:16:31.619675 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:16:31.619644 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-v9pnf" Apr 19 12:18:39.746111 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:39.746014 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-4fslg"] Apr 19 12:18:39.749160 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:39.749135 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-4fslg" Apr 19 12:18:39.751685 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:39.751663 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 19 12:18:39.752814 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:39.752791 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 19 12:18:39.752940 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:39.752791 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-tmbzq\"" Apr 19 12:18:39.755373 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:39.755346 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-4fslg"] Apr 19 12:18:39.893123 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:39.893096 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvvct\" (UniqueName: \"kubernetes.io/projected/c151ab1d-91c0-4c1b-885d-f3945403087b-kube-api-access-qvvct\") pod \"authorino-f99f4b5cd-4fslg\" (UID: \"c151ab1d-91c0-4c1b-885d-f3945403087b\") " pod="kuadrant-system/authorino-f99f4b5cd-4fslg" Apr 19 12:18:39.948079 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:39.948052 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7498df8756-ggd7d"] Apr 19 12:18:39.951007 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:39.950992 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-7498df8756-ggd7d" Apr 19 12:18:39.956180 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:39.956156 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-ggd7d"] Apr 19 12:18:39.993965 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:39.993940 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qvvct\" (UniqueName: \"kubernetes.io/projected/c151ab1d-91c0-4c1b-885d-f3945403087b-kube-api-access-qvvct\") pod \"authorino-f99f4b5cd-4fslg\" (UID: \"c151ab1d-91c0-4c1b-885d-f3945403087b\") " pod="kuadrant-system/authorino-f99f4b5cd-4fslg" Apr 19 12:18:40.005014 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:40.004991 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvvct\" (UniqueName: \"kubernetes.io/projected/c151ab1d-91c0-4c1b-885d-f3945403087b-kube-api-access-qvvct\") pod \"authorino-f99f4b5cd-4fslg\" (UID: \"c151ab1d-91c0-4c1b-885d-f3945403087b\") " pod="kuadrant-system/authorino-f99f4b5cd-4fslg" Apr 19 12:18:40.059806 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:40.059786 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-4fslg" Apr 19 12:18:40.095381 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:40.095355 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcrd8\" (UniqueName: \"kubernetes.io/projected/144933c1-eeef-4864-89e8-c7fa92bcbb75-kube-api-access-zcrd8\") pod \"authorino-7498df8756-ggd7d\" (UID: \"144933c1-eeef-4864-89e8-c7fa92bcbb75\") " pod="kuadrant-system/authorino-7498df8756-ggd7d" Apr 19 12:18:40.169716 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:40.169688 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-4fslg"] Apr 19 12:18:40.172925 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:18:40.172886 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc151ab1d_91c0_4c1b_885d_f3945403087b.slice/crio-a4595a0169660b5dc08bb516e6857a9eeff633d712de0423c2c5a3147f771109 WatchSource:0}: Error finding container a4595a0169660b5dc08bb516e6857a9eeff633d712de0423c2c5a3147f771109: Status 404 returned error can't find the container with id a4595a0169660b5dc08bb516e6857a9eeff633d712de0423c2c5a3147f771109 Apr 19 12:18:40.196774 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:40.196735 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zcrd8\" (UniqueName: \"kubernetes.io/projected/144933c1-eeef-4864-89e8-c7fa92bcbb75-kube-api-access-zcrd8\") pod \"authorino-7498df8756-ggd7d\" (UID: \"144933c1-eeef-4864-89e8-c7fa92bcbb75\") " pod="kuadrant-system/authorino-7498df8756-ggd7d" Apr 19 12:18:40.203636 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:40.203614 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcrd8\" (UniqueName: \"kubernetes.io/projected/144933c1-eeef-4864-89e8-c7fa92bcbb75-kube-api-access-zcrd8\") pod 
\"authorino-7498df8756-ggd7d\" (UID: \"144933c1-eeef-4864-89e8-c7fa92bcbb75\") " pod="kuadrant-system/authorino-7498df8756-ggd7d" Apr 19 12:18:40.260162 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:40.260113 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-ggd7d" Apr 19 12:18:40.366624 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:40.366589 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7498df8756-ggd7d"] Apr 19 12:18:40.369352 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:18:40.369324 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod144933c1_eeef_4864_89e8_c7fa92bcbb75.slice/crio-8684287868c8074e80a87a84c1d7f4bb99cb0741dc0e64d69b5c803953dad267 WatchSource:0}: Error finding container 8684287868c8074e80a87a84c1d7f4bb99cb0741dc0e64d69b5c803953dad267: Status 404 returned error can't find the container with id 8684287868c8074e80a87a84c1d7f4bb99cb0741dc0e64d69b5c803953dad267 Apr 19 12:18:41.010774 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:41.010720 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-ggd7d" event={"ID":"144933c1-eeef-4864-89e8-c7fa92bcbb75","Type":"ContainerStarted","Data":"8684287868c8074e80a87a84c1d7f4bb99cb0741dc0e64d69b5c803953dad267"} Apr 19 12:18:41.011931 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:41.011891 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-4fslg" event={"ID":"c151ab1d-91c0-4c1b-885d-f3945403087b","Type":"ContainerStarted","Data":"a4595a0169660b5dc08bb516e6857a9eeff633d712de0423c2c5a3147f771109"} Apr 19 12:18:43.019717 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:43.019684 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-4fslg" 
event={"ID":"c151ab1d-91c0-4c1b-885d-f3945403087b","Type":"ContainerStarted","Data":"ecb38925b7d9ec1f8d707b12e0a7654aa4f5a4247356bba0ec29679c378dee17"} Apr 19 12:18:43.021035 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:43.021006 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-ggd7d" event={"ID":"144933c1-eeef-4864-89e8-c7fa92bcbb75","Type":"ContainerStarted","Data":"1ff7bd9d6a5fb94a089cdc9dcf8091589a21dedb97384df07626c26b9278f31b"} Apr 19 12:18:43.045647 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:43.045593 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-4fslg" podStartSLOduration=1.611514075 podStartE2EDuration="4.045576396s" podCreationTimestamp="2026-04-19 12:18:39 +0000 UTC" firstStartedPulling="2026-04-19 12:18:40.174559633 +0000 UTC m=+534.241731507" lastFinishedPulling="2026-04-19 12:18:42.608621959 +0000 UTC m=+536.675793828" observedRunningTime="2026-04-19 12:18:43.032342516 +0000 UTC m=+537.099514406" watchObservedRunningTime="2026-04-19 12:18:43.045576396 +0000 UTC m=+537.112748288" Apr 19 12:18:43.045799 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:43.045674 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-7498df8756-ggd7d" podStartSLOduration=1.7978639360000002 podStartE2EDuration="4.045668572s" podCreationTimestamp="2026-04-19 12:18:39 +0000 UTC" firstStartedPulling="2026-04-19 12:18:40.370652351 +0000 UTC m=+534.437824220" lastFinishedPulling="2026-04-19 12:18:42.618456972 +0000 UTC m=+536.685628856" observedRunningTime="2026-04-19 12:18:43.044667974 +0000 UTC m=+537.111839866" watchObservedRunningTime="2026-04-19 12:18:43.045668572 +0000 UTC m=+537.112840463" Apr 19 12:18:43.067836 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:43.067814 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-4fslg"] Apr 19 12:18:45.027301 
ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:45.027259 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-4fslg" podUID="c151ab1d-91c0-4c1b-885d-f3945403087b" containerName="authorino" containerID="cri-o://ecb38925b7d9ec1f8d707b12e0a7654aa4f5a4247356bba0ec29679c378dee17" gracePeriod=30 Apr 19 12:18:45.265190 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:45.265162 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-4fslg" Apr 19 12:18:45.336322 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:45.336260 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qvvct\" (UniqueName: \"kubernetes.io/projected/c151ab1d-91c0-4c1b-885d-f3945403087b-kube-api-access-qvvct\") pod \"c151ab1d-91c0-4c1b-885d-f3945403087b\" (UID: \"c151ab1d-91c0-4c1b-885d-f3945403087b\") " Apr 19 12:18:45.338357 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:45.338326 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c151ab1d-91c0-4c1b-885d-f3945403087b-kube-api-access-qvvct" (OuterVolumeSpecName: "kube-api-access-qvvct") pod "c151ab1d-91c0-4c1b-885d-f3945403087b" (UID: "c151ab1d-91c0-4c1b-885d-f3945403087b"). InnerVolumeSpecName "kube-api-access-qvvct". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:18:45.437429 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:45.437393 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qvvct\" (UniqueName: \"kubernetes.io/projected/c151ab1d-91c0-4c1b-885d-f3945403087b-kube-api-access-qvvct\") on node \"ip-10-0-140-237.ec2.internal\" DevicePath \"\"" Apr 19 12:18:46.031133 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:46.031094 2572 generic.go:358] "Generic (PLEG): container finished" podID="c151ab1d-91c0-4c1b-885d-f3945403087b" containerID="ecb38925b7d9ec1f8d707b12e0a7654aa4f5a4247356bba0ec29679c378dee17" exitCode=0 Apr 19 12:18:46.031640 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:46.031163 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-4fslg" Apr 19 12:18:46.031640 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:46.031182 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-4fslg" event={"ID":"c151ab1d-91c0-4c1b-885d-f3945403087b","Type":"ContainerDied","Data":"ecb38925b7d9ec1f8d707b12e0a7654aa4f5a4247356bba0ec29679c378dee17"} Apr 19 12:18:46.031640 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:46.031221 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-4fslg" event={"ID":"c151ab1d-91c0-4c1b-885d-f3945403087b","Type":"ContainerDied","Data":"a4595a0169660b5dc08bb516e6857a9eeff633d712de0423c2c5a3147f771109"} Apr 19 12:18:46.031640 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:46.031239 2572 scope.go:117] "RemoveContainer" containerID="ecb38925b7d9ec1f8d707b12e0a7654aa4f5a4247356bba0ec29679c378dee17" Apr 19 12:18:46.040692 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:46.040674 2572 scope.go:117] "RemoveContainer" containerID="ecb38925b7d9ec1f8d707b12e0a7654aa4f5a4247356bba0ec29679c378dee17" Apr 19 12:18:46.041033 ip-10-0-140-237 kubenswrapper[2572]: E0419 
12:18:46.041006 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecb38925b7d9ec1f8d707b12e0a7654aa4f5a4247356bba0ec29679c378dee17\": container with ID starting with ecb38925b7d9ec1f8d707b12e0a7654aa4f5a4247356bba0ec29679c378dee17 not found: ID does not exist" containerID="ecb38925b7d9ec1f8d707b12e0a7654aa4f5a4247356bba0ec29679c378dee17" Apr 19 12:18:46.041131 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:46.041042 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecb38925b7d9ec1f8d707b12e0a7654aa4f5a4247356bba0ec29679c378dee17"} err="failed to get container status \"ecb38925b7d9ec1f8d707b12e0a7654aa4f5a4247356bba0ec29679c378dee17\": rpc error: code = NotFound desc = could not find container \"ecb38925b7d9ec1f8d707b12e0a7654aa4f5a4247356bba0ec29679c378dee17\": container with ID starting with ecb38925b7d9ec1f8d707b12e0a7654aa4f5a4247356bba0ec29679c378dee17 not found: ID does not exist" Apr 19 12:18:46.051460 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:46.051430 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-4fslg"] Apr 19 12:18:46.055148 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:46.055125 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-4fslg"] Apr 19 12:18:46.497876 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:18:46.497848 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c151ab1d-91c0-4c1b-885d-f3945403087b" path="/var/lib/kubelet/pods/c151ab1d-91c0-4c1b-885d-f3945403087b/volumes" Apr 19 12:19:25.666855 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:25.666817 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-75t6v"] Apr 19 12:19:25.667266 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:25.667086 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="c151ab1d-91c0-4c1b-885d-f3945403087b" containerName="authorino"
Apr 19 12:19:25.667266 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:25.667100 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="c151ab1d-91c0-4c1b-885d-f3945403087b" containerName="authorino"
Apr 19 12:19:25.667266 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:25.667157 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="c151ab1d-91c0-4c1b-885d-f3945403087b" containerName="authorino"
Apr 19 12:19:25.668854 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:25.668838 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-75t6v"
Apr 19 12:19:25.676664 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:25.676640 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-75t6v"]
Apr 19 12:19:25.815220 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:25.815177 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fck49\" (UniqueName: \"kubernetes.io/projected/08b4bcff-a289-4f02-ab20-7bb3fb584cf4-kube-api-access-fck49\") pod \"authorino-8b475cf9f-75t6v\" (UID: \"08b4bcff-a289-4f02-ab20-7bb3fb584cf4\") " pod="kuadrant-system/authorino-8b475cf9f-75t6v"
Apr 19 12:19:25.887986 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:25.887952 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-75t6v"]
Apr 19 12:19:25.888138 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:19:25.888117 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-fck49], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-8b475cf9f-75t6v" podUID="08b4bcff-a289-4f02-ab20-7bb3fb584cf4"
Apr 19 12:19:25.916268 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:25.916241 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fck49\" (UniqueName: \"kubernetes.io/projected/08b4bcff-a289-4f02-ab20-7bb3fb584cf4-kube-api-access-fck49\") pod \"authorino-8b475cf9f-75t6v\" (UID: \"08b4bcff-a289-4f02-ab20-7bb3fb584cf4\") " pod="kuadrant-system/authorino-8b475cf9f-75t6v"
Apr 19 12:19:25.931124 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:25.931064 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-7d79b7c9c5-rmkkr"]
Apr 19 12:19:25.933028 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:25.933010 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7d79b7c9c5-rmkkr"
Apr 19 12:19:25.934739 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:25.934716 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fck49\" (UniqueName: \"kubernetes.io/projected/08b4bcff-a289-4f02-ab20-7bb3fb584cf4-kube-api-access-fck49\") pod \"authorino-8b475cf9f-75t6v\" (UID: \"08b4bcff-a289-4f02-ab20-7bb3fb584cf4\") " pod="kuadrant-system/authorino-8b475cf9f-75t6v"
Apr 19 12:19:25.953400 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:25.953378 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-7d79b7c9c5-rmkkr"]
Apr 19 12:19:26.110864 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.110838 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7d79b7c9c5-rmkkr"]
Apr 19 12:19:26.111027 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:19:26.111005 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-zbqvk openshift-service-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-7d79b7c9c5-rmkkr" podUID="55afd936-7d07-4142-9244-8b384a0f0226"
Apr 19 12:19:26.117786 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.117767 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/55afd936-7d07-4142-9244-8b384a0f0226-openshift-service-ca\") pod \"authorino-7d79b7c9c5-rmkkr\" (UID: \"55afd936-7d07-4142-9244-8b384a0f0226\") " pod="kuadrant-system/authorino-7d79b7c9c5-rmkkr"
Apr 19 12:19:26.117850 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.117797 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbqvk\" (UniqueName: \"kubernetes.io/projected/55afd936-7d07-4142-9244-8b384a0f0226-kube-api-access-zbqvk\") pod \"authorino-7d79b7c9c5-rmkkr\" (UID: \"55afd936-7d07-4142-9244-8b384a0f0226\") " pod="kuadrant-system/authorino-7d79b7c9c5-rmkkr"
Apr 19 12:19:26.139191 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.139161 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7d79b7c9c5-rmkkr"
Apr 19 12:19:26.139292 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.139163 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-75t6v"
Apr 19 12:19:26.142735 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.142715 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-65675b9fc9-twgfg"]
Apr 19 12:19:26.143679 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.143661 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7d79b7c9c5-rmkkr"
Apr 19 12:19:26.144895 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.144877 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-65675b9fc9-twgfg"
Apr 19 12:19:26.146954 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.146926 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-75t6v"
Apr 19 12:19:26.147116 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.147098 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 19 12:19:26.152241 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.152195 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-65675b9fc9-twgfg"]
Apr 19 12:19:26.218253 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.218203 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/55afd936-7d07-4142-9244-8b384a0f0226-openshift-service-ca\") pod \"authorino-7d79b7c9c5-rmkkr\" (UID: \"55afd936-7d07-4142-9244-8b384a0f0226\") " pod="kuadrant-system/authorino-7d79b7c9c5-rmkkr"
Apr 19 12:19:26.218253 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.218233 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zbqvk\" (UniqueName: \"kubernetes.io/projected/55afd936-7d07-4142-9244-8b384a0f0226-kube-api-access-zbqvk\") pod \"authorino-7d79b7c9c5-rmkkr\" (UID: \"55afd936-7d07-4142-9244-8b384a0f0226\") " pod="kuadrant-system/authorino-7d79b7c9c5-rmkkr"
Apr 19 12:19:26.218791 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.218752 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/55afd936-7d07-4142-9244-8b384a0f0226-openshift-service-ca\") pod \"authorino-7d79b7c9c5-rmkkr\" (UID: \"55afd936-7d07-4142-9244-8b384a0f0226\") " pod="kuadrant-system/authorino-7d79b7c9c5-rmkkr"
Apr 19 12:19:26.225527 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.225503 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbqvk\" (UniqueName: \"kubernetes.io/projected/55afd936-7d07-4142-9244-8b384a0f0226-kube-api-access-zbqvk\") pod \"authorino-7d79b7c9c5-rmkkr\" (UID: \"55afd936-7d07-4142-9244-8b384a0f0226\") " pod="kuadrant-system/authorino-7d79b7c9c5-rmkkr"
Apr 19 12:19:26.319404 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.319375 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/55afd936-7d07-4142-9244-8b384a0f0226-openshift-service-ca\") pod \"55afd936-7d07-4142-9244-8b384a0f0226\" (UID: \"55afd936-7d07-4142-9244-8b384a0f0226\") "
Apr 19 12:19:26.319492 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.319421 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbqvk\" (UniqueName: \"kubernetes.io/projected/55afd936-7d07-4142-9244-8b384a0f0226-kube-api-access-zbqvk\") pod \"55afd936-7d07-4142-9244-8b384a0f0226\" (UID: \"55afd936-7d07-4142-9244-8b384a0f0226\") "
Apr 19 12:19:26.319492 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.319446 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fck49\" (UniqueName: \"kubernetes.io/projected/08b4bcff-a289-4f02-ab20-7bb3fb584cf4-kube-api-access-fck49\") pod \"08b4bcff-a289-4f02-ab20-7bb3fb584cf4\" (UID: \"08b4bcff-a289-4f02-ab20-7bb3fb584cf4\") "
Apr 19 12:19:26.319579 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.319554 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/12995c5b-d230-4480-a9e6-0f9121efb8c8-tls-cert\") pod \"authorino-65675b9fc9-twgfg\" (UID: \"12995c5b-d230-4480-a9e6-0f9121efb8c8\") " pod="kuadrant-system/authorino-65675b9fc9-twgfg"
Apr 19 12:19:26.319579 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.319574 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/12995c5b-d230-4480-a9e6-0f9121efb8c8-openshift-service-ca\") pod \"authorino-65675b9fc9-twgfg\" (UID: \"12995c5b-d230-4480-a9e6-0f9121efb8c8\") " pod="kuadrant-system/authorino-65675b9fc9-twgfg"
Apr 19 12:19:26.319665 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.319601 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qx9hv\" (UniqueName: \"kubernetes.io/projected/12995c5b-d230-4480-a9e6-0f9121efb8c8-kube-api-access-qx9hv\") pod \"authorino-65675b9fc9-twgfg\" (UID: \"12995c5b-d230-4480-a9e6-0f9121efb8c8\") " pod="kuadrant-system/authorino-65675b9fc9-twgfg"
Apr 19 12:19:26.319768 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.319735 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55afd936-7d07-4142-9244-8b384a0f0226-openshift-service-ca" (OuterVolumeSpecName: "openshift-service-ca") pod "55afd936-7d07-4142-9244-8b384a0f0226" (UID: "55afd936-7d07-4142-9244-8b384a0f0226"). InnerVolumeSpecName "openshift-service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 19 12:19:26.321312 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.321292 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55afd936-7d07-4142-9244-8b384a0f0226-kube-api-access-zbqvk" (OuterVolumeSpecName: "kube-api-access-zbqvk") pod "55afd936-7d07-4142-9244-8b384a0f0226" (UID: "55afd936-7d07-4142-9244-8b384a0f0226"). InnerVolumeSpecName "kube-api-access-zbqvk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:19:26.321517 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.321491 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/08b4bcff-a289-4f02-ab20-7bb3fb584cf4-kube-api-access-fck49" (OuterVolumeSpecName: "kube-api-access-fck49") pod "08b4bcff-a289-4f02-ab20-7bb3fb584cf4" (UID: "08b4bcff-a289-4f02-ab20-7bb3fb584cf4"). InnerVolumeSpecName "kube-api-access-fck49". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:19:26.420673 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.420649 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/12995c5b-d230-4480-a9e6-0f9121efb8c8-tls-cert\") pod \"authorino-65675b9fc9-twgfg\" (UID: \"12995c5b-d230-4480-a9e6-0f9121efb8c8\") " pod="kuadrant-system/authorino-65675b9fc9-twgfg"
Apr 19 12:19:26.420791 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.420678 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/12995c5b-d230-4480-a9e6-0f9121efb8c8-openshift-service-ca\") pod \"authorino-65675b9fc9-twgfg\" (UID: \"12995c5b-d230-4480-a9e6-0f9121efb8c8\") " pod="kuadrant-system/authorino-65675b9fc9-twgfg"
Apr 19 12:19:26.420791 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.420703 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qx9hv\" (UniqueName: \"kubernetes.io/projected/12995c5b-d230-4480-a9e6-0f9121efb8c8-kube-api-access-qx9hv\") pod \"authorino-65675b9fc9-twgfg\" (UID: \"12995c5b-d230-4480-a9e6-0f9121efb8c8\") " pod="kuadrant-system/authorino-65675b9fc9-twgfg"
Apr 19 12:19:26.420791 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.420733 2572 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/55afd936-7d07-4142-9244-8b384a0f0226-openshift-service-ca\") on node \"ip-10-0-140-237.ec2.internal\" DevicePath \"\""
Apr 19 12:19:26.420791 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.420743 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zbqvk\" (UniqueName: \"kubernetes.io/projected/55afd936-7d07-4142-9244-8b384a0f0226-kube-api-access-zbqvk\") on node \"ip-10-0-140-237.ec2.internal\" DevicePath \"\""
Apr 19 12:19:26.420791 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.420753 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fck49\" (UniqueName: \"kubernetes.io/projected/08b4bcff-a289-4f02-ab20-7bb3fb584cf4-kube-api-access-fck49\") on node \"ip-10-0-140-237.ec2.internal\" DevicePath \"\""
Apr 19 12:19:26.421327 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.421307 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/12995c5b-d230-4480-a9e6-0f9121efb8c8-openshift-service-ca\") pod \"authorino-65675b9fc9-twgfg\" (UID: \"12995c5b-d230-4480-a9e6-0f9121efb8c8\") " pod="kuadrant-system/authorino-65675b9fc9-twgfg"
Apr 19 12:19:26.422891 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.422860 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/12995c5b-d230-4480-a9e6-0f9121efb8c8-tls-cert\") pod \"authorino-65675b9fc9-twgfg\" (UID: \"12995c5b-d230-4480-a9e6-0f9121efb8c8\") " pod="kuadrant-system/authorino-65675b9fc9-twgfg"
Apr 19 12:19:26.428401 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.428378 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qx9hv\" (UniqueName: \"kubernetes.io/projected/12995c5b-d230-4480-a9e6-0f9121efb8c8-kube-api-access-qx9hv\") pod \"authorino-65675b9fc9-twgfg\" (UID: \"12995c5b-d230-4480-a9e6-0f9121efb8c8\") " pod="kuadrant-system/authorino-65675b9fc9-twgfg"
Apr 19 12:19:26.454685 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.454666 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-65675b9fc9-twgfg"
Apr 19 12:19:26.571562 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:26.571528 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-65675b9fc9-twgfg"]
Apr 19 12:19:26.573961 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:19:26.573933 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod12995c5b_d230_4480_a9e6_0f9121efb8c8.slice/crio-46ff2f73d954cb34088224c52e8d2d3ace6e2ac67322cd7d24eb2a2318bf9659 WatchSource:0}: Error finding container 46ff2f73d954cb34088224c52e8d2d3ace6e2ac67322cd7d24eb2a2318bf9659: Status 404 returned error can't find the container with id 46ff2f73d954cb34088224c52e8d2d3ace6e2ac67322cd7d24eb2a2318bf9659
Apr 19 12:19:27.143455 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:27.143371 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-75t6v"
Apr 19 12:19:27.143455 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:27.143393 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-65675b9fc9-twgfg" event={"ID":"12995c5b-d230-4480-a9e6-0f9121efb8c8","Type":"ContainerStarted","Data":"f6718d9f67ed97583f745fd60c0c12aa5c002b97c07645b06616638e4869cf98"}
Apr 19 12:19:27.143455 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:27.143435 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-65675b9fc9-twgfg" event={"ID":"12995c5b-d230-4480-a9e6-0f9121efb8c8","Type":"ContainerStarted","Data":"46ff2f73d954cb34088224c52e8d2d3ace6e2ac67322cd7d24eb2a2318bf9659"}
Apr 19 12:19:27.143993 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:27.143616 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7d79b7c9c5-rmkkr"
Apr 19 12:19:27.157584 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:27.157536 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-65675b9fc9-twgfg" podStartSLOduration=0.915616221 podStartE2EDuration="1.157524352s" podCreationTimestamp="2026-04-19 12:19:26 +0000 UTC" firstStartedPulling="2026-04-19 12:19:26.575074276 +0000 UTC m=+580.642246144" lastFinishedPulling="2026-04-19 12:19:26.816982403 +0000 UTC m=+580.884154275" observedRunningTime="2026-04-19 12:19:27.156720472 +0000 UTC m=+581.223892367" watchObservedRunningTime="2026-04-19 12:19:27.157524352 +0000 UTC m=+581.224696244"
Apr 19 12:19:27.182206 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:27.182180 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-ggd7d"]
Apr 19 12:19:27.182371 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:27.182338 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-7498df8756-ggd7d" podUID="144933c1-eeef-4864-89e8-c7fa92bcbb75" containerName="authorino" containerID="cri-o://1ff7bd9d6a5fb94a089cdc9dcf8091589a21dedb97384df07626c26b9278f31b" gracePeriod=30
Apr 19 12:19:27.191558 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:27.191515 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-75t6v"]
Apr 19 12:19:27.195495 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:27.195422 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-75t6v"]
Apr 19 12:19:27.213435 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:27.213410 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7d79b7c9c5-rmkkr"]
Apr 19 12:19:27.218015 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:27.217992 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7d79b7c9c5-rmkkr"]
Apr 19 12:19:27.401940 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:27.401887 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-ggd7d"
Apr 19 12:19:27.529057 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:27.529030 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcrd8\" (UniqueName: \"kubernetes.io/projected/144933c1-eeef-4864-89e8-c7fa92bcbb75-kube-api-access-zcrd8\") pod \"144933c1-eeef-4864-89e8-c7fa92bcbb75\" (UID: \"144933c1-eeef-4864-89e8-c7fa92bcbb75\") "
Apr 19 12:19:27.531145 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:27.531118 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/144933c1-eeef-4864-89e8-c7fa92bcbb75-kube-api-access-zcrd8" (OuterVolumeSpecName: "kube-api-access-zcrd8") pod "144933c1-eeef-4864-89e8-c7fa92bcbb75" (UID: "144933c1-eeef-4864-89e8-c7fa92bcbb75"). InnerVolumeSpecName "kube-api-access-zcrd8". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:19:27.629577 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:27.629542 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zcrd8\" (UniqueName: \"kubernetes.io/projected/144933c1-eeef-4864-89e8-c7fa92bcbb75-kube-api-access-zcrd8\") on node \"ip-10-0-140-237.ec2.internal\" DevicePath \"\""
Apr 19 12:19:28.147358 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:28.147328 2572 generic.go:358] "Generic (PLEG): container finished" podID="144933c1-eeef-4864-89e8-c7fa92bcbb75" containerID="1ff7bd9d6a5fb94a089cdc9dcf8091589a21dedb97384df07626c26b9278f31b" exitCode=0
Apr 19 12:19:28.147822 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:28.147372 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-7498df8756-ggd7d"
Apr 19 12:19:28.147822 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:28.147398 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-ggd7d" event={"ID":"144933c1-eeef-4864-89e8-c7fa92bcbb75","Type":"ContainerDied","Data":"1ff7bd9d6a5fb94a089cdc9dcf8091589a21dedb97384df07626c26b9278f31b"}
Apr 19 12:19:28.147822 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:28.147439 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-7498df8756-ggd7d" event={"ID":"144933c1-eeef-4864-89e8-c7fa92bcbb75","Type":"ContainerDied","Data":"8684287868c8074e80a87a84c1d7f4bb99cb0741dc0e64d69b5c803953dad267"}
Apr 19 12:19:28.147822 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:28.147456 2572 scope.go:117] "RemoveContainer" containerID="1ff7bd9d6a5fb94a089cdc9dcf8091589a21dedb97384df07626c26b9278f31b"
Apr 19 12:19:28.160550 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:28.160529 2572 scope.go:117] "RemoveContainer" containerID="1ff7bd9d6a5fb94a089cdc9dcf8091589a21dedb97384df07626c26b9278f31b"
Apr 19 12:19:28.160834 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:19:28.160805 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ff7bd9d6a5fb94a089cdc9dcf8091589a21dedb97384df07626c26b9278f31b\": container with ID starting with 1ff7bd9d6a5fb94a089cdc9dcf8091589a21dedb97384df07626c26b9278f31b not found: ID does not exist" containerID="1ff7bd9d6a5fb94a089cdc9dcf8091589a21dedb97384df07626c26b9278f31b"
Apr 19 12:19:28.160933 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:28.160839 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ff7bd9d6a5fb94a089cdc9dcf8091589a21dedb97384df07626c26b9278f31b"} err="failed to get container status \"1ff7bd9d6a5fb94a089cdc9dcf8091589a21dedb97384df07626c26b9278f31b\": rpc error: code = NotFound desc = could not find container \"1ff7bd9d6a5fb94a089cdc9dcf8091589a21dedb97384df07626c26b9278f31b\": container with ID starting with 1ff7bd9d6a5fb94a089cdc9dcf8091589a21dedb97384df07626c26b9278f31b not found: ID does not exist"
Apr 19 12:19:28.170023 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:28.170004 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-7498df8756-ggd7d"]
Apr 19 12:19:28.173799 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:28.173780 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-7498df8756-ggd7d"]
Apr 19 12:19:28.497553 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:28.497489 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="08b4bcff-a289-4f02-ab20-7bb3fb584cf4" path="/var/lib/kubelet/pods/08b4bcff-a289-4f02-ab20-7bb3fb584cf4/volumes"
Apr 19 12:19:28.497698 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:28.497687 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="144933c1-eeef-4864-89e8-c7fa92bcbb75" path="/var/lib/kubelet/pods/144933c1-eeef-4864-89e8-c7fa92bcbb75/volumes"
Apr 19 12:19:28.498019 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:19:28.498007 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55afd936-7d07-4142-9244-8b384a0f0226" path="/var/lib/kubelet/pods/55afd936-7d07-4142-9244-8b384a0f0226/volumes"
Apr 19 12:21:42.969160 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:42.969086 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-759f9fb8f4-89n2w"]
Apr 19 12:21:42.969549 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:42.969373 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="144933c1-eeef-4864-89e8-c7fa92bcbb75" containerName="authorino"
Apr 19 12:21:42.969549 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:42.969387 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="144933c1-eeef-4864-89e8-c7fa92bcbb75" containerName="authorino"
Apr 19 12:21:42.969549 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:42.969445 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="144933c1-eeef-4864-89e8-c7fa92bcbb75" containerName="authorino"
Apr 19 12:21:42.972153 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:42.972138 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-759f9fb8f4-89n2w"
Apr 19 12:21:42.974538 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:42.974517 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/62fe26ea-15ad-4eea-8880-6a2db0a245e8-tls-cert\") pod \"authorino-759f9fb8f4-89n2w\" (UID: \"62fe26ea-15ad-4eea-8880-6a2db0a245e8\") " pod="kuadrant-system/authorino-759f9fb8f4-89n2w"
Apr 19 12:21:42.974643 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:42.974557 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnkgm\" (UniqueName: \"kubernetes.io/projected/62fe26ea-15ad-4eea-8880-6a2db0a245e8-kube-api-access-dnkgm\") pod \"authorino-759f9fb8f4-89n2w\" (UID: \"62fe26ea-15ad-4eea-8880-6a2db0a245e8\") " pod="kuadrant-system/authorino-759f9fb8f4-89n2w"
Apr 19 12:21:42.974643 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:42.974594 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/62fe26ea-15ad-4eea-8880-6a2db0a245e8-openshift-service-ca\") pod \"authorino-759f9fb8f4-89n2w\" (UID: \"62fe26ea-15ad-4eea-8880-6a2db0a245e8\") " pod="kuadrant-system/authorino-759f9fb8f4-89n2w"
Apr 19 12:21:42.980477 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:42.980449 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-759f9fb8f4-89n2w"]
Apr 19 12:21:43.075445 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:43.075418 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnkgm\" (UniqueName: \"kubernetes.io/projected/62fe26ea-15ad-4eea-8880-6a2db0a245e8-kube-api-access-dnkgm\") pod \"authorino-759f9fb8f4-89n2w\" (UID: \"62fe26ea-15ad-4eea-8880-6a2db0a245e8\") " pod="kuadrant-system/authorino-759f9fb8f4-89n2w"
Apr 19 12:21:43.075561 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:43.075469 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/62fe26ea-15ad-4eea-8880-6a2db0a245e8-openshift-service-ca\") pod \"authorino-759f9fb8f4-89n2w\" (UID: \"62fe26ea-15ad-4eea-8880-6a2db0a245e8\") " pod="kuadrant-system/authorino-759f9fb8f4-89n2w"
Apr 19 12:21:43.075628 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:43.075573 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/62fe26ea-15ad-4eea-8880-6a2db0a245e8-tls-cert\") pod \"authorino-759f9fb8f4-89n2w\" (UID: \"62fe26ea-15ad-4eea-8880-6a2db0a245e8\") " pod="kuadrant-system/authorino-759f9fb8f4-89n2w"
Apr 19 12:21:43.076246 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:43.076220 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/62fe26ea-15ad-4eea-8880-6a2db0a245e8-openshift-service-ca\") pod \"authorino-759f9fb8f4-89n2w\" (UID: \"62fe26ea-15ad-4eea-8880-6a2db0a245e8\") " pod="kuadrant-system/authorino-759f9fb8f4-89n2w"
Apr 19 12:21:43.077991 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:43.077963 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/62fe26ea-15ad-4eea-8880-6a2db0a245e8-tls-cert\") pod \"authorino-759f9fb8f4-89n2w\" (UID: \"62fe26ea-15ad-4eea-8880-6a2db0a245e8\") " pod="kuadrant-system/authorino-759f9fb8f4-89n2w"
Apr 19 12:21:43.082514 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:43.082493 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnkgm\" (UniqueName: \"kubernetes.io/projected/62fe26ea-15ad-4eea-8880-6a2db0a245e8-kube-api-access-dnkgm\") pod \"authorino-759f9fb8f4-89n2w\" (UID: \"62fe26ea-15ad-4eea-8880-6a2db0a245e8\") " pod="kuadrant-system/authorino-759f9fb8f4-89n2w"
Apr 19 12:21:43.281804 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:43.281781 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-759f9fb8f4-89n2w"
Apr 19 12:21:43.395728 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:43.395699 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-759f9fb8f4-89n2w"]
Apr 19 12:21:43.400510 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:21:43.400486 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod62fe26ea_15ad_4eea_8880_6a2db0a245e8.slice/crio-2326c60c9c34e364306c84ba91473aca92ccf45488c36012dfba60d834970b9f WatchSource:0}: Error finding container 2326c60c9c34e364306c84ba91473aca92ccf45488c36012dfba60d834970b9f: Status 404 returned error can't find the container with id 2326c60c9c34e364306c84ba91473aca92ccf45488c36012dfba60d834970b9f
Apr 19 12:21:43.401664 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:43.401648 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 19 12:21:43.514558 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:43.514527 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-759f9fb8f4-89n2w" event={"ID":"62fe26ea-15ad-4eea-8880-6a2db0a245e8","Type":"ContainerStarted","Data":"2326c60c9c34e364306c84ba91473aca92ccf45488c36012dfba60d834970b9f"}
Apr 19 12:21:44.518592 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:44.518559 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-759f9fb8f4-89n2w" event={"ID":"62fe26ea-15ad-4eea-8880-6a2db0a245e8","Type":"ContainerStarted","Data":"d68b901cfbe211e72d70a05b1b95903ea4677cdcdc2875495921cac491342e31"}
Apr 19 12:21:44.534258 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:44.534215 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-759f9fb8f4-89n2w" podStartSLOduration=1.975323971 podStartE2EDuration="2.534202724s" podCreationTimestamp="2026-04-19 12:21:42 +0000 UTC" firstStartedPulling="2026-04-19 12:21:43.401790196 +0000 UTC m=+717.468962067" lastFinishedPulling="2026-04-19 12:21:43.960668948 +0000 UTC m=+718.027840820" observedRunningTime="2026-04-19 12:21:44.532679419 +0000 UTC m=+718.599851309" watchObservedRunningTime="2026-04-19 12:21:44.534202724 +0000 UTC m=+718.601374615"
Apr 19 12:21:44.557575 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:44.557545 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-65675b9fc9-twgfg"]
Apr 19 12:21:44.557737 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:44.557718 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-65675b9fc9-twgfg" podUID="12995c5b-d230-4480-a9e6-0f9121efb8c8" containerName="authorino" containerID="cri-o://f6718d9f67ed97583f745fd60c0c12aa5c002b97c07645b06616638e4869cf98" gracePeriod=30
Apr 19 12:21:44.792197 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:44.792177 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-65675b9fc9-twgfg"
Apr 19 12:21:44.887831 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:44.887802 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qx9hv\" (UniqueName: \"kubernetes.io/projected/12995c5b-d230-4480-a9e6-0f9121efb8c8-kube-api-access-qx9hv\") pod \"12995c5b-d230-4480-a9e6-0f9121efb8c8\" (UID: \"12995c5b-d230-4480-a9e6-0f9121efb8c8\") "
Apr 19 12:21:44.888003 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:44.887877 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/12995c5b-d230-4480-a9e6-0f9121efb8c8-tls-cert\") pod \"12995c5b-d230-4480-a9e6-0f9121efb8c8\" (UID: \"12995c5b-d230-4480-a9e6-0f9121efb8c8\") "
Apr 19 12:21:44.888003 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:44.887906 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/12995c5b-d230-4480-a9e6-0f9121efb8c8-openshift-service-ca\") pod \"12995c5b-d230-4480-a9e6-0f9121efb8c8\" (UID: \"12995c5b-d230-4480-a9e6-0f9121efb8c8\") "
Apr 19 12:21:44.888391 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:44.888360 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/12995c5b-d230-4480-a9e6-0f9121efb8c8-openshift-service-ca" (OuterVolumeSpecName: "openshift-service-ca") pod "12995c5b-d230-4480-a9e6-0f9121efb8c8" (UID: "12995c5b-d230-4480-a9e6-0f9121efb8c8"). InnerVolumeSpecName "openshift-service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 19 12:21:44.889995 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:44.889961 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/12995c5b-d230-4480-a9e6-0f9121efb8c8-kube-api-access-qx9hv" (OuterVolumeSpecName: "kube-api-access-qx9hv") pod "12995c5b-d230-4480-a9e6-0f9121efb8c8" (UID: "12995c5b-d230-4480-a9e6-0f9121efb8c8"). InnerVolumeSpecName "kube-api-access-qx9hv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 19 12:21:44.896904 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:44.896881 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/12995c5b-d230-4480-a9e6-0f9121efb8c8-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "12995c5b-d230-4480-a9e6-0f9121efb8c8" (UID: "12995c5b-d230-4480-a9e6-0f9121efb8c8"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 19 12:21:44.989020 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:44.988997 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qx9hv\" (UniqueName: \"kubernetes.io/projected/12995c5b-d230-4480-a9e6-0f9121efb8c8-kube-api-access-qx9hv\") on node \"ip-10-0-140-237.ec2.internal\" DevicePath \"\""
Apr 19 12:21:44.989020 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:44.989019 2572 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/12995c5b-d230-4480-a9e6-0f9121efb8c8-tls-cert\") on node \"ip-10-0-140-237.ec2.internal\" DevicePath \"\""
Apr 19 12:21:44.989139 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:44.989030 2572 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/12995c5b-d230-4480-a9e6-0f9121efb8c8-openshift-service-ca\") on node \"ip-10-0-140-237.ec2.internal\" DevicePath \"\""
Apr 19 12:21:45.522004 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:45.521970 2572 generic.go:358] "Generic (PLEG): container finished" podID="12995c5b-d230-4480-a9e6-0f9121efb8c8" containerID="f6718d9f67ed97583f745fd60c0c12aa5c002b97c07645b06616638e4869cf98" exitCode=0
Apr 19 12:21:45.522389 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:45.522041 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-65675b9fc9-twgfg"
Apr 19 12:21:45.522389 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:45.522055 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-65675b9fc9-twgfg" event={"ID":"12995c5b-d230-4480-a9e6-0f9121efb8c8","Type":"ContainerDied","Data":"f6718d9f67ed97583f745fd60c0c12aa5c002b97c07645b06616638e4869cf98"}
Apr 19 12:21:45.522389 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:45.522093 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-65675b9fc9-twgfg" event={"ID":"12995c5b-d230-4480-a9e6-0f9121efb8c8","Type":"ContainerDied","Data":"46ff2f73d954cb34088224c52e8d2d3ace6e2ac67322cd7d24eb2a2318bf9659"}
Apr 19 12:21:45.522389 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:45.522114 2572 scope.go:117] "RemoveContainer" containerID="f6718d9f67ed97583f745fd60c0c12aa5c002b97c07645b06616638e4869cf98"
Apr 19 12:21:45.530012 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:45.529992 2572 scope.go:117] "RemoveContainer" containerID="f6718d9f67ed97583f745fd60c0c12aa5c002b97c07645b06616638e4869cf98"
Apr 19 12:21:45.530247 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:21:45.530230 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f6718d9f67ed97583f745fd60c0c12aa5c002b97c07645b06616638e4869cf98\": container with ID starting with f6718d9f67ed97583f745fd60c0c12aa5c002b97c07645b06616638e4869cf98 not found: ID does not exist" containerID="f6718d9f67ed97583f745fd60c0c12aa5c002b97c07645b06616638e4869cf98"
Apr 19 12:21:45.530304 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:45.530255 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f6718d9f67ed97583f745fd60c0c12aa5c002b97c07645b06616638e4869cf98"} err="failed to get container status \"f6718d9f67ed97583f745fd60c0c12aa5c002b97c07645b06616638e4869cf98\": rpc error: code = NotFound desc = could not find container \"f6718d9f67ed97583f745fd60c0c12aa5c002b97c07645b06616638e4869cf98\": container with ID starting with f6718d9f67ed97583f745fd60c0c12aa5c002b97c07645b06616638e4869cf98 not found: ID does not exist"
Apr 19 12:21:45.541690 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:45.541671 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-65675b9fc9-twgfg"]
Apr 19 12:21:45.549739 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:45.549716 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-65675b9fc9-twgfg"]
Apr 19 12:21:46.496711 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:21:46.496677 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="12995c5b-d230-4480-a9e6-0f9121efb8c8" path="/var/lib/kubelet/pods/12995c5b-d230-4480-a9e6-0f9121efb8c8/volumes"
Apr 19 12:23:09.906102 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:23:09.906026 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-54dc75dff7-kzq4l"]
Apr 19 12:23:09.906570 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:23:09.906310 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="12995c5b-d230-4480-a9e6-0f9121efb8c8" containerName="authorino"
Apr 19 12:23:09.906570 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:23:09.906320 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="12995c5b-d230-4480-a9e6-0f9121efb8c8" containerName="authorino"
Apr 19 12:23:09.906570 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:23:09.906372 2572 memory_manager.go:356] "RemoveStaleState removing state"
podUID="12995c5b-d230-4480-a9e6-0f9121efb8c8" containerName="authorino" Apr 19 12:23:09.909010 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:23:09.908993 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-54dc75dff7-kzq4l" Apr 19 12:23:09.911385 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:23:09.911363 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-t49hl\"" Apr 19 12:23:09.916103 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:23:09.916073 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-54dc75dff7-kzq4l"] Apr 19 12:23:09.967803 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:23:09.967753 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8njh\" (UniqueName: \"kubernetes.io/projected/18ee9c5d-eed7-4d50-84a8-6bb5b4207c4d-kube-api-access-r8njh\") pod \"maas-controller-54dc75dff7-kzq4l\" (UID: \"18ee9c5d-eed7-4d50-84a8-6bb5b4207c4d\") " pod="opendatahub/maas-controller-54dc75dff7-kzq4l" Apr 19 12:23:10.068474 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:23:10.068450 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r8njh\" (UniqueName: \"kubernetes.io/projected/18ee9c5d-eed7-4d50-84a8-6bb5b4207c4d-kube-api-access-r8njh\") pod \"maas-controller-54dc75dff7-kzq4l\" (UID: \"18ee9c5d-eed7-4d50-84a8-6bb5b4207c4d\") " pod="opendatahub/maas-controller-54dc75dff7-kzq4l" Apr 19 12:23:10.075826 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:23:10.075798 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8njh\" (UniqueName: \"kubernetes.io/projected/18ee9c5d-eed7-4d50-84a8-6bb5b4207c4d-kube-api-access-r8njh\") pod \"maas-controller-54dc75dff7-kzq4l\" (UID: \"18ee9c5d-eed7-4d50-84a8-6bb5b4207c4d\") " pod="opendatahub/maas-controller-54dc75dff7-kzq4l" Apr 19 
12:23:10.219657 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:23:10.219585 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-54dc75dff7-kzq4l" Apr 19 12:23:10.332708 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:23:10.332687 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-54dc75dff7-kzq4l"] Apr 19 12:23:10.335229 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:23:10.335190 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod18ee9c5d_eed7_4d50_84a8_6bb5b4207c4d.slice/crio-6fdbd8fbe0e3f30633472cce13dff2ed01667926f2ab4be3dd6229c8122fbb4a WatchSource:0}: Error finding container 6fdbd8fbe0e3f30633472cce13dff2ed01667926f2ab4be3dd6229c8122fbb4a: Status 404 returned error can't find the container with id 6fdbd8fbe0e3f30633472cce13dff2ed01667926f2ab4be3dd6229c8122fbb4a Apr 19 12:23:10.748650 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:23:10.748620 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-54dc75dff7-kzq4l" event={"ID":"18ee9c5d-eed7-4d50-84a8-6bb5b4207c4d","Type":"ContainerStarted","Data":"6fdbd8fbe0e3f30633472cce13dff2ed01667926f2ab4be3dd6229c8122fbb4a"} Apr 19 12:23:12.757648 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:23:12.757605 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-54dc75dff7-kzq4l" event={"ID":"18ee9c5d-eed7-4d50-84a8-6bb5b4207c4d","Type":"ContainerStarted","Data":"33b5bb5e05e4261a53b8fd250dfb3990b3e863cd6c55398ccc67bbb3e6a2816b"} Apr 19 12:23:12.758001 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:23:12.757731 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-54dc75dff7-kzq4l" Apr 19 12:23:12.772415 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:23:12.772376 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="opendatahub/maas-controller-54dc75dff7-kzq4l" podStartSLOduration=1.4549191000000001 podStartE2EDuration="3.772363591s" podCreationTimestamp="2026-04-19 12:23:09 +0000 UTC" firstStartedPulling="2026-04-19 12:23:10.336545643 +0000 UTC m=+804.403717512" lastFinishedPulling="2026-04-19 12:23:12.653990135 +0000 UTC m=+806.721162003" observedRunningTime="2026-04-19 12:23:12.77084147 +0000 UTC m=+806.838013361" watchObservedRunningTime="2026-04-19 12:23:12.772363591 +0000 UTC m=+806.839535482" Apr 19 12:23:23.765235 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:23:23.765206 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-54dc75dff7-kzq4l" Apr 19 12:30:00.132614 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:30:00.132578 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-key-cleanup-29610030-mxffw"] Apr 19 12:30:00.135923 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:30:00.135908 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29610030-mxffw" Apr 19 12:30:00.137109 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:30:00.137090 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzvgk\" (UniqueName: \"kubernetes.io/projected/8d1d6812-54b9-41a6-9b65-a84d02b6dd15-kube-api-access-lzvgk\") pod \"maas-api-key-cleanup-29610030-mxffw\" (UID: \"8d1d6812-54b9-41a6-9b65-a84d02b6dd15\") " pod="opendatahub/maas-api-key-cleanup-29610030-mxffw" Apr 19 12:30:00.138467 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:30:00.138442 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-7sgxc\"" Apr 19 12:30:00.151991 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:30:00.151968 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29610030-mxffw"] Apr 19 12:30:00.237810 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:30:00.237785 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzvgk\" (UniqueName: \"kubernetes.io/projected/8d1d6812-54b9-41a6-9b65-a84d02b6dd15-kube-api-access-lzvgk\") pod \"maas-api-key-cleanup-29610030-mxffw\" (UID: \"8d1d6812-54b9-41a6-9b65-a84d02b6dd15\") " pod="opendatahub/maas-api-key-cleanup-29610030-mxffw" Apr 19 12:30:00.244890 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:30:00.244873 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzvgk\" (UniqueName: \"kubernetes.io/projected/8d1d6812-54b9-41a6-9b65-a84d02b6dd15-kube-api-access-lzvgk\") pod \"maas-api-key-cleanup-29610030-mxffw\" (UID: \"8d1d6812-54b9-41a6-9b65-a84d02b6dd15\") " pod="opendatahub/maas-api-key-cleanup-29610030-mxffw" Apr 19 12:30:00.446044 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:30:00.445984 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29610030-mxffw" Apr 19 12:30:00.561275 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:30:00.561251 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-key-cleanup-29610030-mxffw"] Apr 19 12:30:00.564010 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:30:00.563984 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d1d6812_54b9_41a6_9b65_a84d02b6dd15.slice/crio-2da21aff93c2d804a41ef797d115f615abe00ff6ce90d7a646dc1d2b28510f4a WatchSource:0}: Error finding container 2da21aff93c2d804a41ef797d115f615abe00ff6ce90d7a646dc1d2b28510f4a: Status 404 returned error can't find the container with id 2da21aff93c2d804a41ef797d115f615abe00ff6ce90d7a646dc1d2b28510f4a Apr 19 12:30:00.565726 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:30:00.565709 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 19 12:30:00.875087 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:30:00.875055 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610030-mxffw" event={"ID":"8d1d6812-54b9-41a6-9b65-a84d02b6dd15","Type":"ContainerStarted","Data":"2da21aff93c2d804a41ef797d115f615abe00ff6ce90d7a646dc1d2b28510f4a"} Apr 19 12:30:01.879357 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:30:01.879320 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610030-mxffw" event={"ID":"8d1d6812-54b9-41a6-9b65-a84d02b6dd15","Type":"ContainerStarted","Data":"f976b0907301863f79a5a8424d000d4d6f6b780cd94a7c3417a5d8518549ee15"} Apr 19 12:30:01.894893 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:30:01.894840 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-key-cleanup-29610030-mxffw" podStartSLOduration=1.451445699 podStartE2EDuration="1.894823944s" 
podCreationTimestamp="2026-04-19 12:30:00 +0000 UTC" firstStartedPulling="2026-04-19 12:30:00.565859581 +0000 UTC m=+1214.633031450" lastFinishedPulling="2026-04-19 12:30:01.009237826 +0000 UTC m=+1215.076409695" observedRunningTime="2026-04-19 12:30:01.894151151 +0000 UTC m=+1215.961323044" watchObservedRunningTime="2026-04-19 12:30:01.894823944 +0000 UTC m=+1215.961995836" Apr 19 12:30:21.941200 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:30:21.941110 2572 generic.go:358] "Generic (PLEG): container finished" podID="8d1d6812-54b9-41a6-9b65-a84d02b6dd15" containerID="f976b0907301863f79a5a8424d000d4d6f6b780cd94a7c3417a5d8518549ee15" exitCode=6 Apr 19 12:30:21.941567 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:30:21.941193 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610030-mxffw" event={"ID":"8d1d6812-54b9-41a6-9b65-a84d02b6dd15","Type":"ContainerDied","Data":"f976b0907301863f79a5a8424d000d4d6f6b780cd94a7c3417a5d8518549ee15"} Apr 19 12:30:21.941567 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:30:21.941506 2572 scope.go:117] "RemoveContainer" containerID="f976b0907301863f79a5a8424d000d4d6f6b780cd94a7c3417a5d8518549ee15" Apr 19 12:30:22.945971 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:30:22.945936 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610030-mxffw" event={"ID":"8d1d6812-54b9-41a6-9b65-a84d02b6dd15","Type":"ContainerStarted","Data":"a31d2dc02c30ceebfd8017cabeb4eb8b931231384ca390d55cc09b965ccaa70b"} Apr 19 12:30:43.007047 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:30:43.007015 2572 generic.go:358] "Generic (PLEG): container finished" podID="8d1d6812-54b9-41a6-9b65-a84d02b6dd15" containerID="a31d2dc02c30ceebfd8017cabeb4eb8b931231384ca390d55cc09b965ccaa70b" exitCode=6 Apr 19 12:30:43.007519 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:30:43.007087 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610030-mxffw" 
event={"ID":"8d1d6812-54b9-41a6-9b65-a84d02b6dd15","Type":"ContainerDied","Data":"a31d2dc02c30ceebfd8017cabeb4eb8b931231384ca390d55cc09b965ccaa70b"} Apr 19 12:30:43.007519 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:30:43.007128 2572 scope.go:117] "RemoveContainer" containerID="f976b0907301863f79a5a8424d000d4d6f6b780cd94a7c3417a5d8518549ee15" Apr 19 12:30:43.007519 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:30:43.007478 2572 scope.go:117] "RemoveContainer" containerID="a31d2dc02c30ceebfd8017cabeb4eb8b931231384ca390d55cc09b965ccaa70b" Apr 19 12:30:43.007722 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:30:43.007700 2572 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cleanup\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cleanup pod=maas-api-key-cleanup-29610030-mxffw_opendatahub(8d1d6812-54b9-41a6-9b65-a84d02b6dd15)\"" pod="opendatahub/maas-api-key-cleanup-29610030-mxffw" podUID="8d1d6812-54b9-41a6-9b65-a84d02b6dd15" Apr 19 12:30:53.493287 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:30:53.493256 2572 scope.go:117] "RemoveContainer" containerID="a31d2dc02c30ceebfd8017cabeb4eb8b931231384ca390d55cc09b965ccaa70b" Apr 19 12:30:54.042453 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:30:54.042423 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610030-mxffw" event={"ID":"8d1d6812-54b9-41a6-9b65-a84d02b6dd15","Type":"ContainerStarted","Data":"0b6565d48aef7f5fef6420b2319d7524386bcded86d8b2798e381d3a92f95499"} Apr 19 12:30:54.517146 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:30:54.517112 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29610030-mxffw"] Apr 19 12:30:55.046462 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:30:55.046423 2572 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-key-cleanup-29610030-mxffw" podUID="8d1d6812-54b9-41a6-9b65-a84d02b6dd15" 
containerName="cleanup" containerID="cri-o://0b6565d48aef7f5fef6420b2319d7524386bcded86d8b2798e381d3a92f95499" gracePeriod=30 Apr 19 12:31:14.277665 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:31:14.277638 2572 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29610030-mxffw" Apr 19 12:31:14.358556 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:31:14.358464 2572 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzvgk\" (UniqueName: \"kubernetes.io/projected/8d1d6812-54b9-41a6-9b65-a84d02b6dd15-kube-api-access-lzvgk\") pod \"8d1d6812-54b9-41a6-9b65-a84d02b6dd15\" (UID: \"8d1d6812-54b9-41a6-9b65-a84d02b6dd15\") " Apr 19 12:31:14.360478 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:31:14.360446 2572 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d1d6812-54b9-41a6-9b65-a84d02b6dd15-kube-api-access-lzvgk" (OuterVolumeSpecName: "kube-api-access-lzvgk") pod "8d1d6812-54b9-41a6-9b65-a84d02b6dd15" (UID: "8d1d6812-54b9-41a6-9b65-a84d02b6dd15"). InnerVolumeSpecName "kube-api-access-lzvgk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 19 12:31:14.459741 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:31:14.459710 2572 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lzvgk\" (UniqueName: \"kubernetes.io/projected/8d1d6812-54b9-41a6-9b65-a84d02b6dd15-kube-api-access-lzvgk\") on node \"ip-10-0-140-237.ec2.internal\" DevicePath \"\"" Apr 19 12:31:15.104869 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:31:15.104829 2572 generic.go:358] "Generic (PLEG): container finished" podID="8d1d6812-54b9-41a6-9b65-a84d02b6dd15" containerID="0b6565d48aef7f5fef6420b2319d7524386bcded86d8b2798e381d3a92f95499" exitCode=6 Apr 19 12:31:15.105026 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:31:15.104880 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610030-mxffw" event={"ID":"8d1d6812-54b9-41a6-9b65-a84d02b6dd15","Type":"ContainerDied","Data":"0b6565d48aef7f5fef6420b2319d7524386bcded86d8b2798e381d3a92f95499"} Apr 19 12:31:15.105026 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:31:15.104911 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-key-cleanup-29610030-mxffw" event={"ID":"8d1d6812-54b9-41a6-9b65-a84d02b6dd15","Type":"ContainerDied","Data":"2da21aff93c2d804a41ef797d115f615abe00ff6ce90d7a646dc1d2b28510f4a"} Apr 19 12:31:15.105026 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:31:15.104931 2572 scope.go:117] "RemoveContainer" containerID="0b6565d48aef7f5fef6420b2319d7524386bcded86d8b2798e381d3a92f95499" Apr 19 12:31:15.105026 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:31:15.104932 2572 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-key-cleanup-29610030-mxffw" Apr 19 12:31:15.112401 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:31:15.112378 2572 scope.go:117] "RemoveContainer" containerID="a31d2dc02c30ceebfd8017cabeb4eb8b931231384ca390d55cc09b965ccaa70b" Apr 19 12:31:15.119116 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:31:15.119092 2572 scope.go:117] "RemoveContainer" containerID="0b6565d48aef7f5fef6420b2319d7524386bcded86d8b2798e381d3a92f95499" Apr 19 12:31:15.119390 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:31:15.119369 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b6565d48aef7f5fef6420b2319d7524386bcded86d8b2798e381d3a92f95499\": container with ID starting with 0b6565d48aef7f5fef6420b2319d7524386bcded86d8b2798e381d3a92f95499 not found: ID does not exist" containerID="0b6565d48aef7f5fef6420b2319d7524386bcded86d8b2798e381d3a92f95499" Apr 19 12:31:15.119481 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:31:15.119402 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b6565d48aef7f5fef6420b2319d7524386bcded86d8b2798e381d3a92f95499"} err="failed to get container status \"0b6565d48aef7f5fef6420b2319d7524386bcded86d8b2798e381d3a92f95499\": rpc error: code = NotFound desc = could not find container \"0b6565d48aef7f5fef6420b2319d7524386bcded86d8b2798e381d3a92f95499\": container with ID starting with 0b6565d48aef7f5fef6420b2319d7524386bcded86d8b2798e381d3a92f95499 not found: ID does not exist" Apr 19 12:31:15.119481 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:31:15.119427 2572 scope.go:117] "RemoveContainer" containerID="a31d2dc02c30ceebfd8017cabeb4eb8b931231384ca390d55cc09b965ccaa70b" Apr 19 12:31:15.119801 ip-10-0-140-237 kubenswrapper[2572]: E0419 12:31:15.119721 2572 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"a31d2dc02c30ceebfd8017cabeb4eb8b931231384ca390d55cc09b965ccaa70b\": container with ID starting with a31d2dc02c30ceebfd8017cabeb4eb8b931231384ca390d55cc09b965ccaa70b not found: ID does not exist" containerID="a31d2dc02c30ceebfd8017cabeb4eb8b931231384ca390d55cc09b965ccaa70b" Apr 19 12:31:15.119801 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:31:15.119790 2572 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a31d2dc02c30ceebfd8017cabeb4eb8b931231384ca390d55cc09b965ccaa70b"} err="failed to get container status \"a31d2dc02c30ceebfd8017cabeb4eb8b931231384ca390d55cc09b965ccaa70b\": rpc error: code = NotFound desc = could not find container \"a31d2dc02c30ceebfd8017cabeb4eb8b931231384ca390d55cc09b965ccaa70b\": container with ID starting with a31d2dc02c30ceebfd8017cabeb4eb8b931231384ca390d55cc09b965ccaa70b not found: ID does not exist" Apr 19 12:31:15.120486 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:31:15.120462 2572 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-key-cleanup-29610030-mxffw"] Apr 19 12:31:15.121772 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:31:15.121740 2572 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-key-cleanup-29610030-mxffw"] Apr 19 12:31:16.497006 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:31:16.496974 2572 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d1d6812-54b9-41a6-9b65-a84d02b6dd15" path="/var/lib/kubelet/pods/8d1d6812-54b9-41a6-9b65-a84d02b6dd15/volumes" Apr 19 12:44:04.319250 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:04.319219 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-759f9fb8f4-89n2w_62fe26ea-15ad-4eea-8880-6a2db0a245e8/authorino/0.log" Apr 19 12:44:08.011018 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:08.010920 2572 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_maas-controller-54dc75dff7-kzq4l_18ee9c5d-eed7-4d50-84a8-6bb5b4207c4d/manager/0.log" Apr 19 12:44:08.118030 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:08.117999 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-v9pnf_43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047/manager/2.log" Apr 19 12:44:08.233668 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:08.233632 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-676bcb86f4-67clv_15457b9f-fff7-48f5-ab7d-7e79bb792e93/manager/0.log" Apr 19 12:44:09.729529 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:09.729487 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-759f9fb8f4-89n2w_62fe26ea-15ad-4eea-8880-6a2db0a245e8/authorino/0.log" Apr 19 12:44:11.077261 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:11.077201 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-58497579d8-6w699_00f08ae1-8940-40d9-b49e-4c6be9ca6c92/kube-auth-proxy/0.log" Apr 19 12:44:15.860053 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:15.860021 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kxrl8/must-gather-kl8t9"] Apr 19 12:44:15.860430 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:15.860321 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d1d6812-54b9-41a6-9b65-a84d02b6dd15" containerName="cleanup" Apr 19 12:44:15.860430 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:15.860332 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d1d6812-54b9-41a6-9b65-a84d02b6dd15" containerName="cleanup" Apr 19 12:44:15.860430 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:15.860344 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d1d6812-54b9-41a6-9b65-a84d02b6dd15" containerName="cleanup" Apr 19 
12:44:15.860430 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:15.860350 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d1d6812-54b9-41a6-9b65-a84d02b6dd15" containerName="cleanup" Apr 19 12:44:15.860430 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:15.860358 2572 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d1d6812-54b9-41a6-9b65-a84d02b6dd15" containerName="cleanup" Apr 19 12:44:15.860430 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:15.860364 2572 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d1d6812-54b9-41a6-9b65-a84d02b6dd15" containerName="cleanup" Apr 19 12:44:15.860430 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:15.860411 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d1d6812-54b9-41a6-9b65-a84d02b6dd15" containerName="cleanup" Apr 19 12:44:15.860430 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:15.860418 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d1d6812-54b9-41a6-9b65-a84d02b6dd15" containerName="cleanup" Apr 19 12:44:15.860661 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:15.860508 2572 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d1d6812-54b9-41a6-9b65-a84d02b6dd15" containerName="cleanup" Apr 19 12:44:15.863283 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:15.863264 2572 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kxrl8/must-gather-kl8t9" Apr 19 12:44:15.865893 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:15.865875 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kxrl8\"/\"openshift-service-ca.crt\"" Apr 19 12:44:15.866998 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:15.866978 2572 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-kxrl8\"/\"default-dockercfg-qptrb\"" Apr 19 12:44:15.867091 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:15.866985 2572 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kxrl8\"/\"kube-root-ca.crt\"" Apr 19 12:44:15.877878 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:15.877859 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kxrl8/must-gather-kl8t9"] Apr 19 12:44:15.959131 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:15.959103 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9459eebf-a3ea-4303-bb9a-83d9d95879d7-must-gather-output\") pod \"must-gather-kl8t9\" (UID: \"9459eebf-a3ea-4303-bb9a-83d9d95879d7\") " pod="openshift-must-gather-kxrl8/must-gather-kl8t9" Apr 19 12:44:15.959228 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:15.959169 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z52x7\" (UniqueName: \"kubernetes.io/projected/9459eebf-a3ea-4303-bb9a-83d9d95879d7-kube-api-access-z52x7\") pod \"must-gather-kl8t9\" (UID: \"9459eebf-a3ea-4303-bb9a-83d9d95879d7\") " pod="openshift-must-gather-kxrl8/must-gather-kl8t9" Apr 19 12:44:16.060457 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:16.060427 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/9459eebf-a3ea-4303-bb9a-83d9d95879d7-must-gather-output\") pod \"must-gather-kl8t9\" (UID: \"9459eebf-a3ea-4303-bb9a-83d9d95879d7\") " pod="openshift-must-gather-kxrl8/must-gather-kl8t9"
Apr 19 12:44:16.060594 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:16.060483 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z52x7\" (UniqueName: \"kubernetes.io/projected/9459eebf-a3ea-4303-bb9a-83d9d95879d7-kube-api-access-z52x7\") pod \"must-gather-kl8t9\" (UID: \"9459eebf-a3ea-4303-bb9a-83d9d95879d7\") " pod="openshift-must-gather-kxrl8/must-gather-kl8t9"
Apr 19 12:44:16.060748 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:16.060727 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9459eebf-a3ea-4303-bb9a-83d9d95879d7-must-gather-output\") pod \"must-gather-kl8t9\" (UID: \"9459eebf-a3ea-4303-bb9a-83d9d95879d7\") " pod="openshift-must-gather-kxrl8/must-gather-kl8t9"
Apr 19 12:44:16.071331 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:16.071310 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z52x7\" (UniqueName: \"kubernetes.io/projected/9459eebf-a3ea-4303-bb9a-83d9d95879d7-kube-api-access-z52x7\") pod \"must-gather-kl8t9\" (UID: \"9459eebf-a3ea-4303-bb9a-83d9d95879d7\") " pod="openshift-must-gather-kxrl8/must-gather-kl8t9"
Apr 19 12:44:16.172682 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:16.172623 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kxrl8/must-gather-kl8t9"
Apr 19 12:44:16.285065 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:16.285022 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kxrl8/must-gather-kl8t9"]
Apr 19 12:44:16.287563 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:44:16.287535 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9459eebf_a3ea_4303_bb9a_83d9d95879d7.slice/crio-9a805ed14750a7e0305c803b20845be592d2b70349bc23a644e920b3b2718ee0 WatchSource:0}: Error finding container 9a805ed14750a7e0305c803b20845be592d2b70349bc23a644e920b3b2718ee0: Status 404 returned error can't find the container with id 9a805ed14750a7e0305c803b20845be592d2b70349bc23a644e920b3b2718ee0
Apr 19 12:44:16.289411 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:16.289392 2572 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 19 12:44:17.215987 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:17.215960 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kxrl8/must-gather-kl8t9" event={"ID":"9459eebf-a3ea-4303-bb9a-83d9d95879d7","Type":"ContainerStarted","Data":"9a805ed14750a7e0305c803b20845be592d2b70349bc23a644e920b3b2718ee0"}
Apr 19 12:44:18.223274 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:18.223233 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kxrl8/must-gather-kl8t9" event={"ID":"9459eebf-a3ea-4303-bb9a-83d9d95879d7","Type":"ContainerStarted","Data":"f8df8f04142b0ab569a7fbd45a69a8c1a112e77866726db085d36419d2205a3d"}
Apr 19 12:44:18.223274 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:18.223278 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kxrl8/must-gather-kl8t9" event={"ID":"9459eebf-a3ea-4303-bb9a-83d9d95879d7","Type":"ContainerStarted","Data":"ab3fe4c344469dae300c839b816f8df86663e615bd03930a5c62a6143884164a"}
Apr 19 12:44:18.239426 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:18.239368 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kxrl8/must-gather-kl8t9" podStartSLOduration=2.389417793 podStartE2EDuration="3.23934858s" podCreationTimestamp="2026-04-19 12:44:15 +0000 UTC" firstStartedPulling="2026-04-19 12:44:16.289610979 +0000 UTC m=+2070.356782861" lastFinishedPulling="2026-04-19 12:44:17.139541776 +0000 UTC m=+2071.206713648" observedRunningTime="2026-04-19 12:44:18.237452472 +0000 UTC m=+2072.304624367" watchObservedRunningTime="2026-04-19 12:44:18.23934858 +0000 UTC m=+2072.306520472"
Apr 19 12:44:18.653031 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:18.652996 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-pccgh_4bbc95aa-4b56-4ddc-ab2c-7b9abba9aa82/global-pull-secret-syncer/0.log"
Apr 19 12:44:18.797531 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:18.797497 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-wbtqf_a78a63a6-1b78-4253-87ed-341aaf44e16f/konnectivity-agent/0.log"
Apr 19 12:44:18.873104 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:18.873078 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-140-237.ec2.internal_5c7cf47197c65590d0f81e01fc4711d8/haproxy/0.log"
Apr 19 12:44:23.316421 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:23.316336 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-759f9fb8f4-89n2w_62fe26ea-15ad-4eea-8880-6a2db0a245e8/authorino/0.log"
Apr 19 12:44:25.377911 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:25.377880 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-p8pqx_8f904272-e411-4995-aacd-579497cea66e/node-exporter/0.log"
Apr 19 12:44:25.396495 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:25.396458 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-p8pqx_8f904272-e411-4995-aacd-579497cea66e/kube-rbac-proxy/0.log"
Apr 19 12:44:25.415030 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:25.415005 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-p8pqx_8f904272-e411-4995-aacd-579497cea66e/init-textfile/0.log"
Apr 19 12:44:27.195034 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:27.194996 2572 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kxrl8/perf-node-gather-daemonset-svzn7"]
Apr 19 12:44:27.201463 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:27.201437 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-svzn7"
Apr 19 12:44:27.207082 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:27.207055 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kxrl8/perf-node-gather-daemonset-svzn7"]
Apr 19 12:44:27.262263 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:27.262211 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3dc71a52-96e6-4a32-88a7-ae658c03ed6d-proc\") pod \"perf-node-gather-daemonset-svzn7\" (UID: \"3dc71a52-96e6-4a32-88a7-ae658c03ed6d\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-svzn7"
Apr 19 12:44:27.262441 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:27.262284 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3dc71a52-96e6-4a32-88a7-ae658c03ed6d-lib-modules\") pod \"perf-node-gather-daemonset-svzn7\" (UID: \"3dc71a52-96e6-4a32-88a7-ae658c03ed6d\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-svzn7"
Apr 19 12:44:27.262441 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:27.262329 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3dc71a52-96e6-4a32-88a7-ae658c03ed6d-podres\") pod \"perf-node-gather-daemonset-svzn7\" (UID: \"3dc71a52-96e6-4a32-88a7-ae658c03ed6d\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-svzn7"
Apr 19 12:44:27.262441 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:27.262378 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lz7vk\" (UniqueName: \"kubernetes.io/projected/3dc71a52-96e6-4a32-88a7-ae658c03ed6d-kube-api-access-lz7vk\") pod \"perf-node-gather-daemonset-svzn7\" (UID: \"3dc71a52-96e6-4a32-88a7-ae658c03ed6d\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-svzn7"
Apr 19 12:44:27.262615 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:27.262480 2572 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3dc71a52-96e6-4a32-88a7-ae658c03ed6d-sys\") pod \"perf-node-gather-daemonset-svzn7\" (UID: \"3dc71a52-96e6-4a32-88a7-ae658c03ed6d\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-svzn7"
Apr 19 12:44:27.363721 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:27.363678 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3dc71a52-96e6-4a32-88a7-ae658c03ed6d-proc\") pod \"perf-node-gather-daemonset-svzn7\" (UID: \"3dc71a52-96e6-4a32-88a7-ae658c03ed6d\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-svzn7"
Apr 19 12:44:27.363910 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:27.363740 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3dc71a52-96e6-4a32-88a7-ae658c03ed6d-lib-modules\") pod \"perf-node-gather-daemonset-svzn7\" (UID: \"3dc71a52-96e6-4a32-88a7-ae658c03ed6d\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-svzn7"
Apr 19 12:44:27.363910 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:27.363788 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3dc71a52-96e6-4a32-88a7-ae658c03ed6d-podres\") pod \"perf-node-gather-daemonset-svzn7\" (UID: \"3dc71a52-96e6-4a32-88a7-ae658c03ed6d\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-svzn7"
Apr 19 12:44:27.363910 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:27.363832 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lz7vk\" (UniqueName: \"kubernetes.io/projected/3dc71a52-96e6-4a32-88a7-ae658c03ed6d-kube-api-access-lz7vk\") pod \"perf-node-gather-daemonset-svzn7\" (UID: \"3dc71a52-96e6-4a32-88a7-ae658c03ed6d\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-svzn7"
Apr 19 12:44:27.363910 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:27.363889 2572 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3dc71a52-96e6-4a32-88a7-ae658c03ed6d-sys\") pod \"perf-node-gather-daemonset-svzn7\" (UID: \"3dc71a52-96e6-4a32-88a7-ae658c03ed6d\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-svzn7"
Apr 19 12:44:27.364134 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:27.363994 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3dc71a52-96e6-4a32-88a7-ae658c03ed6d-sys\") pod \"perf-node-gather-daemonset-svzn7\" (UID: \"3dc71a52-96e6-4a32-88a7-ae658c03ed6d\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-svzn7"
Apr 19 12:44:27.364134 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:27.364051 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3dc71a52-96e6-4a32-88a7-ae658c03ed6d-proc\") pod \"perf-node-gather-daemonset-svzn7\" (UID: \"3dc71a52-96e6-4a32-88a7-ae658c03ed6d\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-svzn7"
Apr 19 12:44:27.364241 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:27.364157 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3dc71a52-96e6-4a32-88a7-ae658c03ed6d-lib-modules\") pod \"perf-node-gather-daemonset-svzn7\" (UID: \"3dc71a52-96e6-4a32-88a7-ae658c03ed6d\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-svzn7"
Apr 19 12:44:27.364241 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:27.364227 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3dc71a52-96e6-4a32-88a7-ae658c03ed6d-podres\") pod \"perf-node-gather-daemonset-svzn7\" (UID: \"3dc71a52-96e6-4a32-88a7-ae658c03ed6d\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-svzn7"
Apr 19 12:44:27.372692 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:27.372662 2572 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lz7vk\" (UniqueName: \"kubernetes.io/projected/3dc71a52-96e6-4a32-88a7-ae658c03ed6d-kube-api-access-lz7vk\") pod \"perf-node-gather-daemonset-svzn7\" (UID: \"3dc71a52-96e6-4a32-88a7-ae658c03ed6d\") " pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-svzn7"
Apr 19 12:44:27.513130 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:27.513093 2572 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-svzn7"
Apr 19 12:44:27.669056 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:27.669020 2572 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kxrl8/perf-node-gather-daemonset-svzn7"]
Apr 19 12:44:27.673058 ip-10-0-140-237 kubenswrapper[2572]: W0419 12:44:27.673016 2572 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3dc71a52_96e6_4a32_88a7_ae658c03ed6d.slice/crio-62c0244693b6c340de2ecb3795c5fc92ec56f9a2aa095fd90182ca68998e09c0 WatchSource:0}: Error finding container 62c0244693b6c340de2ecb3795c5fc92ec56f9a2aa095fd90182ca68998e09c0: Status 404 returned error can't find the container with id 62c0244693b6c340de2ecb3795c5fc92ec56f9a2aa095fd90182ca68998e09c0
Apr 19 12:44:28.260317 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:28.260279 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-svzn7" event={"ID":"3dc71a52-96e6-4a32-88a7-ae658c03ed6d","Type":"ContainerStarted","Data":"b5a6cc1a8a4ca1a0286d9b3c48f1ac39898d153bd12891a2358a3f05330e7d14"}
Apr 19 12:44:28.260930 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:28.260898 2572 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-svzn7" event={"ID":"3dc71a52-96e6-4a32-88a7-ae658c03ed6d","Type":"ContainerStarted","Data":"62c0244693b6c340de2ecb3795c5fc92ec56f9a2aa095fd90182ca68998e09c0"}
Apr 19 12:44:28.261105 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:28.261087 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-svzn7"
Apr 19 12:44:28.275823 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:28.275754 2572 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-svzn7" podStartSLOduration=1.2757391820000001 podStartE2EDuration="1.275739182s" podCreationTimestamp="2026-04-19 12:44:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-19 12:44:28.274536156 +0000 UTC m=+2082.341708048" watchObservedRunningTime="2026-04-19 12:44:28.275739182 +0000 UTC m=+2082.342911073"
Apr 19 12:44:29.019302 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:29.019268 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-447kt_f8b2bdcd-f07f-423f-a401-b0743ef1671f/dns/0.log"
Apr 19 12:44:29.037973 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:29.037947 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-447kt_f8b2bdcd-f07f-423f-a401-b0743ef1671f/kube-rbac-proxy/0.log"
Apr 19 12:44:29.178566 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:29.178540 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-m897s_d6bccaec-31af-4abb-8335-372cc94da827/dns-node-resolver/0.log"
Apr 19 12:44:29.590311 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:29.590281 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-56f9f776b7-q295c_8a059cfb-bede-4767-80e4-ef7061d18b06/registry/0.log"
Apr 19 12:44:29.606163 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:29.606138 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-56f9f776b7-q295c_8a059cfb-bede-4767-80e4-ef7061d18b06/registry/1.log"
Apr 19 12:44:29.624540 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:29.624519 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mdl6z_5982f902-227e-4317-9969-a90aa85d5e40/node-ca/0.log"
Apr 19 12:44:30.527024 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:30.526999 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-58497579d8-6w699_00f08ae1-8940-40d9-b49e-4c6be9ca6c92/kube-auth-proxy/0.log"
Apr 19 12:44:31.109693 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:31.109661 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-h7xrv_105ea9fb-4797-44a9-a30b-6d58c96ae4d6/serve-healthcheck-canary/0.log"
Apr 19 12:44:31.554910 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:31.554881 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6s87n_42a63549-4ec6-4c51-bb2e-54c4888fadad/kube-rbac-proxy/0.log"
Apr 19 12:44:31.573644 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:31.573618 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6s87n_42a63549-4ec6-4c51-bb2e-54c4888fadad/exporter/0.log"
Apr 19 12:44:31.593068 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:31.593050 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6s87n_42a63549-4ec6-4c51-bb2e-54c4888fadad/extractor/0.log"
Apr 19 12:44:33.596465 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:33.596427 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-controller-54dc75dff7-kzq4l_18ee9c5d-eed7-4d50-84a8-6bb5b4207c4d/manager/0.log"
Apr 19 12:44:33.614577 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:33.614547 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-v9pnf_43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047/manager/1.log"
Apr 19 12:44:33.639249 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:33.639223 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-v9pnf_43b7f7a5-0f1e-4cb2-96ba-8fe84e6af047/manager/2.log"
Apr 19 12:44:33.696380 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:33.696346 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-676bcb86f4-67clv_15457b9f-fff7-48f5-ab7d-7e79bb792e93/manager/0.log"
Apr 19 12:44:34.836740 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:34.836708 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-688fc496d-xs6fd_ee6e6443-453b-4c15-804a-8ce3d9fdd1e4/manager/0.log"
Apr 19 12:44:35.279143 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:35.279113 2572 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-kxrl8/perf-node-gather-daemonset-svzn7"
Apr 19 12:44:40.955722 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:40.955691 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sqxd4_7944a8e8-84c1-4ed9-81a3-27e357098d48/kube-multus-additional-cni-plugins/0.log"
Apr 19 12:44:40.977680 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:40.977658 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sqxd4_7944a8e8-84c1-4ed9-81a3-27e357098d48/egress-router-binary-copy/0.log"
Apr 19 12:44:40.996108 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:40.996087 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sqxd4_7944a8e8-84c1-4ed9-81a3-27e357098d48/cni-plugins/0.log"
Apr 19 12:44:41.014157 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:41.014137 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sqxd4_7944a8e8-84c1-4ed9-81a3-27e357098d48/bond-cni-plugin/0.log"
Apr 19 12:44:41.034405 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:41.034381 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sqxd4_7944a8e8-84c1-4ed9-81a3-27e357098d48/routeoverride-cni/0.log"
Apr 19 12:44:41.052992 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:41.052969 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sqxd4_7944a8e8-84c1-4ed9-81a3-27e357098d48/whereabouts-cni-bincopy/0.log"
Apr 19 12:44:41.076526 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:41.076506 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-sqxd4_7944a8e8-84c1-4ed9-81a3-27e357098d48/whereabouts-cni/0.log"
Apr 19 12:44:41.104751 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:41.104719 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-cpvgj_afd1b409-8b1c-4984-996b-e66960d52ffc/kube-multus/0.log"
Apr 19 12:44:41.155627 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:41.155586 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-67dv9_2235efd3-6b65-47c7-acb5-eb9aa104beb5/network-metrics-daemon/0.log"
Apr 19 12:44:41.172091 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:41.172060 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-67dv9_2235efd3-6b65-47c7-acb5-eb9aa104beb5/kube-rbac-proxy/0.log"
Apr 19 12:44:42.555053 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:42.555026 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mfzf5_473da951-aee0-495d-9114-913f8c86e8e0/ovn-controller/0.log"
Apr 19 12:44:42.588259 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:42.588230 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mfzf5_473da951-aee0-495d-9114-913f8c86e8e0/ovn-acl-logging/0.log"
Apr 19 12:44:42.610976 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:42.610946 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mfzf5_473da951-aee0-495d-9114-913f8c86e8e0/kube-rbac-proxy-node/0.log"
Apr 19 12:44:42.630404 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:42.630378 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mfzf5_473da951-aee0-495d-9114-913f8c86e8e0/kube-rbac-proxy-ovn-metrics/0.log"
Apr 19 12:44:42.647256 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:42.647228 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mfzf5_473da951-aee0-495d-9114-913f8c86e8e0/northd/0.log"
Apr 19 12:44:42.664593 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:42.664572 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mfzf5_473da951-aee0-495d-9114-913f8c86e8e0/nbdb/0.log"
Apr 19 12:44:42.683202 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:42.683183 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mfzf5_473da951-aee0-495d-9114-913f8c86e8e0/sbdb/0.log"
Apr 19 12:44:42.866495 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:42.866421 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-mfzf5_473da951-aee0-495d-9114-913f8c86e8e0/ovnkube-controller/0.log"
Apr 19 12:44:43.842201 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:43.842167 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-2v66h_29c3436d-ad1d-4386-a9d8-894a3c87dbfc/network-check-target-container/0.log"
Apr 19 12:44:44.848007 ip-10-0-140-237 kubenswrapper[2572]: I0419 12:44:44.847977 2572 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-qbr6x_6799f9ef-a53a-465c-ae04-e93d43acaea2/iptables-alerter/0.log"