Apr 20 07:00:19.969963 ip-10-0-130-105 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 20 07:00:19.969976 ip-10-0-130-105 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 20 07:00:19.969985 ip-10-0-130-105 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 20 07:00:19.970280 ip-10-0-130-105 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 20 07:00:30.187197 ip-10-0-130-105 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 20 07:00:30.187214 ip-10-0-130-105 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 7bcbd8507ed84fcb85b51ad25ccb9a15 --
Apr 20 07:02:45.065917 ip-10-0-130-105 systemd[1]: Starting Kubernetes Kubelet...
Apr 20 07:02:45.508743 ip-10-0-130-105 kubenswrapper[2543]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 07:02:45.508743 ip-10-0-130-105 kubenswrapper[2543]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 20 07:02:45.508743 ip-10-0-130-105 kubenswrapper[2543]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 07:02:45.508743 ip-10-0-130-105 kubenswrapper[2543]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 20 07:02:45.508743 ip-10-0-130-105 kubenswrapper[2543]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 20 07:02:45.512103 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.511876 2543 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 20 07:02:45.518548 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518526 2543 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 07:02:45.518548 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518546 2543 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 07:02:45.518548 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518551 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 07:02:45.518769 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518557 2543 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 07:02:45.518769 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518562 2543 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 07:02:45.518769 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518566 2543 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 07:02:45.518769 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518570 2543 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 07:02:45.518769 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518574 2543 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 07:02:45.518769 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518577 2543 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 07:02:45.518769 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518581 2543 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 07:02:45.518769 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518584 2543 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 07:02:45.518769 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518589 2543 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 07:02:45.518769 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518592 2543 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 07:02:45.518769 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518596 2543 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 07:02:45.518769 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518601 2543 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 07:02:45.518769 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518605 2543 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 07:02:45.518769 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518609 2543 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 07:02:45.518769 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518613 2543 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 07:02:45.518769 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518617 2543 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 07:02:45.518769 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518621 2543 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 07:02:45.518769 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518626 2543 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 07:02:45.518769 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518631 2543 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 07:02:45.518769 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518635 2543 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 07:02:45.519570 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518654 2543 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 07:02:45.519570 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518657 2543 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 07:02:45.519570 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518662 2543 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 07:02:45.519570 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518667 2543 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 07:02:45.519570 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518672 2543 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 07:02:45.519570 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518676 2543 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 07:02:45.519570 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518682 2543 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 07:02:45.519570 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518689 2543 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 07:02:45.519570 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518694 2543 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 07:02:45.519570 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518698 2543 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 07:02:45.519570 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518702 2543 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 07:02:45.519570 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518706 2543 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 07:02:45.519570 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518711 2543 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 07:02:45.519570 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518716 2543 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 07:02:45.519570 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518720 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 07:02:45.519570 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518724 2543 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 07:02:45.519570 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518728 2543 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 07:02:45.519570 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518732 2543 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 07:02:45.519570 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518736 2543 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 07:02:45.520443 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518741 2543 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 07:02:45.520443 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518745 2543 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 07:02:45.520443 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518749 2543 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 07:02:45.520443 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518755 2543 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 07:02:45.520443 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518759 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 07:02:45.520443 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518764 2543 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 07:02:45.520443 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518767 2543 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 07:02:45.520443 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518771 2543 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 07:02:45.520443 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518775 2543 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 07:02:45.520443 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518780 2543 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 07:02:45.520443 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518784 2543 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 07:02:45.520443 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518789 2543 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 07:02:45.520443 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518793 2543 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 07:02:45.520443 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518797 2543 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 07:02:45.520443 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518801 2543 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 07:02:45.520443 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518806 2543 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 07:02:45.520443 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518810 2543 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 07:02:45.520443 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518815 2543 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 07:02:45.520443 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518819 2543 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 07:02:45.520443 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518823 2543 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 07:02:45.521298 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518827 2543 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 07:02:45.521298 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518831 2543 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 07:02:45.521298 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518836 2543 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 07:02:45.521298 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518840 2543 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 07:02:45.521298 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518845 2543 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 07:02:45.521298 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518849 2543 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 07:02:45.521298 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518854 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 07:02:45.521298 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518861 2543 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 07:02:45.521298 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518867 2543 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 07:02:45.521298 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518872 2543 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 07:02:45.521298 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518876 2543 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 07:02:45.521298 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518882 2543 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 07:02:45.521298 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518886 2543 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 07:02:45.521298 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518891 2543 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 07:02:45.521298 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518895 2543 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 07:02:45.521298 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518899 2543 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 07:02:45.521298 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518904 2543 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 07:02:45.521298 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518909 2543 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 07:02:45.521298 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518913 2543 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 07:02:45.521891 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518917 2543 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 07:02:45.521891 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518923 2543 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 07:02:45.521891 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518927 2543 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 07:02:45.521891 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518932 2543 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 07:02:45.521891 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.518936 2543 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 07:02:45.521891 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519551 2543 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 07:02:45.521891 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519560 2543 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 07:02:45.521891 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519565 2543 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 07:02:45.521891 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519571 2543 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 07:02:45.521891 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519577 2543 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 07:02:45.521891 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519582 2543 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 07:02:45.521891 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519587 2543 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 07:02:45.521891 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519591 2543 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 07:02:45.521891 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519596 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 07:02:45.521891 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519600 2543 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 07:02:45.521891 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519604 2543 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 07:02:45.521891 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519608 2543 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 07:02:45.521891 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519612 2543 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 07:02:45.521891 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519616 2543 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 07:02:45.522350 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519621 2543 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 07:02:45.522350 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519625 2543 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 07:02:45.522350 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519629 2543 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 07:02:45.522350 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519633 2543 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 07:02:45.522350 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519654 2543 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 07:02:45.522350 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519659 2543 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 07:02:45.522350 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519663 2543 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 07:02:45.522350 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519667 2543 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 07:02:45.522350 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519672 2543 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 07:02:45.522350 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519678 2543 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 07:02:45.522350 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519685 2543 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 07:02:45.522350 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519690 2543 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 07:02:45.522350 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519701 2543 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 07:02:45.522350 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519706 2543 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 07:02:45.522350 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519712 2543 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 07:02:45.522350 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519717 2543 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 07:02:45.522350 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519721 2543 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 07:02:45.522350 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519725 2543 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 07:02:45.522350 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519730 2543 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 07:02:45.522927 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519734 2543 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 07:02:45.522927 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519738 2543 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 07:02:45.522927 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519743 2543 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 07:02:45.522927 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519747 2543 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 07:02:45.522927 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519751 2543 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 07:02:45.522927 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519756 2543 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 07:02:45.522927 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519760 2543 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 07:02:45.522927 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519764 2543 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 07:02:45.522927 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519768 2543 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 07:02:45.522927 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519772 2543 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 07:02:45.522927 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519777 2543 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 07:02:45.522927 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519781 2543 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 07:02:45.522927 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519784 2543 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 07:02:45.522927 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519789 2543 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 07:02:45.522927 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519793 2543 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 07:02:45.522927 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519797 2543 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 07:02:45.522927 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519802 2543 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 07:02:45.522927 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519806 2543 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 07:02:45.522927 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519810 2543 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 07:02:45.522927 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519814 2543 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 07:02:45.522927 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519819 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 07:02:45.523710 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519822 2543 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 07:02:45.523710 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519826 2543 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 07:02:45.523710 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519830 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 07:02:45.523710 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519835 2543 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 07:02:45.523710 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519840 2543 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 07:02:45.523710 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519844 2543 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 07:02:45.523710 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519848 2543 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 07:02:45.523710 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519852 2543 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 07:02:45.523710 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519856 2543 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 07:02:45.523710 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519860 2543 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 07:02:45.523710 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519864 2543 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 07:02:45.523710 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519874 2543 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 07:02:45.523710 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519879 2543 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 07:02:45.523710 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519883 2543 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 07:02:45.523710 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519888 2543 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 07:02:45.523710 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519892 2543 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 07:02:45.523710 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519896 2543 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 07:02:45.523710 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519901 2543 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 07:02:45.523710 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519905 2543 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 07:02:45.523710 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519909 2543 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 07:02:45.524245 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519913 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 07:02:45.524245 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519918 2543 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 07:02:45.524245 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519922 2543 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 07:02:45.524245 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519926 2543 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 07:02:45.524245 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519930 2543 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 07:02:45.524245 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519935 2543 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 07:02:45.524245 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519939 2543 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 07:02:45.524245 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519942 2543 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 07:02:45.524245 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519946 2543 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 07:02:45.524245 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519952 2543 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 07:02:45.524245 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519956 2543 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 07:02:45.524245 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.519960 2543 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 07:02:45.524245 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520063 2543 flags.go:64] FLAG: --address="0.0.0.0"
Apr 20 07:02:45.524245 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520074 2543 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 20 07:02:45.524245 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520084 2543 flags.go:64] FLAG: --anonymous-auth="true"
Apr 20 07:02:45.524245 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520092 2543 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 20 07:02:45.524245 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520101 2543 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 20 07:02:45.524245 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520107 2543 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 20 07:02:45.524245 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520114 2543 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 20 07:02:45.524245 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520121 2543 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 20 07:02:45.524245 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520126 2543 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 20 07:02:45.524804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520131 2543 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 20 07:02:45.524804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520136 2543 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 20 07:02:45.524804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520144 2543 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 20 07:02:45.524804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520150 2543 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 20 07:02:45.524804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520154 2543 flags.go:64] FLAG: --cgroup-root=""
Apr 20 07:02:45.524804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520159 2543 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 20 07:02:45.524804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520164 2543 flags.go:64] FLAG: --client-ca-file=""
Apr 20 07:02:45.524804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520169 2543 flags.go:64] FLAG: --cloud-config=""
Apr 20 07:02:45.524804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520174 2543 flags.go:64] FLAG: --cloud-provider="external"
Apr 20 07:02:45.524804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520178 2543 flags.go:64] FLAG: --cluster-dns="[]"
Apr 20 07:02:45.524804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520186 2543 flags.go:64] FLAG: --cluster-domain=""
Apr 20 07:02:45.524804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520190 2543 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 20 07:02:45.524804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520195 2543 flags.go:64] FLAG: --config-dir=""
Apr 20 07:02:45.524804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520200 2543 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 20 07:02:45.524804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520206 2543 flags.go:64] FLAG: --container-log-max-files="5"
Apr 20 07:02:45.524804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520214 2543 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 20 07:02:45.524804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520219 2543 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 20 07:02:45.524804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520224 2543 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 20 07:02:45.524804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520230 2543 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 20 07:02:45.524804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520234 2543 flags.go:64] FLAG: --contention-profiling="false"
Apr 20 07:02:45.524804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520239 2543 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 20 07:02:45.524804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520244 2543 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 20 07:02:45.524804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520249 2543 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 20 07:02:45.524804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520254 2543 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 20 07:02:45.524804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520261 2543 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 20 07:02:45.525499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520266 2543 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 20 07:02:45.525499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520270 2543 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 20 07:02:45.525499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520275 2543 flags.go:64] FLAG: --enable-load-reader="false"
Apr 20 07:02:45.525499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520281 2543 flags.go:64] FLAG: --enable-server="true"
Apr 20 07:02:45.525499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520286 2543 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 20 07:02:45.525499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520293 2543 flags.go:64] FLAG: --event-burst="100"
Apr 20 07:02:45.525499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520298 2543 flags.go:64] FLAG: --event-qps="50"
Apr 20 07:02:45.525499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520304 2543 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 20 07:02:45.525499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520309 2543 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 20 07:02:45.525499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520315 2543 flags.go:64] FLAG: --eviction-hard=""
Apr 20 07:02:45.525499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520321 2543 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 20 07:02:45.525499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520326 2543 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 20 07:02:45.525499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520331 2543 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 20 07:02:45.525499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520336 2543 flags.go:64] FLAG: --eviction-soft=""
Apr 20 07:02:45.525499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520342 2543 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 20 07:02:45.525499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520347 2543 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 20 07:02:45.525499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520351 2543 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 20 07:02:45.525499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520356 2543 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 20 07:02:45.525499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520361 2543 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 20 07:02:45.525499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520366 2543 flags.go:64] FLAG: --fail-swap-on="true"
Apr 20 07:02:45.525499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520371 2543 flags.go:64] FLAG: --feature-gates=""
Apr 20 07:02:45.525499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520377 2543 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 20 07:02:45.525499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520383 2543 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 20 07:02:45.525499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520389 2543 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 20 07:02:45.525499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520394 2543 flags.go:64] FLAG:
--healthz-bind-address="127.0.0.1" Apr 20 07:02:45.526151 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520399 2543 flags.go:64] FLAG: --healthz-port="10248" Apr 20 07:02:45.526151 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520404 2543 flags.go:64] FLAG: --help="false" Apr 20 07:02:45.526151 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520408 2543 flags.go:64] FLAG: --hostname-override="ip-10-0-130-105.ec2.internal" Apr 20 07:02:45.526151 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520414 2543 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 20 07:02:45.526151 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520419 2543 flags.go:64] FLAG: --http-check-frequency="20s" Apr 20 07:02:45.526151 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520423 2543 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 20 07:02:45.526151 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520429 2543 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 20 07:02:45.526151 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520434 2543 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 20 07:02:45.526151 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520439 2543 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 20 07:02:45.526151 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520443 2543 flags.go:64] FLAG: --image-service-endpoint="" Apr 20 07:02:45.526151 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520448 2543 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 20 07:02:45.526151 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520453 2543 flags.go:64] FLAG: --kube-api-burst="100" Apr 20 07:02:45.526151 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520458 2543 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 20 07:02:45.526151 
ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520462 2543 flags.go:64] FLAG: --kube-api-qps="50" Apr 20 07:02:45.526151 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520467 2543 flags.go:64] FLAG: --kube-reserved="" Apr 20 07:02:45.526151 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520472 2543 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 20 07:02:45.526151 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520478 2543 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 20 07:02:45.526151 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520484 2543 flags.go:64] FLAG: --kubelet-cgroups="" Apr 20 07:02:45.526151 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520489 2543 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 20 07:02:45.526151 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520493 2543 flags.go:64] FLAG: --lock-file="" Apr 20 07:02:45.526151 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520498 2543 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 20 07:02:45.526151 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520503 2543 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 20 07:02:45.526151 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520507 2543 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 20 07:02:45.526151 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520516 2543 flags.go:64] FLAG: --log-json-split-stream="false" Apr 20 07:02:45.526796 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520521 2543 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 20 07:02:45.526796 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520526 2543 flags.go:64] FLAG: --log-text-split-stream="false" Apr 20 07:02:45.526796 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520531 2543 flags.go:64] FLAG: --logging-format="text" Apr 20 07:02:45.526796 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520535 2543 flags.go:64] FLAG: 
--machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 20 07:02:45.526796 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520541 2543 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 20 07:02:45.526796 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520547 2543 flags.go:64] FLAG: --manifest-url="" Apr 20 07:02:45.526796 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520551 2543 flags.go:64] FLAG: --manifest-url-header="" Apr 20 07:02:45.526796 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520559 2543 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 20 07:02:45.526796 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520564 2543 flags.go:64] FLAG: --max-open-files="1000000" Apr 20 07:02:45.526796 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520579 2543 flags.go:64] FLAG: --max-pods="110" Apr 20 07:02:45.526796 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520584 2543 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 20 07:02:45.526796 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520589 2543 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 20 07:02:45.526796 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520594 2543 flags.go:64] FLAG: --memory-manager-policy="None" Apr 20 07:02:45.526796 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520599 2543 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 20 07:02:45.526796 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520604 2543 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 20 07:02:45.526796 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520609 2543 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 20 07:02:45.526796 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520613 2543 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 20 07:02:45.526796 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520625 2543 flags.go:64] FLAG: 
--node-status-max-images="50" Apr 20 07:02:45.526796 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520630 2543 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 20 07:02:45.526796 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520635 2543 flags.go:64] FLAG: --oom-score-adj="-999" Apr 20 07:02:45.526796 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520660 2543 flags.go:64] FLAG: --pod-cidr="" Apr 20 07:02:45.526796 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520665 2543 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 20 07:02:45.526796 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520675 2543 flags.go:64] FLAG: --pod-manifest-path="" Apr 20 07:02:45.527342 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520680 2543 flags.go:64] FLAG: --pod-max-pids="-1" Apr 20 07:02:45.527342 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520684 2543 flags.go:64] FLAG: --pods-per-core="0" Apr 20 07:02:45.527342 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520691 2543 flags.go:64] FLAG: --port="10250" Apr 20 07:02:45.527342 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520696 2543 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 20 07:02:45.527342 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520701 2543 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-071714f1a30dc0157" Apr 20 07:02:45.527342 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520727 2543 flags.go:64] FLAG: --qos-reserved="" Apr 20 07:02:45.527342 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520734 2543 flags.go:64] FLAG: --read-only-port="10255" Apr 20 07:02:45.527342 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520740 2543 flags.go:64] FLAG: --register-node="true" Apr 20 07:02:45.527342 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520745 2543 flags.go:64] FLAG: --register-schedulable="true" 
Apr 20 07:02:45.527342 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520750 2543 flags.go:64] FLAG: --register-with-taints=""
Apr 20 07:02:45.527342 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520757 2543 flags.go:64] FLAG: --registry-burst="10"
Apr 20 07:02:45.527342 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520761 2543 flags.go:64] FLAG: --registry-qps="5"
Apr 20 07:02:45.527342 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520766 2543 flags.go:64] FLAG: --reserved-cpus=""
Apr 20 07:02:45.527342 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520770 2543 flags.go:64] FLAG: --reserved-memory=""
Apr 20 07:02:45.527342 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520778 2543 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Apr 20 07:02:45.527342 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520783 2543 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Apr 20 07:02:45.527342 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520788 2543 flags.go:64] FLAG: --rotate-certificates="false"
Apr 20 07:02:45.527342 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520794 2543 flags.go:64] FLAG: --rotate-server-certificates="false"
Apr 20 07:02:45.527342 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520798 2543 flags.go:64] FLAG: --runonce="false"
Apr 20 07:02:45.527342 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520803 2543 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Apr 20 07:02:45.527342 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520808 2543 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Apr 20 07:02:45.527342 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520813 2543 flags.go:64] FLAG: --seccomp-default="false"
Apr 20 07:02:45.527342 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520817 2543 flags.go:64] FLAG: --serialize-image-pulls="true"
Apr 20 07:02:45.527342 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520822 2543 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Apr 20 07:02:45.527342 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520827 2543 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Apr 20 07:02:45.527342 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520832 2543 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Apr 20 07:02:45.528036 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520837 2543 flags.go:64] FLAG: --storage-driver-password="root"
Apr 20 07:02:45.528036 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520842 2543 flags.go:64] FLAG: --storage-driver-secure="false"
Apr 20 07:02:45.528036 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520847 2543 flags.go:64] FLAG: --storage-driver-table="stats"
Apr 20 07:02:45.528036 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520851 2543 flags.go:64] FLAG: --storage-driver-user="root"
Apr 20 07:02:45.528036 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520856 2543 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Apr 20 07:02:45.528036 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520861 2543 flags.go:64] FLAG: --sync-frequency="1m0s"
Apr 20 07:02:45.528036 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520867 2543 flags.go:64] FLAG: --system-cgroups=""
Apr 20 07:02:45.528036 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520871 2543 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Apr 20 07:02:45.528036 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520883 2543 flags.go:64] FLAG: --system-reserved-cgroup=""
Apr 20 07:02:45.528036 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520887 2543 flags.go:64] FLAG: --tls-cert-file=""
Apr 20 07:02:45.528036 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520892 2543 flags.go:64] FLAG: --tls-cipher-suites="[]"
Apr 20 07:02:45.528036 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520899 2543 flags.go:64] FLAG: --tls-min-version=""
Apr 20 07:02:45.528036 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520904 2543 flags.go:64] FLAG: --tls-private-key-file=""
Apr 20 07:02:45.528036 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520909 2543 flags.go:64] FLAG: --topology-manager-policy="none"
Apr 20 07:02:45.528036 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520914 2543 flags.go:64] FLAG: --topology-manager-policy-options=""
Apr 20 07:02:45.528036 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520919 2543 flags.go:64] FLAG: --topology-manager-scope="container"
Apr 20 07:02:45.528036 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520923 2543 flags.go:64] FLAG: --v="2"
Apr 20 07:02:45.528036 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520930 2543 flags.go:64] FLAG: --version="false"
Apr 20 07:02:45.528036 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520936 2543 flags.go:64] FLAG: --vmodule=""
Apr 20 07:02:45.528036 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520942 2543 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Apr 20 07:02:45.528036 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.520950 2543 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Apr 20 07:02:45.528036 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521099 2543 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 07:02:45.528036 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521105 2543 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 20 07:02:45.528036 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521111 2543 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 20 07:02:45.528664 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521115 2543 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 07:02:45.528664 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521119 2543 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 20 07:02:45.528664 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521124 2543 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 07:02:45.528664 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521128 2543 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 20 07:02:45.528664 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521132 2543 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 07:02:45.528664 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521137 2543 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 20 07:02:45.528664 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521141 2543 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 07:02:45.528664 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521145 2543 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 07:02:45.528664 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521151 2543 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 20 07:02:45.528664 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521157 2543 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 20 07:02:45.528664 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521161 2543 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 20 07:02:45.528664 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521166 2543 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 20 07:02:45.528664 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521170 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 20 07:02:45.528664 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521174 2543 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 20 07:02:45.528664 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521178 2543 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 20 07:02:45.528664 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521183 2543 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 20 07:02:45.528664 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521189 2543 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 20 07:02:45.528664 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521193 2543 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 20 07:02:45.528664 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521198 2543 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 20 07:02:45.528664 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521202 2543 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 07:02:45.529180 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521206 2543 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 07:02:45.529180 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521210 2543 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 20 07:02:45.529180 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521214 2543 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 20 07:02:45.529180 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521218 2543 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 07:02:45.529180 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521222 2543 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 20 07:02:45.529180 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521227 2543 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 20 07:02:45.529180 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521231 2543 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 07:02:45.529180 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521236 2543 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 20 07:02:45.529180 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521245 2543 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 07:02:45.529180 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521250 2543 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 07:02:45.529180 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521254 2543 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 20 07:02:45.529180 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521258 2543 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 20 07:02:45.529180 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521262 2543 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 20 07:02:45.529180 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521266 2543 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 20 07:02:45.529180 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521271 2543 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 07:02:45.529180 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521275 2543 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 07:02:45.529180 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521280 2543 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 20 07:02:45.529180 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521284 2543 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 20 07:02:45.529180 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521288 2543 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 20 07:02:45.529180 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521292 2543 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 07:02:45.529697 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521296 2543 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 07:02:45.529697 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521300 2543 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 20 07:02:45.529697 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521303 2543 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 07:02:45.529697 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521307 2543 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 07:02:45.529697 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521311 2543 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 20 07:02:45.529697 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521315 2543 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 07:02:45.529697 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521320 2543 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 07:02:45.529697 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521324 2543 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 20 07:02:45.529697 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521331 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 20 07:02:45.529697 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521335 2543 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 07:02:45.529697 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521339 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 20 07:02:45.529697 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521343 2543 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 20 07:02:45.529697 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521348 2543 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 20 07:02:45.529697 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521352 2543 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 07:02:45.529697 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521356 2543 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 20 07:02:45.529697 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521360 2543 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 20 07:02:45.529697 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521364 2543 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 07:02:45.529697 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521369 2543 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 07:02:45.529697 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521373 2543 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 20 07:02:45.529697 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521377 2543 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 07:02:45.530265 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521382 2543 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 20 07:02:45.530265 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521386 2543 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 20 07:02:45.530265 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521390 2543 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 20 07:02:45.530265 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521394 2543 feature_gate.go:328] unrecognized feature gate: Example2
Apr 20 07:02:45.530265 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521399 2543 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 20 07:02:45.530265 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521402 2543 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 20 07:02:45.530265 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521406 2543 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 20 07:02:45.530265 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521411 2543 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 20 07:02:45.530265 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521415 2543 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 07:02:45.530265 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521420 2543 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 20 07:02:45.530265 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521424 2543 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 07:02:45.530265 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521428 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 20 07:02:45.530265 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521432 2543 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 07:02:45.530265 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521436 2543 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 20 07:02:45.530265 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521441 2543 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 07:02:45.530265 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521445 2543 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 20 07:02:45.530265 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521451 2543 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 07:02:45.530265 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521457 2543 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 20 07:02:45.530265 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521462 2543 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 20 07:02:45.530265 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521467 2543 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 07:02:45.530773 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521474 2543 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 07:02:45.530773 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521479 2543 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 20 07:02:45.530773 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.521484 2543 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 20 07:02:45.530773 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.522659 2543 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 20 07:02:45.530773 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.529141 2543 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 20 07:02:45.530773 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.529157 2543 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 20 07:02:45.530773 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529206 2543 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 20 07:02:45.530773 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529211 2543 feature_gate.go:328] unrecognized feature gate: Example
Apr 20 07:02:45.530773 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529215 2543 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 20 07:02:45.530773 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529218 2543 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 20 07:02:45.530773 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529221 2543 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 20 07:02:45.530773 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529225 2543 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 20 07:02:45.530773 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529227 2543 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 20 07:02:45.530773 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529230 2543 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 20 07:02:45.530773 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529234 2543 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 20 07:02:45.531148 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529239 2543 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 20 07:02:45.531148 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529241 2543 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 20 07:02:45.531148 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529244 2543 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 20 07:02:45.531148 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529247 2543 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 20 07:02:45.531148 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529250 2543 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 20 07:02:45.531148 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529252 2543 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 20 07:02:45.531148 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529255 2543 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 20 07:02:45.531148 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529258 2543 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 20 07:02:45.531148 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529261 2543 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 20 07:02:45.531148 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529264 2543 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 20 07:02:45.531148 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529270 2543 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 20 07:02:45.531148 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529274 2543 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 20 07:02:45.531148 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529277 2543 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 20 07:02:45.531148 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529280 2543 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 20 07:02:45.531148 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529282 2543 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 20 07:02:45.531148 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529285 2543 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 20 07:02:45.531148 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529287 2543 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 20 07:02:45.531148 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529290 2543 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 20 07:02:45.531148 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529292 2543 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 20 07:02:45.531148 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529295 2543 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 20 07:02:45.531628 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529298 2543 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 20 07:02:45.531628 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529301 2543 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 20 07:02:45.531628 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529305 2543 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 20 07:02:45.531628 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529308 2543 feature_gate.go:328] unrecognized feature
gate: NewOLMPreflightPermissionChecks Apr 20 07:02:45.531628 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529311 2543 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 07:02:45.531628 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529313 2543 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 07:02:45.531628 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529316 2543 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 07:02:45.531628 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529318 2543 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 07:02:45.531628 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529321 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 07:02:45.531628 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529323 2543 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 07:02:45.531628 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529326 2543 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 07:02:45.531628 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529329 2543 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 07:02:45.531628 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529332 2543 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 07:02:45.531628 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529334 2543 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 07:02:45.531628 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529337 2543 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 07:02:45.531628 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529339 2543 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 07:02:45.531628 
ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529342 2543 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 07:02:45.531628 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529345 2543 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 07:02:45.531628 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529347 2543 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 07:02:45.531628 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529350 2543 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 07:02:45.532173 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529352 2543 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 07:02:45.532173 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529355 2543 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 07:02:45.532173 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529357 2543 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 07:02:45.532173 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529360 2543 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 07:02:45.532173 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529363 2543 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 07:02:45.532173 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529366 2543 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 07:02:45.532173 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529368 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 07:02:45.532173 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529371 2543 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 07:02:45.532173 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529373 2543 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 
07:02:45.532173 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529376 2543 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 07:02:45.532173 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529378 2543 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 07:02:45.532173 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529381 2543 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 07:02:45.532173 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529384 2543 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 07:02:45.532173 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529386 2543 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 07:02:45.532173 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529390 2543 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 07:02:45.532173 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529393 2543 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 07:02:45.532173 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529395 2543 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 07:02:45.532173 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529398 2543 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 07:02:45.532173 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529400 2543 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 07:02:45.532173 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529403 2543 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 07:02:45.532680 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529406 2543 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 07:02:45.532680 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529410 2543 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 07:02:45.532680 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529413 2543 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 07:02:45.532680 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529415 2543 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 07:02:45.532680 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529418 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 07:02:45.532680 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529420 2543 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 07:02:45.532680 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529423 2543 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 07:02:45.532680 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529425 2543 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 07:02:45.532680 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529427 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 07:02:45.532680 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529430 2543 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 07:02:45.532680 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529432 2543 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 07:02:45.532680 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529435 2543 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 07:02:45.532680 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529437 2543 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 07:02:45.532680 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529440 
2543 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 07:02:45.532680 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529442 2543 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 07:02:45.532680 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529445 2543 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 07:02:45.532680 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529448 2543 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 07:02:45.533097 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.529453 2543 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 07:02:45.533097 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529550 2543 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 20 07:02:45.533097 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529555 2543 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 20 07:02:45.533097 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529558 2543 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 20 07:02:45.533097 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529561 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 20 07:02:45.533097 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529564 2543 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 20 07:02:45.533097 ip-10-0-130-105 kubenswrapper[2543]: W0420 
07:02:45.529566 2543 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 20 07:02:45.533097 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529569 2543 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 20 07:02:45.533097 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529572 2543 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 20 07:02:45.533097 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529575 2543 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 20 07:02:45.533097 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529578 2543 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 20 07:02:45.533097 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529581 2543 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 20 07:02:45.533097 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529584 2543 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 20 07:02:45.533097 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529587 2543 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 20 07:02:45.533097 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529589 2543 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 20 07:02:45.533473 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529592 2543 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 20 07:02:45.533473 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529595 2543 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 20 07:02:45.533473 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529597 2543 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 20 07:02:45.533473 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529599 2543 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 20 
07:02:45.533473 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529602 2543 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 20 07:02:45.533473 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529605 2543 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 20 07:02:45.533473 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529607 2543 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 20 07:02:45.533473 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529610 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 20 07:02:45.533473 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529612 2543 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 20 07:02:45.533473 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529615 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 20 07:02:45.533473 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529618 2543 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 20 07:02:45.533473 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529620 2543 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 20 07:02:45.533473 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529623 2543 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 20 07:02:45.533473 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529625 2543 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 20 07:02:45.533473 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529628 2543 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 20 07:02:45.533473 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529631 2543 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 20 07:02:45.533473 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529635 2543 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 20 07:02:45.533473 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529655 2543 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 20 07:02:45.533473 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529658 2543 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 20 07:02:45.533473 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529660 2543 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 20 07:02:45.533982 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529663 2543 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 20 07:02:45.533982 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529666 2543 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 20 07:02:45.533982 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529669 2543 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 20 07:02:45.533982 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529671 2543 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 20 07:02:45.533982 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529674 2543 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 20 07:02:45.533982 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529677 2543 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 20 07:02:45.533982 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529680 2543 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 20 07:02:45.533982 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529682 2543 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 20 07:02:45.533982 ip-10-0-130-105 kubenswrapper[2543]: W0420 
07:02:45.529685 2543 feature_gate.go:328] unrecognized feature gate: Example2 Apr 20 07:02:45.533982 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529688 2543 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 20 07:02:45.533982 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529691 2543 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 20 07:02:45.533982 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529694 2543 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 20 07:02:45.533982 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529696 2543 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 20 07:02:45.533982 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529699 2543 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 20 07:02:45.533982 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529704 2543 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 20 07:02:45.533982 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529708 2543 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 20 07:02:45.533982 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529712 2543 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 20 07:02:45.533982 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529716 2543 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 20 07:02:45.533982 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529720 2543 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 20 07:02:45.533982 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529724 2543 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 20 07:02:45.534467 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529728 2543 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 20 07:02:45.534467 
ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529732 2543 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 20 07:02:45.534467 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529737 2543 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 20 07:02:45.534467 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529741 2543 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 20 07:02:45.534467 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529745 2543 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 20 07:02:45.534467 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529749 2543 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 20 07:02:45.534467 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529753 2543 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 20 07:02:45.534467 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529757 2543 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 20 07:02:45.534467 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529762 2543 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 20 07:02:45.534467 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529766 2543 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 20 07:02:45.534467 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529770 2543 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 20 07:02:45.534467 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529775 2543 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 20 07:02:45.534467 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529779 2543 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 20 07:02:45.534467 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529784 2543 
feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 20 07:02:45.534467 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529788 2543 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 20 07:02:45.534467 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529794 2543 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 20 07:02:45.534467 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529798 2543 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 20 07:02:45.534467 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529802 2543 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 20 07:02:45.534467 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529807 2543 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 20 07:02:45.534467 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529811 2543 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 20 07:02:45.535048 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529816 2543 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 20 07:02:45.535048 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529820 2543 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 20 07:02:45.535048 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529826 2543 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 20 07:02:45.535048 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529830 2543 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 20 07:02:45.535048 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529835 2543 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 20 07:02:45.535048 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529839 2543 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 20 07:02:45.535048 ip-10-0-130-105 
kubenswrapper[2543]: W0420 07:02:45.529846 2543 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 20 07:02:45.535048 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529852 2543 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 20 07:02:45.535048 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529857 2543 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 20 07:02:45.535048 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529861 2543 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 20 07:02:45.535048 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529866 2543 feature_gate.go:328] unrecognized feature gate: Example Apr 20 07:02:45.535048 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:45.529870 2543 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 20 07:02:45.535048 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.529877 2543 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 20 07:02:45.535048 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.530683 2543 server.go:962] "Client rotation is on, will bootstrap in background" Apr 20 07:02:45.535048 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.533257 2543 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 20 07:02:45.535446 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.534122 2543 server.go:1019] "Starting 
client certificate rotation" Apr 20 07:02:45.535446 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.534225 2543 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 07:02:45.535446 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.534268 2543 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 20 07:02:45.560334 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.560317 2543 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 07:02:45.563943 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.563915 2543 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 20 07:02:45.581381 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.581363 2543 log.go:25] "Validated CRI v1 runtime API" Apr 20 07:02:45.586697 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.586679 2543 log.go:25] "Validated CRI v1 image API" Apr 20 07:02:45.589087 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.589061 2543 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 20 07:02:45.590370 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.590354 2543 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 20 07:02:45.593812 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.593785 2543 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 9244f0f6-8353-458b-9950-df6ba947db09:/dev/nvme0n1p4 a12b4ccb-b9ea-4f75-be3d-2733834d187f:/dev/nvme0n1p3] Apr 20 07:02:45.593884 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.593811 2543 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 
blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 20 07:02:45.599885 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.599776 2543 manager.go:217] Machine: {Timestamp:2026-04-20 07:02:45.597557484 +0000 UTC m=+0.417625811 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3024981 MemoryCapacity:33164500992 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec28c9f3a0a9643e8618650a834ab513 SystemUUID:ec28c9f3-a0a9-643e-8618-650a834ab513 BootID:7bcbd850-7ed8-4fcb-85b5-1ad25ccb9a15 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582250496 Type:vfs Inodes:4048401 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:40:d0:e2:a1:45 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:40:d0:e2:a1:45 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:06:db:39:60:d9:7c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164500992 HugePages:[{PageSize:1048576 
NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 20 07:02:45.599885 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.599885 2543 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 20 07:02:45.600017 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.600006 2543 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 20 07:02:45.600996 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.600973 2543 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 20 07:02:45.601133 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.600998 2543 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-130-105.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 20 07:02:45.601178 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.601142 2543 topology_manager.go:138] "Creating topology manager with none policy"
Apr 20 07:02:45.601178 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.601150 2543 container_manager_linux.go:306] "Creating device plugin manager"
Apr 20 07:02:45.601178 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.601162 2543 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 07:02:45.601985 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.601974 2543 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 20 07:02:45.603605 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.603595 2543 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 07:02:45.603740 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.603731 2543 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 20 07:02:45.606971 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.606962 2543 kubelet.go:491] "Attempting to sync node with API server"
Apr 20 07:02:45.607018 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.606980 2543 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 20 07:02:45.607018 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.606995 2543 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 20 07:02:45.607018 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.607005 2543 kubelet.go:397] "Adding apiserver pod source"
Apr 20 07:02:45.607018 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.607014 2543 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 20 07:02:45.608054 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.608042 2543 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 07:02:45.608096 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.608060 2543 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 20 07:02:45.612873 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.612846 2543 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8l7pk"
Apr 20 07:02:45.613759 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.613743 2543 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 20 07:02:45.615165 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.615148 2543 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 20 07:02:45.616803 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.616787 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 20 07:02:45.616803 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.616806 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 20 07:02:45.616913 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.616813 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 20 07:02:45.616913 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.616819 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 20 07:02:45.616913 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.616824 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 20 07:02:45.616913 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.616829 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 20 07:02:45.616913 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.616835 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 20 07:02:45.616913 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.616840 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 20 07:02:45.616913 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.616848 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 20 07:02:45.616913 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.616854 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 20 07:02:45.616913 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.616862 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 20 07:02:45.616913 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.616871 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 20 07:02:45.617654 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.617628 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 20 07:02:45.617654 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.617655 2543 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 20 07:02:45.617881 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:45.617858 2543 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 20 07:02:45.617989 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:45.617971 2543 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-130-105.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 20 07:02:45.620819 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.620795 2543 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8l7pk"
Apr 20 07:02:45.621810 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.621795 2543 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-130-105.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 20 07:02:45.622350 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.622337 2543 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 20 07:02:45.622402 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.622380 2543 server.go:1295] "Started kubelet"
Apr 20 07:02:45.622472 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.622449 2543 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 20 07:02:45.622548 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.622510 2543 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 20 07:02:45.622602 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.622562 2543 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 20 07:02:45.623309 ip-10-0-130-105 systemd[1]: Started Kubernetes Kubelet.
Apr 20 07:02:45.623908 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.623897 2543 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 20 07:02:45.625146 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.625130 2543 server.go:317] "Adding debug handlers to kubelet server"
Apr 20 07:02:45.631709 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.631679 2543 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 20 07:02:45.631831 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.631813 2543 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 20 07:02:45.632582 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.632564 2543 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 20 07:02:45.632680 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.632586 2543 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 20 07:02:45.632735 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.632701 2543 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 20 07:02:45.632788 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.632754 2543 reconstruct.go:97] "Volume reconstruction finished"
Apr 20 07:02:45.632788 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.632761 2543 reconciler.go:26] "Reconciler: start to sync state"
Apr 20 07:02:45.633244 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:45.633209 2543 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-105.ec2.internal\" not found"
Apr 20 07:02:45.633432 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.633418 2543 factory.go:153] Registering CRI-O factory
Apr 20 07:02:45.633564 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.633536 2543 factory.go:223] Registration of the crio container factory successfully
Apr 20 07:02:45.633683 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.633596 2543 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 20 07:02:45.633683 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:45.633498 2543 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 20 07:02:45.633683 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.633607 2543 factory.go:55] Registering systemd factory
Apr 20 07:02:45.633683 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.633632 2543 factory.go:223] Registration of the systemd container factory successfully
Apr 20 07:02:45.633683 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.633676 2543 factory.go:103] Registering Raw factory
Apr 20 07:02:45.633903 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.633697 2543 manager.go:1196] Started watching for new ooms in manager
Apr 20 07:02:45.634164 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.634151 2543 manager.go:319] Starting recovery of all containers
Apr 20 07:02:45.635047 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.635027 2543 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 07:02:45.638479 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:45.638458 2543 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-130-105.ec2.internal\" not found" node="ip-10-0-130-105.ec2.internal"
Apr 20 07:02:45.644493 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.644470 2543 manager.go:324] Recovery completed
Apr 20 07:02:45.649102 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.649089 2543 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 07:02:45.652233 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.652217 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-105.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 07:02:45.652322 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.652244 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-105.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 07:02:45.652322 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.652254 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-105.ec2.internal" event="NodeHasSufficientPID"
Apr 20 07:02:45.652754 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.652737 2543 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 20 07:02:45.652834 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.652753 2543 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 20 07:02:45.652834 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.652772 2543 state_mem.go:36] "Initialized new in-memory state store"
Apr 20 07:02:45.654817 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.654803 2543 policy_none.go:49] "None policy: Start"
Apr 20 07:02:45.654892 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.654822 2543 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 20 07:02:45.654892 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.654835 2543 state_mem.go:35] "Initializing new in-memory state store"
Apr 20 07:02:45.694994 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.694960 2543 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 20 07:02:45.700939 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.695208 2543 manager.go:341] "Starting Device Plugin manager"
Apr 20 07:02:45.700939 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:45.695263 2543 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 20 07:02:45.700939 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.695273 2543 server.go:85] "Starting device plugin registration server"
Apr 20 07:02:45.700939 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.695558 2543 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 20 07:02:45.700939 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.695568 2543 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 20 07:02:45.700939 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.695711 2543 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 20 07:02:45.700939 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.695779 2543 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 20 07:02:45.700939 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.695786 2543 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 20 07:02:45.700939 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.696484 2543 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 20 07:02:45.700939 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.696508 2543 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 20 07:02:45.700939 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.696538 2543 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 20 07:02:45.700939 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.696548 2543 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 20 07:02:45.700939 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:45.696610 2543 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 20 07:02:45.700939 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:45.697096 2543 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 20 07:02:45.700939 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:45.697147 2543 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-130-105.ec2.internal\" not found"
Apr 20 07:02:45.700939 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.699203 2543 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 07:02:45.796693 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.796612 2543 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 07:02:45.796799 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.796690 2543 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-105.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-130-105.ec2.internal"]
Apr 20 07:02:45.796837 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.796814 2543 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 07:02:45.797558 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.797541 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-105.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 07:02:45.797558 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.797555 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-105.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 07:02:45.797665 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.797571 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-105.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 07:02:45.797665 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.797588 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-105.ec2.internal" event="NodeHasSufficientPID"
Apr 20 07:02:45.797729 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.797575 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-105.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 07:02:45.797729 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.797727 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-105.ec2.internal" event="NodeHasSufficientPID"
Apr 20 07:02:45.797799 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.797753 2543 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-130-105.ec2.internal"
Apr 20 07:02:45.799787 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.799774 2543 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 07:02:45.799921 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.799908 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-105.ec2.internal"
Apr 20 07:02:45.799965 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.799957 2543 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 07:02:45.800677 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.800655 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-105.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 07:02:45.800786 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.800685 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-105.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 07:02:45.800786 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.800656 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-105.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 07:02:45.800786 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.800725 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-105.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 07:02:45.800786 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.800743 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-105.ec2.internal" event="NodeHasSufficientPID"
Apr 20 07:02:45.800786 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.800697 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-105.ec2.internal" event="NodeHasSufficientPID"
Apr 20 07:02:45.802839 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.802822 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-105.ec2.internal"
Apr 20 07:02:45.802917 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.802851 2543 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 20 07:02:45.803448 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.803433 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-105.ec2.internal" event="NodeHasSufficientMemory"
Apr 20 07:02:45.803522 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.803458 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-105.ec2.internal" event="NodeHasNoDiskPressure"
Apr 20 07:02:45.803522 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.803473 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-105.ec2.internal" event="NodeHasSufficientPID"
Apr 20 07:02:45.808054 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.808035 2543 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-130-105.ec2.internal"
Apr 20 07:02:45.808121 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:45.808065 2543 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-130-105.ec2.internal\": node \"ip-10-0-130-105.ec2.internal\" not found"
Apr 20 07:02:45.817211 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:45.817190 2543 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-105.ec2.internal\" not found" node="ip-10-0-130-105.ec2.internal"
Apr 20 07:02:45.820804 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:45.820788 2543 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-130-105.ec2.internal\" not found" node="ip-10-0-130-105.ec2.internal"
Apr 20 07:02:45.830657 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:45.830629 2543 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-105.ec2.internal\" not found"
Apr 20 07:02:45.931491 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:45.931463 2543 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-105.ec2.internal\" not found"
Apr 20 07:02:45.933927 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.933909 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9830ed1fbf14c281a00b15700af487ba-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-105.ec2.internal\" (UID: \"9830ed1fbf14c281a00b15700af487ba\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-105.ec2.internal"
Apr 20 07:02:45.934032 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.933932 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9830ed1fbf14c281a00b15700af487ba-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-105.ec2.internal\" (UID: \"9830ed1fbf14c281a00b15700af487ba\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-105.ec2.internal"
Apr 20 07:02:45.934032 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:45.933948 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/486c974e6666e028518e848e7f47b828-config\") pod \"kube-apiserver-proxy-ip-10-0-130-105.ec2.internal\" (UID: \"486c974e6666e028518e848e7f47b828\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-105.ec2.internal"
Apr 20 07:02:46.031835 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:46.031808 2543 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-105.ec2.internal\" not found"
Apr 20 07:02:46.034044 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:46.034028 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9830ed1fbf14c281a00b15700af487ba-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-105.ec2.internal\" (UID: \"9830ed1fbf14c281a00b15700af487ba\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-105.ec2.internal"
Apr 20 07:02:46.034101 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:46.034055 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9830ed1fbf14c281a00b15700af487ba-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-105.ec2.internal\" (UID: \"9830ed1fbf14c281a00b15700af487ba\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-105.ec2.internal"
Apr 20 07:02:46.034101 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:46.034078 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/486c974e6666e028518e848e7f47b828-config\") pod \"kube-apiserver-proxy-ip-10-0-130-105.ec2.internal\" (UID: \"486c974e6666e028518e848e7f47b828\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-105.ec2.internal"
Apr 20 07:02:46.034161 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:46.034120 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/9830ed1fbf14c281a00b15700af487ba-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-130-105.ec2.internal\" (UID: \"9830ed1fbf14c281a00b15700af487ba\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-105.ec2.internal"
Apr 20 07:02:46.034161 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:46.034130 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/486c974e6666e028518e848e7f47b828-config\") pod \"kube-apiserver-proxy-ip-10-0-130-105.ec2.internal\" (UID: \"486c974e6666e028518e848e7f47b828\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-130-105.ec2.internal"
Apr 20 07:02:46.034223 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:46.034157 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9830ed1fbf14c281a00b15700af487ba-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-130-105.ec2.internal\" (UID: \"9830ed1fbf14c281a00b15700af487ba\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-105.ec2.internal"
Apr 20 07:02:46.119570 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:46.119507 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-105.ec2.internal"
Apr 20 07:02:46.123184 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:46.123167 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-105.ec2.internal"
Apr 20 07:02:46.132823 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:46.132806 2543 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-105.ec2.internal\" not found"
Apr 20 07:02:46.233198 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:46.233159 2543 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-105.ec2.internal\" not found"
Apr 20 07:02:46.333566 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:46.333529 2543 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-105.ec2.internal\" not found"
Apr 20 07:02:46.434019 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:46.433947 2543 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-105.ec2.internal\" not found"
Apr 20 07:02:46.463240 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:46.463216 2543 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 07:02:46.534921 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:46.534895 2543 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-105.ec2.internal\" not found"
Apr 20 07:02:46.534921 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:46.534915 2543 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 20 07:02:46.535536 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:46.535044 2543 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 07:02:46.535536 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:46.535076 2543 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 07:02:46.535536 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:46.535076 2543 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 20 07:02:46.622999 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:46.622702 2543 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-19 06:57:45 +0000 UTC" deadline="2027-12-04 04:41:47.05603571 +0000 UTC"
Apr 20 07:02:46.622999 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:46.622997 2543 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14229h39m0.433045056s"
Apr 20 07:02:46.632379 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:46.632359 2543 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 20 07:02:46.634983 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:46.634961 2543 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-105.ec2.internal\" not found"
Apr 20 07:02:46.637630 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:46.637603 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9830ed1fbf14c281a00b15700af487ba.slice/crio-49409220994faedd90210ffccffecd3b647319af7ecdafd13ebc1ca06de49bd4 WatchSource:0}: Error finding container 49409220994faedd90210ffccffecd3b647319af7ecdafd13ebc1ca06de49bd4: Status 404 returned error can't find the container with id 49409220994faedd90210ffccffecd3b647319af7ecdafd13ebc1ca06de49bd4
Apr 20 07:02:46.638100 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:46.638084 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod486c974e6666e028518e848e7f47b828.slice/crio-f5d1faafd953101254af691507eed42c377d6aadbd71a816e39c8665af3dab4c WatchSource:0}: Error finding container f5d1faafd953101254af691507eed42c377d6aadbd71a816e39c8665af3dab4c: Status 404 returned error can't find the container with id f5d1faafd953101254af691507eed42c377d6aadbd71a816e39c8665af3dab4c
Apr 20 07:02:46.643033 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:46.643017 2543 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 07:02:46.647564 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:46.647547 2543 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 20 07:02:46.674229 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:46.674209 2543 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-jszcb"
Apr 20 07:02:46.683190 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:46.683171 2543 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-jszcb"
Apr 20 07:02:46.700813 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:46.699458 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-105.ec2.internal" event={"ID":"486c974e6666e028518e848e7f47b828","Type":"ContainerStarted","Data":"f5d1faafd953101254af691507eed42c377d6aadbd71a816e39c8665af3dab4c"}
Apr 20 07:02:46.701780 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:46.701758 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-105.ec2.internal" event={"ID":"9830ed1fbf14c281a00b15700af487ba","Type":"ContainerStarted","Data":"49409220994faedd90210ffccffecd3b647319af7ecdafd13ebc1ca06de49bd4"}
Apr 20 07:02:46.735043 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:46.735018 2543 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-130-105.ec2.internal\" not found"
Apr 20 07:02:46.771576 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:46.771553 2543 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 07:02:46.832493 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:46.832465 2543 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-105.ec2.internal"
Apr 20 07:02:46.845218 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:46.845196 2543 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 07:02:46.846139 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:46.846126 2543 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-105.ec2.internal"
Apr 20 07:02:46.856040 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:46.856020 2543 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 20 07:02:47.608393 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.608352 2543 apiserver.go:52] "Watching apiserver"
Apr 20 07:02:47.618331 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.618304 2543 reflector.go:430] "Caches populated" type="*v1.Pod"
reflector="pkg/kubelet/config/apiserver.go:66" Apr 20 07:02:47.620319 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.620294 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-cqsc9","kube-system/kube-apiserver-proxy-ip-10-0-130-105.ec2.internal","openshift-cluster-node-tuning-operator/tuned-6mpqd","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-105.ec2.internal","openshift-multus/multus-j9xjk","openshift-multus/network-metrics-daemon-wgd5x","openshift-network-operator/iptables-alerter-7zzm4","openshift-ovn-kubernetes/ovnkube-node-5bfnd","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97","openshift-image-registry/node-ca-d7hcj","openshift-multus/multus-additional-cni-plugins-nfcgq","openshift-network-diagnostics/network-check-target-r9xht"] Apr 20 07:02:47.625260 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.625235 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-cqsc9" Apr 20 07:02:47.627464 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.627439 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.628446 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.628022 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 20 07:02:47.628446 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.628088 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-8kc5w\"" Apr 20 07:02:47.628446 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.628233 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 20 07:02:47.629605 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.629568 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-vkbg7\"" Apr 20 07:02:47.629605 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.629587 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 20 07:02:47.629779 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.629689 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 20 07:02:47.630384 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.629997 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-j9xjk" Apr 20 07:02:47.632851 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.632115 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 20 07:02:47.632851 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.632339 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-7zzm4" Apr 20 07:02:47.632851 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.632594 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 20 07:02:47.632851 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.632726 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-xqsng\"" Apr 20 07:02:47.632851 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.632795 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 20 07:02:47.633190 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.632931 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 20 07:02:47.635324 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.634974 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 20 07:02:47.635324 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.635239 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.635324 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.635305 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-q9tnl\"" Apr 20 07:02:47.635521 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.635334 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 20 07:02:47.635521 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.635304 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 20 07:02:47.637872 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.637853 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 20 07:02:47.637975 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.637948 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 20 07:02:47.638031 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.637993 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 20 07:02:47.638542 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.638522 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgd5x" Apr 20 07:02:47.638622 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:47.638596 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wgd5x" podUID="4ffbcc28-a10c-467a-a9e7-e31e20e4975e" Apr 20 07:02:47.638701 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.638625 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97" Apr 20 07:02:47.640073 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.640047 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 20 07:02:47.640507 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.640483 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-4x97x\"" Apr 20 07:02:47.640940 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.640923 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 20 07:02:47.642192 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.642154 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ffbcc28-a10c-467a-a9e7-e31e20e4975e-metrics-certs\") pod \"network-metrics-daemon-wgd5x\" (UID: \"4ffbcc28-a10c-467a-a9e7-e31e20e4975e\") " pod="openshift-multus/network-metrics-daemon-wgd5x" Apr 20 07:02:47.642303 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.642198 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-node-log\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.642303 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.642233 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-ovnkube-config\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.642303 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.642264 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-etc-kubernetes\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.642303 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.642296 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-os-release\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk" Apr 20 07:02:47.642499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.642326 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-host\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.642499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.642380 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-host-var-lib-cni-multus\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk" Apr 20 07:02:47.642499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.642405 2543 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-d7hcj" Apr 20 07:02:47.642499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.642418 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8328f0e4-59dc-4020-be13-793917ef3ef0-host-slash\") pod \"iptables-alerter-7zzm4\" (UID: \"8328f0e4-59dc-4020-be13-793917ef3ef0\") " pod="openshift-network-operator/iptables-alerter-7zzm4" Apr 20 07:02:47.642499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.642453 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-host-run-netns\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.642499 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.642486 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.642776 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.642513 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-etc-systemd\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.642776 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.642542 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-run\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.642776 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.642571 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-sys\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.642776 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.642598 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ddbe431f-4738-4893-8014-96e9ed8fedd5-tmp\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.642776 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.642628 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-multus-socket-dir-parent\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk" Apr 20 07:02:47.642776 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.642682 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkqnn\" (UniqueName: \"kubernetes.io/projected/4ffbcc28-a10c-467a-a9e7-e31e20e4975e-kube-api-access-qkqnn\") pod \"network-metrics-daemon-wgd5x\" (UID: \"4ffbcc28-a10c-467a-a9e7-e31e20e4975e\") " pod="openshift-multus/network-metrics-daemon-wgd5x" Apr 20 07:02:47.642776 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.642705 2543 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-host-kubelet\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.642776 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.642735 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6g44p\" (UniqueName: \"kubernetes.io/projected/8328f0e4-59dc-4020-be13-793917ef3ef0-kube-api-access-6g44p\") pod \"iptables-alerter-7zzm4\" (UID: \"8328f0e4-59dc-4020-be13-793917ef3ef0\") " pod="openshift-network-operator/iptables-alerter-7zzm4" Apr 20 07:02:47.642776 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.642763 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-etc-sysctl-conf\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.643157 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.642800 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-var-lib-openvswitch\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.643157 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.642880 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-run-ovn\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.643157 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.642915 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 20 07:02:47.643157 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.642916 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-env-overrides\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.643157 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.643077 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-ovnkube-script-lib\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.643157 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.643108 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-var-lib-kubelet\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.643157 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.643139 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6l8dd\" (UniqueName: \"kubernetes.io/projected/ddbe431f-4738-4893-8014-96e9ed8fedd5-kube-api-access-6l8dd\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 
07:02:47.643436 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.643169 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kxdf\" (UniqueName: \"kubernetes.io/projected/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-kube-api-access-7kxdf\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk" Apr 20 07:02:47.643436 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.643201 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-system-cni-dir\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk" Apr 20 07:02:47.643436 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.643229 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-host-var-lib-kubelet\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk" Apr 20 07:02:47.643436 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.643261 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-run-openvswitch\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.643436 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.643315 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9b6dda66-f1f4-4fee-8eb5-1f6447d2d431-agent-certs\") pod \"konnectivity-agent-cqsc9\" (UID: 
\"9b6dda66-f1f4-4fee-8eb5-1f6447d2d431\") " pod="kube-system/konnectivity-agent-cqsc9" Apr 20 07:02:47.643436 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.643346 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-etc-sysconfig\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.643436 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.643397 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-host-slash\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.643759 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.643483 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-log-socket\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.643759 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.643528 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-host-cni-netd\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.643759 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.643563 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-etc-modprobe-d\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.643759 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.643596 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-systemd-units\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.643759 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.643659 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9b6dda66-f1f4-4fee-8eb5-1f6447d2d431-konnectivity-ca\") pod \"konnectivity-agent-cqsc9\" (UID: \"9b6dda66-f1f4-4fee-8eb5-1f6447d2d431\") " pod="kube-system/konnectivity-agent-cqsc9" Apr 20 07:02:47.643759 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.643715 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-cnibin\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk" Apr 20 07:02:47.644008 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.643768 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-host-run-netns\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk" Apr 20 07:02:47.644008 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.643803 2543 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-hostroot\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.644008 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.643840 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-multus-conf-dir\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.644008 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.643881 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-run-systemd\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd"
Apr 20 07:02:47.644008 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.643913 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-ovn-node-metrics-cert\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd"
Apr 20 07:02:47.644008 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.643939 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-lib-modules\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd"
Apr 20 07:02:47.644008 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.643968 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-etc-sysctl-d\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd"
Apr 20 07:02:47.644008 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.644003 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-multus-cni-dir\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.644328 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.644033 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-cni-binary-copy\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.644328 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.644081 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-host-run-k8s-cni-cncf-io\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.644328 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.644122 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-multus-daemon-config\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.644328 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.644155 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-host-run-multus-certs\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.644328 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.644198 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-etc-openvswitch\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd"
Apr 20 07:02:47.644328 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.644225 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-host-var-lib-cni-bin\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.644328 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.644256 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-host-cni-bin\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd"
Apr 20 07:02:47.644328 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.644285 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqdbs\" (UniqueName: \"kubernetes.io/projected/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-kube-api-access-hqdbs\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd"
Apr 20 07:02:47.644672 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.644335 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ddbe431f-4738-4893-8014-96e9ed8fedd5-etc-tuned\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd"
Apr 20 07:02:47.644672 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.644375 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8328f0e4-59dc-4020-be13-793917ef3ef0-iptables-alerter-script\") pod \"iptables-alerter-7zzm4\" (UID: \"8328f0e4-59dc-4020-be13-793917ef3ef0\") " pod="openshift-network-operator/iptables-alerter-7zzm4"
Apr 20 07:02:47.644672 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.644407 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-etc-kubernetes\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.644672 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.644448 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-host-run-ovn-kubernetes\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd"
Apr 20 07:02:47.644857 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.644707 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nfcgq"
Apr 20 07:02:47.644983 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.644960 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-wj9m6\""
Apr 20 07:02:47.645169 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.645150 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 20 07:02:47.645246 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.645228 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 20 07:02:47.645659 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.645406 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 20 07:02:47.645659 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.645416 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 20 07:02:47.645659 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.645421 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 20 07:02:47.645659 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.645483 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 20 07:02:47.645659 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.645609 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-fwvms\""
Apr 20 07:02:47.647334 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.647312 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r9xht"
Apr 20 07:02:47.647439 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:47.647416 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r9xht" podUID="cb6de1e4-1d85-4764-a2ec-10a023567592"
Apr 20 07:02:47.648162 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.647714 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 20 07:02:47.648162 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.647737 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-8qzzl\""
Apr 20 07:02:47.648162 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.647960 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 20 07:02:47.683859 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.683823 2543 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 06:57:46 +0000 UTC" deadline="2028-01-17 20:37:15.43834112 +0000 UTC"
Apr 20 07:02:47.683859 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.683846 2543 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15301h34m27.754498047s"
Apr 20 07:02:47.724156 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.724132 2543 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 07:02:47.734331 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.734314 2543 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 20 07:02:47.744946 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.744918 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-systemd-units\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd"
Apr 20 07:02:47.745053 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.744958 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9b6dda66-f1f4-4fee-8eb5-1f6447d2d431-konnectivity-ca\") pod \"konnectivity-agent-cqsc9\" (UID: \"9b6dda66-f1f4-4fee-8eb5-1f6447d2d431\") " pod="kube-system/konnectivity-agent-cqsc9"
Apr 20 07:02:47.745053 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.744992 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3b7c7e9a-ffad-4d5e-a28c-965555ef617c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nfcgq\" (UID: \"3b7c7e9a-ffad-4d5e-a28c-965555ef617c\") " pod="openshift-multus/multus-additional-cni-plugins-nfcgq"
Apr 20 07:02:47.745053 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745018 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7bw9\" (UniqueName: \"kubernetes.io/projected/cb6de1e4-1d85-4764-a2ec-10a023567592-kube-api-access-b7bw9\") pod \"network-check-target-r9xht\" (UID: \"cb6de1e4-1d85-4764-a2ec-10a023567592\") " pod="openshift-network-diagnostics/network-check-target-r9xht"
Apr 20 07:02:47.745191 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745050 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-systemd-units\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd"
Apr 20 07:02:47.745191 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745046 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-cnibin\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.745191 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745113 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-host-run-netns\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.745191 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745120 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-cnibin\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.745191 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745138 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-hostroot\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.745191 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745159 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-host-run-netns\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.745412 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745195 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-multus-conf-dir\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.745412 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745221 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-run-systemd\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd"
Apr 20 07:02:47.745412 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745242 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-multus-conf-dir\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.745412 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745251 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-ovn-node-metrics-cert\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd"
Apr 20 07:02:47.745412 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745274 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-hostroot\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.745412 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745276 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-lib-modules\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd"
Apr 20 07:02:47.745412 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745294 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-run-systemd\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd"
Apr 20 07:02:47.745412 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745334 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmf25\" (UniqueName: \"kubernetes.io/projected/d3e8e640-f571-4cab-a03a-d7cf7eead7e5-kube-api-access-nmf25\") pod \"aws-ebs-csi-driver-node-tpz97\" (UID: \"d3e8e640-f571-4cab-a03a-d7cf7eead7e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97"
Apr 20 07:02:47.745412 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745398 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-lib-modules\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd"
Apr 20 07:02:47.745874 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745442 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3b7c7e9a-ffad-4d5e-a28c-965555ef617c-cni-binary-copy\") pod \"multus-additional-cni-plugins-nfcgq\" (UID: \"3b7c7e9a-ffad-4d5e-a28c-965555ef617c\") " pod="openshift-multus/multus-additional-cni-plugins-nfcgq"
Apr 20 07:02:47.745874 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745474 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-etc-sysctl-d\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd"
Apr 20 07:02:47.745874 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745513 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-multus-cni-dir\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.745874 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745537 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-cni-binary-copy\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.745874 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745559 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9b6dda66-f1f4-4fee-8eb5-1f6447d2d431-konnectivity-ca\") pod \"konnectivity-agent-cqsc9\" (UID: \"9b6dda66-f1f4-4fee-8eb5-1f6447d2d431\") " pod="kube-system/konnectivity-agent-cqsc9"
Apr 20 07:02:47.745874 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745563 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-host-run-k8s-cni-cncf-io\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.745874 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745602 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-multus-cni-dir\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.745874 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745602 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-host-run-k8s-cni-cncf-io\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.745874 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745624 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-multus-daemon-config\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.745874 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745668 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-etc-sysctl-d\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd"
Apr 20 07:02:47.745874 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745670 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-host-run-multus-certs\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.745874 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745695 2543 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 20 07:02:47.745874 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745715 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-etc-openvswitch\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd"
Apr 20 07:02:47.745874 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745736 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-host-var-lib-cni-bin\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.745874 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745752 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-host-cni-bin\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd"
Apr 20 07:02:47.745874 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745787 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqdbs\" (UniqueName: \"kubernetes.io/projected/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-kube-api-access-hqdbs\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd"
Apr 20 07:02:47.745874 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745806 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-host-var-lib-cni-bin\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.745874 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745808 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ddbe431f-4738-4893-8014-96e9ed8fedd5-etc-tuned\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd"
Apr 20 07:02:47.746676 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745836 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d3e8e640-f571-4cab-a03a-d7cf7eead7e5-device-dir\") pod \"aws-ebs-csi-driver-node-tpz97\" (UID: \"d3e8e640-f571-4cab-a03a-d7cf7eead7e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97"
Apr 20 07:02:47.746676 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745855 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8328f0e4-59dc-4020-be13-793917ef3ef0-iptables-alerter-script\") pod \"iptables-alerter-7zzm4\" (UID: \"8328f0e4-59dc-4020-be13-793917ef3ef0\") " pod="openshift-network-operator/iptables-alerter-7zzm4"
Apr 20 07:02:47.746676 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745875 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-etc-kubernetes\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.746676 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745703 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-host-run-multus-certs\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.746676 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745899 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-host-run-ovn-kubernetes\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd"
Apr 20 07:02:47.746676 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745931 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d3e8e640-f571-4cab-a03a-d7cf7eead7e5-etc-selinux\") pod \"aws-ebs-csi-driver-node-tpz97\" (UID: \"d3e8e640-f571-4cab-a03a-d7cf7eead7e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97"
Apr 20 07:02:47.746676 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745933 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-host-cni-bin\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd"
Apr 20 07:02:47.746676 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745956 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hjmz\" (UniqueName: \"kubernetes.io/projected/3b7c7e9a-ffad-4d5e-a28c-965555ef617c-kube-api-access-9hjmz\") pod \"multus-additional-cni-plugins-nfcgq\" (UID: \"3b7c7e9a-ffad-4d5e-a28c-965555ef617c\") " pod="openshift-multus/multus-additional-cni-plugins-nfcgq"
Apr 20 07:02:47.746676 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.745983 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ffbcc28-a10c-467a-a9e7-e31e20e4975e-metrics-certs\") pod \"network-metrics-daemon-wgd5x\" (UID: \"4ffbcc28-a10c-467a-a9e7-e31e20e4975e\") " pod="openshift-multus/network-metrics-daemon-wgd5x"
Apr 20 07:02:47.746676 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746010 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-node-log\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd"
Apr 20 07:02:47.746676 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746036 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-ovnkube-config\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd"
Apr 20 07:02:47.746676 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746052 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-etc-kubernetes\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.746676 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746057 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-etc-kubernetes\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd"
Apr 20 07:02:47.746676 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746104 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-etc-kubernetes\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd"
Apr 20 07:02:47.746676 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746109 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3b7c7e9a-ffad-4d5e-a28c-965555ef617c-system-cni-dir\") pod \"multus-additional-cni-plugins-nfcgq\" (UID: \"3b7c7e9a-ffad-4d5e-a28c-965555ef617c\") " pod="openshift-multus/multus-additional-cni-plugins-nfcgq"
Apr 20 07:02:47.746676 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746120 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-cni-binary-copy\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.746676 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746141 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-host-run-ovn-kubernetes\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd"
Apr 20 07:02:47.747427 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746143 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3b7c7e9a-ffad-4d5e-a28c-965555ef617c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-nfcgq\" (UID: \"3b7c7e9a-ffad-4d5e-a28c-965555ef617c\") " pod="openshift-multus/multus-additional-cni-plugins-nfcgq"
Apr 20 07:02:47.747427 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746167 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-multus-daemon-config\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.747427 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746179 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-os-release\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.747427 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746203 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-node-log\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd"
Apr 20 07:02:47.747427 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:47.746221 2543 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 07:02:47.747427 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746242 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-os-release\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.747427 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746256 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-etc-openvswitch\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd"
Apr 20 07:02:47.747427 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:47.746300 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ffbcc28-a10c-467a-a9e7-e31e20e4975e-metrics-certs podName:4ffbcc28-a10c-467a-a9e7-e31e20e4975e nodeName:}" failed. No retries permitted until 2026-04-20 07:02:48.24626934 +0000 UTC m=+3.066337681 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4ffbcc28-a10c-467a-a9e7-e31e20e4975e-metrics-certs") pod "network-metrics-daemon-wgd5x" (UID: "4ffbcc28-a10c-467a-a9e7-e31e20e4975e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 07:02:47.747427 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746327 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-host\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd"
Apr 20 07:02:47.747427 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746372 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5n5f\" (UniqueName: \"kubernetes.io/projected/dcda05ad-f9cc-4648-90c6-e224049a2518-kube-api-access-m5n5f\") pod \"node-ca-d7hcj\" (UID: \"dcda05ad-f9cc-4648-90c6-e224049a2518\") " pod="openshift-image-registry/node-ca-d7hcj"
Apr 20 07:02:47.747427 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746418 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-host-var-lib-cni-multus\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.747427 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746480 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8328f0e4-59dc-4020-be13-793917ef3ef0-host-slash\") pod \"iptables-alerter-7zzm4\" (UID: \"8328f0e4-59dc-4020-be13-793917ef3ef0\") " pod="openshift-network-operator/iptables-alerter-7zzm4"
Apr 20 07:02:47.747427 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746508 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-host-run-netns\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd"
Apr 20 07:02:47.747427 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746515 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-host-var-lib-cni-multus\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.747427 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746535 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd"
Apr 20 07:02:47.747427 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746565 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8328f0e4-59dc-4020-be13-793917ef3ef0-host-slash\") pod \"iptables-alerter-7zzm4\" (UID: \"8328f0e4-59dc-4020-be13-793917ef3ef0\") " pod="openshift-network-operator/iptables-alerter-7zzm4"
Apr 20 07:02:47.747427 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746593 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-etc-systemd\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd"
Apr 20 07:02:47.748061 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746620 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-run\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd"
Apr 20 07:02:47.748061 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746622 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd"
Apr 20 07:02:47.748061 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746601 2543 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-host-run-netns\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.748061 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746675 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-sys\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.748061 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746702 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ddbe431f-4738-4893-8014-96e9ed8fedd5-tmp\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.748061 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746712 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-etc-systemd\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.748061 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746723 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/8328f0e4-59dc-4020-be13-793917ef3ef0-iptables-alerter-script\") pod \"iptables-alerter-7zzm4\" (UID: \"8328f0e4-59dc-4020-be13-793917ef3ef0\") " pod="openshift-network-operator/iptables-alerter-7zzm4" Apr 20 07:02:47.748061 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746727 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-multus-socket-dir-parent\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk" Apr 20 07:02:47.748061 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746756 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkqnn\" (UniqueName: \"kubernetes.io/projected/4ffbcc28-a10c-467a-a9e7-e31e20e4975e-kube-api-access-qkqnn\") pod \"network-metrics-daemon-wgd5x\" (UID: \"4ffbcc28-a10c-467a-a9e7-e31e20e4975e\") " pod="openshift-multus/network-metrics-daemon-wgd5x" Apr 20 07:02:47.748061 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746759 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-run\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.748061 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746776 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-ovnkube-config\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.748061 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746780 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-host-kubelet\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.748061 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746810 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-multus-socket-dir-parent\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk" Apr 20 07:02:47.748061 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746780 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-sys\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.748061 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746844 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dcda05ad-f9cc-4648-90c6-e224049a2518-serviceca\") pod \"node-ca-d7hcj\" (UID: \"dcda05ad-f9cc-4648-90c6-e224049a2518\") " pod="openshift-image-registry/node-ca-d7hcj" Apr 20 07:02:47.748061 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746863 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-host-kubelet\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.748061 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746438 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-host\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.748061 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746873 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" 
(UniqueName: \"kubernetes.io/configmap/3b7c7e9a-ffad-4d5e-a28c-965555ef617c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nfcgq\" (UID: \"3b7c7e9a-ffad-4d5e-a28c-965555ef617c\") " pod="openshift-multus/multus-additional-cni-plugins-nfcgq" Apr 20 07:02:47.748797 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746923 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6g44p\" (UniqueName: \"kubernetes.io/projected/8328f0e4-59dc-4020-be13-793917ef3ef0-kube-api-access-6g44p\") pod \"iptables-alerter-7zzm4\" (UID: \"8328f0e4-59dc-4020-be13-793917ef3ef0\") " pod="openshift-network-operator/iptables-alerter-7zzm4" Apr 20 07:02:47.748797 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746957 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d3e8e640-f571-4cab-a03a-d7cf7eead7e5-registration-dir\") pod \"aws-ebs-csi-driver-node-tpz97\" (UID: \"d3e8e640-f571-4cab-a03a-d7cf7eead7e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97" Apr 20 07:02:47.748797 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.746984 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d3e8e640-f571-4cab-a03a-d7cf7eead7e5-sys-fs\") pod \"aws-ebs-csi-driver-node-tpz97\" (UID: \"d3e8e640-f571-4cab-a03a-d7cf7eead7e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97" Apr 20 07:02:47.748797 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.747006 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dcda05ad-f9cc-4648-90c6-e224049a2518-host\") pod \"node-ca-d7hcj\" (UID: \"dcda05ad-f9cc-4648-90c6-e224049a2518\") " pod="openshift-image-registry/node-ca-d7hcj" Apr 20 07:02:47.748797 ip-10-0-130-105 
kubenswrapper[2543]: I0420 07:02:47.747028 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3b7c7e9a-ffad-4d5e-a28c-965555ef617c-cnibin\") pod \"multus-additional-cni-plugins-nfcgq\" (UID: \"3b7c7e9a-ffad-4d5e-a28c-965555ef617c\") " pod="openshift-multus/multus-additional-cni-plugins-nfcgq" Apr 20 07:02:47.748797 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.747052 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-etc-sysctl-conf\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.748797 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.747090 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-var-lib-openvswitch\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.748797 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.747139 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-var-lib-openvswitch\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.748797 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.747166 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-run-ovn\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.748797 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.747208 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-env-overrides\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.748797 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.747234 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-ovnkube-script-lib\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.748797 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.747240 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-etc-sysctl-conf\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.748797 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.747261 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-run-ovn\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.748797 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.747300 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-var-lib-kubelet\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " 
pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.748797 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.747344 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6l8dd\" (UniqueName: \"kubernetes.io/projected/ddbe431f-4738-4893-8014-96e9ed8fedd5-kube-api-access-6l8dd\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.748797 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.747399 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3e8e640-f571-4cab-a03a-d7cf7eead7e5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tpz97\" (UID: \"d3e8e640-f571-4cab-a03a-d7cf7eead7e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97" Apr 20 07:02:47.748797 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.747450 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7kxdf\" (UniqueName: \"kubernetes.io/projected/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-kube-api-access-7kxdf\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk" Apr 20 07:02:47.749590 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.747478 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d3e8e640-f571-4cab-a03a-d7cf7eead7e5-socket-dir\") pod \"aws-ebs-csi-driver-node-tpz97\" (UID: \"d3e8e640-f571-4cab-a03a-d7cf7eead7e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97" Apr 20 07:02:47.749590 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.747539 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/3b7c7e9a-ffad-4d5e-a28c-965555ef617c-os-release\") pod \"multus-additional-cni-plugins-nfcgq\" (UID: \"3b7c7e9a-ffad-4d5e-a28c-965555ef617c\") " pod="openshift-multus/multus-additional-cni-plugins-nfcgq" Apr 20 07:02:47.749590 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.747597 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-system-cni-dir\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk" Apr 20 07:02:47.749590 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.747625 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-host-var-lib-kubelet\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk" Apr 20 07:02:47.749590 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.747713 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-run-openvswitch\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.749590 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.747744 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9b6dda66-f1f4-4fee-8eb5-1f6447d2d431-agent-certs\") pod \"konnectivity-agent-cqsc9\" (UID: \"9b6dda66-f1f4-4fee-8eb5-1f6447d2d431\") " pod="kube-system/konnectivity-agent-cqsc9" Apr 20 07:02:47.749590 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.747774 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-etc-sysconfig\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.749590 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.747779 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-ovnkube-script-lib\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.749590 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.747802 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-host-slash\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.749590 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.747829 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-log-socket\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.749590 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.747873 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-system-cni-dir\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk" Apr 20 07:02:47.749590 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.747918 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-host-cni-netd\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.749590 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.747950 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-etc-modprobe-d\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.749590 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.748040 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-host-slash\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.749590 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.748073 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-etc-modprobe-d\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.749590 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.748081 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-run-openvswitch\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.749590 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.748088 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-log-socket\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.749590 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.748112 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-host-var-lib-kubelet\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk" Apr 20 07:02:47.750432 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.748130 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-env-overrides\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.750432 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.748244 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-var-lib-kubelet\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.750432 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.748247 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-host-cni-netd\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.750432 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.748259 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/ddbe431f-4738-4893-8014-96e9ed8fedd5-etc-sysconfig\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.750432 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.749884 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ddbe431f-4738-4893-8014-96e9ed8fedd5-tmp\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.750432 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.750313 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-ovn-node-metrics-cert\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.750730 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.750567 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/ddbe431f-4738-4893-8014-96e9ed8fedd5-etc-tuned\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.751489 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.751465 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9b6dda66-f1f4-4fee-8eb5-1f6447d2d431-agent-certs\") pod \"konnectivity-agent-cqsc9\" (UID: \"9b6dda66-f1f4-4fee-8eb5-1f6447d2d431\") " pod="kube-system/konnectivity-agent-cqsc9" Apr 20 07:02:47.756665 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.756624 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqdbs\" (UniqueName: 
\"kubernetes.io/projected/13fb38e8-b7f0-4b6a-94e1-f8426903d19f-kube-api-access-hqdbs\") pod \"ovnkube-node-5bfnd\" (UID: \"13fb38e8-b7f0-4b6a-94e1-f8426903d19f\") " pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:02:47.759691 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.759660 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kxdf\" (UniqueName: \"kubernetes.io/projected/a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf-kube-api-access-7kxdf\") pod \"multus-j9xjk\" (UID: \"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf\") " pod="openshift-multus/multus-j9xjk" Apr 20 07:02:47.759935 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.759915 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkqnn\" (UniqueName: \"kubernetes.io/projected/4ffbcc28-a10c-467a-a9e7-e31e20e4975e-kube-api-access-qkqnn\") pod \"network-metrics-daemon-wgd5x\" (UID: \"4ffbcc28-a10c-467a-a9e7-e31e20e4975e\") " pod="openshift-multus/network-metrics-daemon-wgd5x" Apr 20 07:02:47.760220 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.760194 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6l8dd\" (UniqueName: \"kubernetes.io/projected/ddbe431f-4738-4893-8014-96e9ed8fedd5-kube-api-access-6l8dd\") pod \"tuned-6mpqd\" (UID: \"ddbe431f-4738-4893-8014-96e9ed8fedd5\") " pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" Apr 20 07:02:47.760941 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.760902 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6g44p\" (UniqueName: \"kubernetes.io/projected/8328f0e4-59dc-4020-be13-793917ef3ef0-kube-api-access-6g44p\") pod \"iptables-alerter-7zzm4\" (UID: \"8328f0e4-59dc-4020-be13-793917ef3ef0\") " pod="openshift-network-operator/iptables-alerter-7zzm4" Apr 20 07:02:47.848742 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.848711 2543 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d3e8e640-f571-4cab-a03a-d7cf7eead7e5-etc-selinux\") pod \"aws-ebs-csi-driver-node-tpz97\" (UID: \"d3e8e640-f571-4cab-a03a-d7cf7eead7e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97"
Apr 20 07:02:47.848742 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.848745 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9hjmz\" (UniqueName: \"kubernetes.io/projected/3b7c7e9a-ffad-4d5e-a28c-965555ef617c-kube-api-access-9hjmz\") pod \"multus-additional-cni-plugins-nfcgq\" (UID: \"3b7c7e9a-ffad-4d5e-a28c-965555ef617c\") " pod="openshift-multus/multus-additional-cni-plugins-nfcgq"
Apr 20 07:02:47.848956 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.848772 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3b7c7e9a-ffad-4d5e-a28c-965555ef617c-system-cni-dir\") pod \"multus-additional-cni-plugins-nfcgq\" (UID: \"3b7c7e9a-ffad-4d5e-a28c-965555ef617c\") " pod="openshift-multus/multus-additional-cni-plugins-nfcgq"
Apr 20 07:02:47.848956 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.848792 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3b7c7e9a-ffad-4d5e-a28c-965555ef617c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-nfcgq\" (UID: \"3b7c7e9a-ffad-4d5e-a28c-965555ef617c\") " pod="openshift-multus/multus-additional-cni-plugins-nfcgq"
Apr 20 07:02:47.848956 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.848817 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5n5f\" (UniqueName: \"kubernetes.io/projected/dcda05ad-f9cc-4648-90c6-e224049a2518-kube-api-access-m5n5f\") pod \"node-ca-d7hcj\" (UID: \"dcda05ad-f9cc-4648-90c6-e224049a2518\") " pod="openshift-image-registry/node-ca-d7hcj"
Apr 20 07:02:47.848956 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.848851 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dcda05ad-f9cc-4648-90c6-e224049a2518-serviceca\") pod \"node-ca-d7hcj\" (UID: \"dcda05ad-f9cc-4648-90c6-e224049a2518\") " pod="openshift-image-registry/node-ca-d7hcj"
Apr 20 07:02:47.848956 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.848863 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3b7c7e9a-ffad-4d5e-a28c-965555ef617c-system-cni-dir\") pod \"multus-additional-cni-plugins-nfcgq\" (UID: \"3b7c7e9a-ffad-4d5e-a28c-965555ef617c\") " pod="openshift-multus/multus-additional-cni-plugins-nfcgq"
Apr 20 07:02:47.848956 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.848876 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3b7c7e9a-ffad-4d5e-a28c-965555ef617c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nfcgq\" (UID: \"3b7c7e9a-ffad-4d5e-a28c-965555ef617c\") " pod="openshift-multus/multus-additional-cni-plugins-nfcgq"
Apr 20 07:02:47.848956 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.848848 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/d3e8e640-f571-4cab-a03a-d7cf7eead7e5-etc-selinux\") pod \"aws-ebs-csi-driver-node-tpz97\" (UID: \"d3e8e640-f571-4cab-a03a-d7cf7eead7e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97"
Apr 20 07:02:47.848956 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.848905 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d3e8e640-f571-4cab-a03a-d7cf7eead7e5-registration-dir\") pod \"aws-ebs-csi-driver-node-tpz97\" (UID: \"d3e8e640-f571-4cab-a03a-d7cf7eead7e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97"
Apr 20 07:02:47.848956 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.848929 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d3e8e640-f571-4cab-a03a-d7cf7eead7e5-sys-fs\") pod \"aws-ebs-csi-driver-node-tpz97\" (UID: \"d3e8e640-f571-4cab-a03a-d7cf7eead7e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97"
Apr 20 07:02:47.848956 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.848950 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dcda05ad-f9cc-4648-90c6-e224049a2518-host\") pod \"node-ca-d7hcj\" (UID: \"dcda05ad-f9cc-4648-90c6-e224049a2518\") " pod="openshift-image-registry/node-ca-d7hcj"
Apr 20 07:02:47.849406 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.848972 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3b7c7e9a-ffad-4d5e-a28c-965555ef617c-cnibin\") pod \"multus-additional-cni-plugins-nfcgq\" (UID: \"3b7c7e9a-ffad-4d5e-a28c-965555ef617c\") " pod="openshift-multus/multus-additional-cni-plugins-nfcgq"
Apr 20 07:02:47.849406 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.848988 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d3e8e640-f571-4cab-a03a-d7cf7eead7e5-registration-dir\") pod \"aws-ebs-csi-driver-node-tpz97\" (UID: \"d3e8e640-f571-4cab-a03a-d7cf7eead7e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97"
Apr 20 07:02:47.849406 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.849001 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3e8e640-f571-4cab-a03a-d7cf7eead7e5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tpz97\" (UID: \"d3e8e640-f571-4cab-a03a-d7cf7eead7e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97"
Apr 20 07:02:47.849406 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.849052 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d3e8e640-f571-4cab-a03a-d7cf7eead7e5-socket-dir\") pod \"aws-ebs-csi-driver-node-tpz97\" (UID: \"d3e8e640-f571-4cab-a03a-d7cf7eead7e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97"
Apr 20 07:02:47.849406 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.849062 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/d3e8e640-f571-4cab-a03a-d7cf7eead7e5-sys-fs\") pod \"aws-ebs-csi-driver-node-tpz97\" (UID: \"d3e8e640-f571-4cab-a03a-d7cf7eead7e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97"
Apr 20 07:02:47.849406 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.849080 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3b7c7e9a-ffad-4d5e-a28c-965555ef617c-os-release\") pod \"multus-additional-cni-plugins-nfcgq\" (UID: \"3b7c7e9a-ffad-4d5e-a28c-965555ef617c\") " pod="openshift-multus/multus-additional-cni-plugins-nfcgq"
Apr 20 07:02:47.849406 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.849105 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/dcda05ad-f9cc-4648-90c6-e224049a2518-host\") pod \"node-ca-d7hcj\" (UID: \"dcda05ad-f9cc-4648-90c6-e224049a2518\") " pod="openshift-image-registry/node-ca-d7hcj"
Apr 20 07:02:47.849406 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.849128 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3b7c7e9a-ffad-4d5e-a28c-965555ef617c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nfcgq\" (UID: \"3b7c7e9a-ffad-4d5e-a28c-965555ef617c\") " pod="openshift-multus/multus-additional-cni-plugins-nfcgq"
Apr 20 07:02:47.849406 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.849142 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3b7c7e9a-ffad-4d5e-a28c-965555ef617c-cnibin\") pod \"multus-additional-cni-plugins-nfcgq\" (UID: \"3b7c7e9a-ffad-4d5e-a28c-965555ef617c\") " pod="openshift-multus/multus-additional-cni-plugins-nfcgq"
Apr 20 07:02:47.849406 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.849157 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7bw9\" (UniqueName: \"kubernetes.io/projected/cb6de1e4-1d85-4764-a2ec-10a023567592-kube-api-access-b7bw9\") pod \"network-check-target-r9xht\" (UID: \"cb6de1e4-1d85-4764-a2ec-10a023567592\") " pod="openshift-network-diagnostics/network-check-target-r9xht"
Apr 20 07:02:47.849406 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.849188 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d3e8e640-f571-4cab-a03a-d7cf7eead7e5-socket-dir\") pod \"aws-ebs-csi-driver-node-tpz97\" (UID: \"d3e8e640-f571-4cab-a03a-d7cf7eead7e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97"
Apr 20 07:02:47.849406 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.849201 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3b7c7e9a-ffad-4d5e-a28c-965555ef617c-os-release\") pod \"multus-additional-cni-plugins-nfcgq\" (UID: \"3b7c7e9a-ffad-4d5e-a28c-965555ef617c\") " pod="openshift-multus/multus-additional-cni-plugins-nfcgq"
Apr 20 07:02:47.849406 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.849191 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nmf25\" (UniqueName: \"kubernetes.io/projected/d3e8e640-f571-4cab-a03a-d7cf7eead7e5-kube-api-access-nmf25\") pod \"aws-ebs-csi-driver-node-tpz97\" (UID: \"d3e8e640-f571-4cab-a03a-d7cf7eead7e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97"
Apr 20 07:02:47.849406 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.849229 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3b7c7e9a-ffad-4d5e-a28c-965555ef617c-cni-binary-copy\") pod \"multus-additional-cni-plugins-nfcgq\" (UID: \"3b7c7e9a-ffad-4d5e-a28c-965555ef617c\") " pod="openshift-multus/multus-additional-cni-plugins-nfcgq"
Apr 20 07:02:47.849406 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.849259 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d3e8e640-f571-4cab-a03a-d7cf7eead7e5-device-dir\") pod \"aws-ebs-csi-driver-node-tpz97\" (UID: \"d3e8e640-f571-4cab-a03a-d7cf7eead7e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97"
Apr 20 07:02:47.849406 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.849322 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/d3e8e640-f571-4cab-a03a-d7cf7eead7e5-device-dir\") pod \"aws-ebs-csi-driver-node-tpz97\" (UID: \"d3e8e640-f571-4cab-a03a-d7cf7eead7e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97"
Apr 20 07:02:47.849406 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.849323 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3b7c7e9a-ffad-4d5e-a28c-965555ef617c-tuning-conf-dir\") pod \"multus-additional-cni-plugins-nfcgq\" (UID: \"3b7c7e9a-ffad-4d5e-a28c-965555ef617c\") " pod="openshift-multus/multus-additional-cni-plugins-nfcgq"
Apr 20 07:02:47.850040 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.849423 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/3b7c7e9a-ffad-4d5e-a28c-965555ef617c-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-nfcgq\" (UID: \"3b7c7e9a-ffad-4d5e-a28c-965555ef617c\") " pod="openshift-multus/multus-additional-cni-plugins-nfcgq"
Apr 20 07:02:47.850040 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.849423 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3b7c7e9a-ffad-4d5e-a28c-965555ef617c-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-nfcgq\" (UID: \"3b7c7e9a-ffad-4d5e-a28c-965555ef617c\") " pod="openshift-multus/multus-additional-cni-plugins-nfcgq"
Apr 20 07:02:47.850040 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.849470 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d3e8e640-f571-4cab-a03a-d7cf7eead7e5-kubelet-dir\") pod \"aws-ebs-csi-driver-node-tpz97\" (UID: \"d3e8e640-f571-4cab-a03a-d7cf7eead7e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97"
Apr 20 07:02:47.850040 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.849750 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3b7c7e9a-ffad-4d5e-a28c-965555ef617c-cni-binary-copy\") pod \"multus-additional-cni-plugins-nfcgq\" (UID: \"3b7c7e9a-ffad-4d5e-a28c-965555ef617c\") " pod="openshift-multus/multus-additional-cni-plugins-nfcgq"
Apr 20 07:02:47.850040 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.849862 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/dcda05ad-f9cc-4648-90c6-e224049a2518-serviceca\") pod \"node-ca-d7hcj\" (UID: \"dcda05ad-f9cc-4648-90c6-e224049a2518\") " pod="openshift-image-registry/node-ca-d7hcj"
Apr 20 07:02:47.864306 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:47.864251 2543 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 07:02:47.864306 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:47.864273 2543 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 07:02:47.864306 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:47.864284 2543 projected.go:194] Error preparing data for projected volume kube-api-access-b7bw9 for pod openshift-network-diagnostics/network-check-target-r9xht: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 07:02:47.864536 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:47.864331 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb6de1e4-1d85-4764-a2ec-10a023567592-kube-api-access-b7bw9 podName:cb6de1e4-1d85-4764-a2ec-10a023567592 nodeName:}" failed. No retries permitted until 2026-04-20 07:02:48.364318049 +0000 UTC m=+3.184386364 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-b7bw9" (UniqueName: "kubernetes.io/projected/cb6de1e4-1d85-4764-a2ec-10a023567592-kube-api-access-b7bw9") pod "network-check-target-r9xht" (UID: "cb6de1e4-1d85-4764-a2ec-10a023567592") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 07:02:47.868850 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.868825 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hjmz\" (UniqueName: \"kubernetes.io/projected/3b7c7e9a-ffad-4d5e-a28c-965555ef617c-kube-api-access-9hjmz\") pod \"multus-additional-cni-plugins-nfcgq\" (UID: \"3b7c7e9a-ffad-4d5e-a28c-965555ef617c\") " pod="openshift-multus/multus-additional-cni-plugins-nfcgq"
Apr 20 07:02:47.868947 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.868906 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5n5f\" (UniqueName: \"kubernetes.io/projected/dcda05ad-f9cc-4648-90c6-e224049a2518-kube-api-access-m5n5f\") pod \"node-ca-d7hcj\" (UID: \"dcda05ad-f9cc-4648-90c6-e224049a2518\") " pod="openshift-image-registry/node-ca-d7hcj"
Apr 20 07:02:47.869782 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.869763 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmf25\" (UniqueName: \"kubernetes.io/projected/d3e8e640-f571-4cab-a03a-d7cf7eead7e5-kube-api-access-nmf25\") pod \"aws-ebs-csi-driver-node-tpz97\" (UID: \"d3e8e640-f571-4cab-a03a-d7cf7eead7e5\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97"
Apr 20 07:02:47.937760 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.937724 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-cqsc9"
Apr 20 07:02:47.951703 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.951670 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-j9xjk"
Apr 20 07:02:47.960265 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.960244 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-6mpqd"
Apr 20 07:02:47.967787 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.967766 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-7zzm4"
Apr 20 07:02:47.975381 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.975364 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd"
Apr 20 07:02:47.982955 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.982935 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97"
Apr 20 07:02:47.990458 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.990442 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-d7hcj"
Apr 20 07:02:47.994937 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:47.994922 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-nfcgq"
Apr 20 07:02:48.019194 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:48.019172 2543 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 07:02:48.251491 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:48.251404 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ffbcc28-a10c-467a-a9e7-e31e20e4975e-metrics-certs\") pod \"network-metrics-daemon-wgd5x\" (UID: \"4ffbcc28-a10c-467a-a9e7-e31e20e4975e\") " pod="openshift-multus/network-metrics-daemon-wgd5x"
Apr 20 07:02:48.251695 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:48.251521 2543 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 07:02:48.251695 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:48.251607 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ffbcc28-a10c-467a-a9e7-e31e20e4975e-metrics-certs podName:4ffbcc28-a10c-467a-a9e7-e31e20e4975e nodeName:}" failed. No retries permitted until 2026-04-20 07:02:49.251588045 +0000 UTC m=+4.071656365 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4ffbcc28-a10c-467a-a9e7-e31e20e4975e-metrics-certs") pod "network-metrics-daemon-wgd5x" (UID: "4ffbcc28-a10c-467a-a9e7-e31e20e4975e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 07:02:48.329782 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:48.329744 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8328f0e4_59dc_4020_be13_793917ef3ef0.slice/crio-ca767597daf21a7c1936b5bf66ec9312bddbf8f63c2294938e7d83178a86a5e9 WatchSource:0}: Error finding container ca767597daf21a7c1936b5bf66ec9312bddbf8f63c2294938e7d83178a86a5e9: Status 404 returned error can't find the container with id ca767597daf21a7c1936b5bf66ec9312bddbf8f63c2294938e7d83178a86a5e9
Apr 20 07:02:48.330955 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:48.330930 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda03fabd4_3e41_402b_bcc3_7e6cdff4e5cf.slice/crio-46e472bb15518e03c16b13c0452d3ed0a5ec240594e74f136b01f18bb85d1ef2 WatchSource:0}: Error finding container 46e472bb15518e03c16b13c0452d3ed0a5ec240594e74f136b01f18bb85d1ef2: Status 404 returned error can't find the container with id 46e472bb15518e03c16b13c0452d3ed0a5ec240594e74f136b01f18bb85d1ef2
Apr 20 07:02:48.331825 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:48.331799 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13fb38e8_b7f0_4b6a_94e1_f8426903d19f.slice/crio-38f9bf4ccca86ed0adb90ba6a121cd0f2b365535948a65fe2d8767e5dc8da0f3 WatchSource:0}: Error finding container 38f9bf4ccca86ed0adb90ba6a121cd0f2b365535948a65fe2d8767e5dc8da0f3: Status 404 returned error can't find the container with id 38f9bf4ccca86ed0adb90ba6a121cd0f2b365535948a65fe2d8767e5dc8da0f3
Apr 20 07:02:48.334976 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:48.334942 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcda05ad_f9cc_4648_90c6_e224049a2518.slice/crio-33bbb2bb24f9e3e59a3050cc9446132cfeaa62328f6390bb6332615ee9b59139 WatchSource:0}: Error finding container 33bbb2bb24f9e3e59a3050cc9446132cfeaa62328f6390bb6332615ee9b59139: Status 404 returned error can't find the container with id 33bbb2bb24f9e3e59a3050cc9446132cfeaa62328f6390bb6332615ee9b59139
Apr 20 07:02:48.336477 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:48.336456 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b7c7e9a_ffad_4d5e_a28c_965555ef617c.slice/crio-e57666bfc21980f1cf393b5d10f92e02ba5b83c75753a4d3dc82d05f36b59721 WatchSource:0}: Error finding container e57666bfc21980f1cf393b5d10f92e02ba5b83c75753a4d3dc82d05f36b59721: Status 404 returned error can't find the container with id e57666bfc21980f1cf393b5d10f92e02ba5b83c75753a4d3dc82d05f36b59721
Apr 20 07:02:48.337116 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:48.337093 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podddbe431f_4738_4893_8014_96e9ed8fedd5.slice/crio-05e8be32218578fc548a68e7edbf47321a42c50f4dcb69db85f82b917743ec8c WatchSource:0}: Error finding container 05e8be32218578fc548a68e7edbf47321a42c50f4dcb69db85f82b917743ec8c: Status 404 returned error can't find the container with id 05e8be32218578fc548a68e7edbf47321a42c50f4dcb69db85f82b917743ec8c
Apr 20 07:02:48.338210 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:48.337536 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b6dda66_f1f4_4fee_8eb5_1f6447d2d431.slice/crio-10095e57e59cb072db2f8ea035af26894a1367e08668c0bd2ae610aa62c46c5a WatchSource:0}: Error finding container 10095e57e59cb072db2f8ea035af26894a1367e08668c0bd2ae610aa62c46c5a: Status 404 returned error can't find the container with id 10095e57e59cb072db2f8ea035af26894a1367e08668c0bd2ae610aa62c46c5a
Apr 20 07:02:48.357840 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:02:48.357815 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3e8e640_f571_4cab_a03a_d7cf7eead7e5.slice/crio-fdcb6741e82c5ece8d3caa05a92bfcd8c2e10f64d04a5da1f4b3660f493c4f22 WatchSource:0}: Error finding container fdcb6741e82c5ece8d3caa05a92bfcd8c2e10f64d04a5da1f4b3660f493c4f22: Status 404 returned error can't find the container with id fdcb6741e82c5ece8d3caa05a92bfcd8c2e10f64d04a5da1f4b3660f493c4f22
Apr 20 07:02:48.453090 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:48.453064 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7bw9\" (UniqueName: \"kubernetes.io/projected/cb6de1e4-1d85-4764-a2ec-10a023567592-kube-api-access-b7bw9\") pod \"network-check-target-r9xht\" (UID: \"cb6de1e4-1d85-4764-a2ec-10a023567592\") " pod="openshift-network-diagnostics/network-check-target-r9xht"
Apr 20 07:02:48.453190 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:48.453178 2543 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 07:02:48.453190 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:48.453190 2543 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 07:02:48.453265 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:48.453199 2543 projected.go:194] Error preparing data for projected volume kube-api-access-b7bw9 for pod openshift-network-diagnostics/network-check-target-r9xht: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 07:02:48.453265 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:48.453240 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb6de1e4-1d85-4764-a2ec-10a023567592-kube-api-access-b7bw9 podName:cb6de1e4-1d85-4764-a2ec-10a023567592 nodeName:}" failed. No retries permitted until 2026-04-20 07:02:49.453227101 +0000 UTC m=+4.273295429 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-b7bw9" (UniqueName: "kubernetes.io/projected/cb6de1e4-1d85-4764-a2ec-10a023567592-kube-api-access-b7bw9") pod "network-check-target-r9xht" (UID: "cb6de1e4-1d85-4764-a2ec-10a023567592") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 07:02:48.574838 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:48.573454 2543 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 20 07:02:48.684336 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:48.684293 2543 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-19 06:57:46 +0000 UTC" deadline="2027-11-01 05:41:13.416253167 +0000 UTC"
Apr 20 07:02:48.684336 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:48.684332 2543 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13438h38m24.731926129s"
Apr 20 07:02:48.706051 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:48.706013 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d7hcj" event={"ID":"dcda05ad-f9cc-4648-90c6-e224049a2518","Type":"ContainerStarted","Data":"33bbb2bb24f9e3e59a3050cc9446132cfeaa62328f6390bb6332615ee9b59139"}
Apr 20 07:02:48.709144 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:48.709097 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" event={"ID":"13fb38e8-b7f0-4b6a-94e1-f8426903d19f","Type":"ContainerStarted","Data":"38f9bf4ccca86ed0adb90ba6a121cd0f2b365535948a65fe2d8767e5dc8da0f3"}
Apr 20 07:02:48.713268 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:48.713113 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-105.ec2.internal" event={"ID":"486c974e6666e028518e848e7f47b828","Type":"ContainerStarted","Data":"5948d2762734b330d156765c37afea551709ad2d2b0b80a200ae4392d1e59acf"}
Apr 20 07:02:48.714926 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:48.714900 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfcgq" event={"ID":"3b7c7e9a-ffad-4d5e-a28c-965555ef617c","Type":"ContainerStarted","Data":"e57666bfc21980f1cf393b5d10f92e02ba5b83c75753a4d3dc82d05f36b59721"}
Apr 20 07:02:48.716246 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:48.716212 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-cqsc9" event={"ID":"9b6dda66-f1f4-4fee-8eb5-1f6447d2d431","Type":"ContainerStarted","Data":"10095e57e59cb072db2f8ea035af26894a1367e08668c0bd2ae610aa62c46c5a"}
Apr 20 07:02:48.718168 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:48.718138 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" event={"ID":"ddbe431f-4738-4893-8014-96e9ed8fedd5","Type":"ContainerStarted","Data":"05e8be32218578fc548a68e7edbf47321a42c50f4dcb69db85f82b917743ec8c"}
Apr 20 07:02:48.720216 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:48.720181 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-j9xjk" event={"ID":"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf","Type":"ContainerStarted","Data":"46e472bb15518e03c16b13c0452d3ed0a5ec240594e74f136b01f18bb85d1ef2"}
Apr 20 07:02:48.721935 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:48.721905 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7zzm4" event={"ID":"8328f0e4-59dc-4020-be13-793917ef3ef0","Type":"ContainerStarted","Data":"ca767597daf21a7c1936b5bf66ec9312bddbf8f63c2294938e7d83178a86a5e9"}
Apr 20 07:02:48.724162 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:48.724138 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97" event={"ID":"d3e8e640-f571-4cab-a03a-d7cf7eead7e5","Type":"ContainerStarted","Data":"fdcb6741e82c5ece8d3caa05a92bfcd8c2e10f64d04a5da1f4b3660f493c4f22"}
Apr 20 07:02:49.260104 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:49.259989 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ffbcc28-a10c-467a-a9e7-e31e20e4975e-metrics-certs\") pod \"network-metrics-daemon-wgd5x\" (UID: \"4ffbcc28-a10c-467a-a9e7-e31e20e4975e\") " pod="openshift-multus/network-metrics-daemon-wgd5x"
Apr 20 07:02:49.260104 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:49.260098 2543 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 07:02:49.260318 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:49.260164 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ffbcc28-a10c-467a-a9e7-e31e20e4975e-metrics-certs podName:4ffbcc28-a10c-467a-a9e7-e31e20e4975e nodeName:}" failed. No retries permitted until 2026-04-20 07:02:51.260145887 +0000 UTC m=+6.080214217 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4ffbcc28-a10c-467a-a9e7-e31e20e4975e-metrics-certs") pod "network-metrics-daemon-wgd5x" (UID: "4ffbcc28-a10c-467a-a9e7-e31e20e4975e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 07:02:49.461970 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:49.461894 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7bw9\" (UniqueName: \"kubernetes.io/projected/cb6de1e4-1d85-4764-a2ec-10a023567592-kube-api-access-b7bw9\") pod \"network-check-target-r9xht\" (UID: \"cb6de1e4-1d85-4764-a2ec-10a023567592\") " pod="openshift-network-diagnostics/network-check-target-r9xht"
Apr 20 07:02:49.462154 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:49.462105 2543 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 07:02:49.462154 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:49.462126 2543 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 07:02:49.462154 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:49.462139 2543 projected.go:194] Error preparing data for projected volume kube-api-access-b7bw9 for pod openshift-network-diagnostics/network-check-target-r9xht: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 07:02:49.462374 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:49.462197 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb6de1e4-1d85-4764-a2ec-10a023567592-kube-api-access-b7bw9 podName:cb6de1e4-1d85-4764-a2ec-10a023567592 nodeName:}" failed. No retries permitted until 2026-04-20 07:02:51.462179171 +0000 UTC m=+6.282247505 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-b7bw9" (UniqueName: "kubernetes.io/projected/cb6de1e4-1d85-4764-a2ec-10a023567592-kube-api-access-b7bw9") pod "network-check-target-r9xht" (UID: "cb6de1e4-1d85-4764-a2ec-10a023567592") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 07:02:49.699278 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:49.699203 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgd5x"
Apr 20 07:02:49.699792 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:49.699344 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r9xht"
Apr 20 07:02:49.699792 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:49.699453 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r9xht" podUID="cb6de1e4-1d85-4764-a2ec-10a023567592"
Apr 20 07:02:49.699792 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:49.699343 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgd5x" podUID="4ffbcc28-a10c-467a-a9e7-e31e20e4975e"
Apr 20 07:02:49.736747 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:49.735929 2543 generic.go:358] "Generic (PLEG): container finished" podID="9830ed1fbf14c281a00b15700af487ba" containerID="1bcc536103878ae06cbf79f02fdd8e90f1cc05392657acd9b742a545d6138cc7" exitCode=0
Apr 20 07:02:49.736747 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:49.736696 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-105.ec2.internal" event={"ID":"9830ed1fbf14c281a00b15700af487ba","Type":"ContainerDied","Data":"1bcc536103878ae06cbf79f02fdd8e90f1cc05392657acd9b742a545d6138cc7"}
Apr 20 07:02:49.751764 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:49.751714 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-130-105.ec2.internal" podStartSLOduration=3.751695588 podStartE2EDuration="3.751695588s" podCreationTimestamp="2026-04-20 07:02:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:02:48.72793762 +0000 UTC m=+3.548005958" watchObservedRunningTime="2026-04-20 07:02:49.751695588 +0000 UTC m=+4.571763906"
Apr 20 07:02:50.743690 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:50.742975 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-105.ec2.internal" event={"ID":"9830ed1fbf14c281a00b15700af487ba","Type":"ContainerStarted","Data":"4c2fe7ef48939642e21d5374470facb2bf8b0d4a235b6e66275430bfee246044"}
Apr 20 07:02:51.276123 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:51.276020 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ffbcc28-a10c-467a-a9e7-e31e20e4975e-metrics-certs\") pod \"network-metrics-daemon-wgd5x\" (UID: \"4ffbcc28-a10c-467a-a9e7-e31e20e4975e\") " pod="openshift-multus/network-metrics-daemon-wgd5x"
Apr 20 07:02:51.276308 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:51.276183 2543 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 07:02:51.276308 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:51.276294 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ffbcc28-a10c-467a-a9e7-e31e20e4975e-metrics-certs podName:4ffbcc28-a10c-467a-a9e7-e31e20e4975e nodeName:}" failed. No retries permitted until 2026-04-20 07:02:55.276236515 +0000 UTC m=+10.096304844 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4ffbcc28-a10c-467a-a9e7-e31e20e4975e-metrics-certs") pod "network-metrics-daemon-wgd5x" (UID: "4ffbcc28-a10c-467a-a9e7-e31e20e4975e") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 20 07:02:51.477466 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:51.477398 2543 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 20 07:02:51.478395 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:51.478357 2543 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 20 07:02:51.478395 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:51.478401 2543 projected.go:194] Error preparing data for projected volume kube-api-access-b7bw9 for pod openshift-network-diagnostics/network-check-target-r9xht: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 07:02:51.478579 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:51.478484 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb6de1e4-1d85-4764-a2ec-10a023567592-kube-api-access-b7bw9 podName:cb6de1e4-1d85-4764-a2ec-10a023567592 nodeName:}" failed. No retries permitted until 2026-04-20 07:02:55.47846414 +0000 UTC m=+10.298532472 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-b7bw9" (UniqueName: "kubernetes.io/projected/cb6de1e4-1d85-4764-a2ec-10a023567592-kube-api-access-b7bw9") pod "network-check-target-r9xht" (UID: "cb6de1e4-1d85-4764-a2ec-10a023567592") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 20 07:02:51.478935 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:51.477232 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7bw9\" (UniqueName: \"kubernetes.io/projected/cb6de1e4-1d85-4764-a2ec-10a023567592-kube-api-access-b7bw9\") pod \"network-check-target-r9xht\" (UID: \"cb6de1e4-1d85-4764-a2ec-10a023567592\") " pod="openshift-network-diagnostics/network-check-target-r9xht"
Apr 20 07:02:51.701670 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:51.701551 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r9xht"
Apr 20 07:02:51.701670 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:51.701592 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgd5x"
Apr 20 07:02:51.701879 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:51.701702 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r9xht" podUID="cb6de1e4-1d85-4764-a2ec-10a023567592" Apr 20 07:02:51.701879 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:51.701798 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgd5x" podUID="4ffbcc28-a10c-467a-a9e7-e31e20e4975e" Apr 20 07:02:53.696968 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:53.696938 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgd5x" Apr 20 07:02:53.697415 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:53.697046 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgd5x" podUID="4ffbcc28-a10c-467a-a9e7-e31e20e4975e" Apr 20 07:02:53.697415 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:53.697323 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r9xht" Apr 20 07:02:53.697415 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:53.697378 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-r9xht" podUID="cb6de1e4-1d85-4764-a2ec-10a023567592" Apr 20 07:02:55.311495 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:55.311453 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ffbcc28-a10c-467a-a9e7-e31e20e4975e-metrics-certs\") pod \"network-metrics-daemon-wgd5x\" (UID: \"4ffbcc28-a10c-467a-a9e7-e31e20e4975e\") " pod="openshift-multus/network-metrics-daemon-wgd5x" Apr 20 07:02:55.311992 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:55.311605 2543 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:02:55.311992 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:55.311770 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ffbcc28-a10c-467a-a9e7-e31e20e4975e-metrics-certs podName:4ffbcc28-a10c-467a-a9e7-e31e20e4975e nodeName:}" failed. No retries permitted until 2026-04-20 07:03:03.311744803 +0000 UTC m=+18.131813142 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4ffbcc28-a10c-467a-a9e7-e31e20e4975e-metrics-certs") pod "network-metrics-daemon-wgd5x" (UID: "4ffbcc28-a10c-467a-a9e7-e31e20e4975e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:02:55.513286 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:55.513248 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7bw9\" (UniqueName: \"kubernetes.io/projected/cb6de1e4-1d85-4764-a2ec-10a023567592-kube-api-access-b7bw9\") pod \"network-check-target-r9xht\" (UID: \"cb6de1e4-1d85-4764-a2ec-10a023567592\") " pod="openshift-network-diagnostics/network-check-target-r9xht" Apr 20 07:02:55.513463 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:55.513445 2543 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 07:02:55.513520 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:55.513470 2543 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 07:02:55.513520 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:55.513484 2543 projected.go:194] Error preparing data for projected volume kube-api-access-b7bw9 for pod openshift-network-diagnostics/network-check-target-r9xht: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:02:55.513596 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:55.513555 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb6de1e4-1d85-4764-a2ec-10a023567592-kube-api-access-b7bw9 podName:cb6de1e4-1d85-4764-a2ec-10a023567592 nodeName:}" failed. 
No retries permitted until 2026-04-20 07:03:03.513534764 +0000 UTC m=+18.333603103 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-b7bw9" (UniqueName: "kubernetes.io/projected/cb6de1e4-1d85-4764-a2ec-10a023567592-kube-api-access-b7bw9") pod "network-check-target-r9xht" (UID: "cb6de1e4-1d85-4764-a2ec-10a023567592") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:02:55.698624 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:55.697678 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgd5x" Apr 20 07:02:55.698624 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:55.697771 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgd5x" podUID="4ffbcc28-a10c-467a-a9e7-e31e20e4975e" Apr 20 07:02:55.698624 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:55.698466 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r9xht" Apr 20 07:02:55.703609 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:55.701037 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-r9xht" podUID="cb6de1e4-1d85-4764-a2ec-10a023567592" Apr 20 07:02:57.696784 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:57.696752 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r9xht" Apr 20 07:02:57.697222 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:57.696764 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgd5x" Apr 20 07:02:57.697222 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:57.696882 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r9xht" podUID="cb6de1e4-1d85-4764-a2ec-10a023567592" Apr 20 07:02:57.697222 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:57.696988 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wgd5x" podUID="4ffbcc28-a10c-467a-a9e7-e31e20e4975e" Apr 20 07:02:57.965892 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:57.965796 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-130-105.ec2.internal" podStartSLOduration=11.965783185 podStartE2EDuration="11.965783185s" podCreationTimestamp="2026-04-20 07:02:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:02:50.761746301 +0000 UTC m=+5.581814643" watchObservedRunningTime="2026-04-20 07:02:57.965783185 +0000 UTC m=+12.785851522" Apr 20 07:02:57.966288 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:57.966265 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-9v756"] Apr 20 07:02:57.969125 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:57.969103 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v756" Apr 20 07:02:57.969230 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:57.969176 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-9v756" podUID="58183227-95cc-42f1-95e7-dc40dc8bf51d" Apr 20 07:02:58.031025 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:58.030985 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/58183227-95cc-42f1-95e7-dc40dc8bf51d-kubelet-config\") pod \"global-pull-secret-syncer-9v756\" (UID: \"58183227-95cc-42f1-95e7-dc40dc8bf51d\") " pod="kube-system/global-pull-secret-syncer-9v756" Apr 20 07:02:58.031179 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:58.031072 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/58183227-95cc-42f1-95e7-dc40dc8bf51d-original-pull-secret\") pod \"global-pull-secret-syncer-9v756\" (UID: \"58183227-95cc-42f1-95e7-dc40dc8bf51d\") " pod="kube-system/global-pull-secret-syncer-9v756" Apr 20 07:02:58.031179 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:58.031093 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/58183227-95cc-42f1-95e7-dc40dc8bf51d-dbus\") pod \"global-pull-secret-syncer-9v756\" (UID: \"58183227-95cc-42f1-95e7-dc40dc8bf51d\") " pod="kube-system/global-pull-secret-syncer-9v756" Apr 20 07:02:58.132333 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:58.132302 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/58183227-95cc-42f1-95e7-dc40dc8bf51d-kubelet-config\") pod \"global-pull-secret-syncer-9v756\" (UID: \"58183227-95cc-42f1-95e7-dc40dc8bf51d\") " pod="kube-system/global-pull-secret-syncer-9v756" Apr 20 07:02:58.132520 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:58.132361 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" 
(UniqueName: \"kubernetes.io/secret/58183227-95cc-42f1-95e7-dc40dc8bf51d-original-pull-secret\") pod \"global-pull-secret-syncer-9v756\" (UID: \"58183227-95cc-42f1-95e7-dc40dc8bf51d\") " pod="kube-system/global-pull-secret-syncer-9v756" Apr 20 07:02:58.132520 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:58.132389 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/58183227-95cc-42f1-95e7-dc40dc8bf51d-dbus\") pod \"global-pull-secret-syncer-9v756\" (UID: \"58183227-95cc-42f1-95e7-dc40dc8bf51d\") " pod="kube-system/global-pull-secret-syncer-9v756" Apr 20 07:02:58.132520 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:58.132434 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/58183227-95cc-42f1-95e7-dc40dc8bf51d-kubelet-config\") pod \"global-pull-secret-syncer-9v756\" (UID: \"58183227-95cc-42f1-95e7-dc40dc8bf51d\") " pod="kube-system/global-pull-secret-syncer-9v756" Apr 20 07:02:58.132520 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:58.132459 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/58183227-95cc-42f1-95e7-dc40dc8bf51d-dbus\") pod \"global-pull-secret-syncer-9v756\" (UID: \"58183227-95cc-42f1-95e7-dc40dc8bf51d\") " pod="kube-system/global-pull-secret-syncer-9v756" Apr 20 07:02:58.132753 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:58.132517 2543 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 07:02:58.132753 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:58.132583 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58183227-95cc-42f1-95e7-dc40dc8bf51d-original-pull-secret podName:58183227-95cc-42f1-95e7-dc40dc8bf51d nodeName:}" failed. 
No retries permitted until 2026-04-20 07:02:58.632562931 +0000 UTC m=+13.452631266 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/58183227-95cc-42f1-95e7-dc40dc8bf51d-original-pull-secret") pod "global-pull-secret-syncer-9v756" (UID: "58183227-95cc-42f1-95e7-dc40dc8bf51d") : object "kube-system"/"original-pull-secret" not registered Apr 20 07:02:58.635297 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:58.635248 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/58183227-95cc-42f1-95e7-dc40dc8bf51d-original-pull-secret\") pod \"global-pull-secret-syncer-9v756\" (UID: \"58183227-95cc-42f1-95e7-dc40dc8bf51d\") " pod="kube-system/global-pull-secret-syncer-9v756" Apr 20 07:02:58.635447 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:58.635374 2543 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 07:02:58.635447 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:58.635433 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58183227-95cc-42f1-95e7-dc40dc8bf51d-original-pull-secret podName:58183227-95cc-42f1-95e7-dc40dc8bf51d nodeName:}" failed. No retries permitted until 2026-04-20 07:02:59.635420752 +0000 UTC m=+14.455489067 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/58183227-95cc-42f1-95e7-dc40dc8bf51d-original-pull-secret") pod "global-pull-secret-syncer-9v756" (UID: "58183227-95cc-42f1-95e7-dc40dc8bf51d") : object "kube-system"/"original-pull-secret" not registered Apr 20 07:02:59.642622 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:59.642589 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/58183227-95cc-42f1-95e7-dc40dc8bf51d-original-pull-secret\") pod \"global-pull-secret-syncer-9v756\" (UID: \"58183227-95cc-42f1-95e7-dc40dc8bf51d\") " pod="kube-system/global-pull-secret-syncer-9v756" Apr 20 07:02:59.643096 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:59.642718 2543 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 07:02:59.643096 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:59.642774 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58183227-95cc-42f1-95e7-dc40dc8bf51d-original-pull-secret podName:58183227-95cc-42f1-95e7-dc40dc8bf51d nodeName:}" failed. No retries permitted until 2026-04-20 07:03:01.642755503 +0000 UTC m=+16.462823832 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/58183227-95cc-42f1-95e7-dc40dc8bf51d-original-pull-secret") pod "global-pull-secret-syncer-9v756" (UID: "58183227-95cc-42f1-95e7-dc40dc8bf51d") : object "kube-system"/"original-pull-secret" not registered Apr 20 07:02:59.697457 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:59.697425 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v756" Apr 20 07:02:59.697629 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:59.697425 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r9xht" Apr 20 07:02:59.697629 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:59.697546 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9v756" podUID="58183227-95cc-42f1-95e7-dc40dc8bf51d" Apr 20 07:02:59.697767 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:02:59.697427 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgd5x" Apr 20 07:02:59.697767 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:59.697651 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r9xht" podUID="cb6de1e4-1d85-4764-a2ec-10a023567592" Apr 20 07:02:59.697871 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:02:59.697796 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wgd5x" podUID="4ffbcc28-a10c-467a-a9e7-e31e20e4975e" Apr 20 07:03:01.660699 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:01.660463 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/58183227-95cc-42f1-95e7-dc40dc8bf51d-original-pull-secret\") pod \"global-pull-secret-syncer-9v756\" (UID: \"58183227-95cc-42f1-95e7-dc40dc8bf51d\") " pod="kube-system/global-pull-secret-syncer-9v756" Apr 20 07:03:01.661126 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:01.660586 2543 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 07:03:01.661126 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:01.660805 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58183227-95cc-42f1-95e7-dc40dc8bf51d-original-pull-secret podName:58183227-95cc-42f1-95e7-dc40dc8bf51d nodeName:}" failed. No retries permitted until 2026-04-20 07:03:05.660788268 +0000 UTC m=+20.480856600 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/58183227-95cc-42f1-95e7-dc40dc8bf51d-original-pull-secret") pod "global-pull-secret-syncer-9v756" (UID: "58183227-95cc-42f1-95e7-dc40dc8bf51d") : object "kube-system"/"original-pull-secret" not registered Apr 20 07:03:01.697045 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:01.696973 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgd5x" Apr 20 07:03:01.697208 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:01.697110 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgd5x" podUID="4ffbcc28-a10c-467a-a9e7-e31e20e4975e" Apr 20 07:03:01.697208 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:01.696974 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r9xht" Apr 20 07:03:01.697208 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:01.697198 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r9xht" podUID="cb6de1e4-1d85-4764-a2ec-10a023567592" Apr 20 07:03:01.697366 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:01.696972 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v756" Apr 20 07:03:01.697366 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:01.697274 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9v756" podUID="58183227-95cc-42f1-95e7-dc40dc8bf51d" Apr 20 07:03:02.973794 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:02.973766 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-m98bf"] Apr 20 07:03:03.069805 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:03.069768 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-m98bf" Apr 20 07:03:03.073766 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:03.073741 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 20 07:03:03.073766 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:03.073755 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-n8f2t\"" Apr 20 07:03:03.073766 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:03.073766 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 20 07:03:03.172892 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:03.172862 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2cb0ae1e-b4a8-4cf9-ac1d-3d8fa3a4cc47-hosts-file\") pod \"node-resolver-m98bf\" (UID: \"2cb0ae1e-b4a8-4cf9-ac1d-3d8fa3a4cc47\") " pod="openshift-dns/node-resolver-m98bf" Apr 20 07:03:03.173079 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:03.172908 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n589r\" (UniqueName: \"kubernetes.io/projected/2cb0ae1e-b4a8-4cf9-ac1d-3d8fa3a4cc47-kube-api-access-n589r\") pod \"node-resolver-m98bf\" (UID: \"2cb0ae1e-b4a8-4cf9-ac1d-3d8fa3a4cc47\") " pod="openshift-dns/node-resolver-m98bf" Apr 20 07:03:03.173079 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:03.172940 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2cb0ae1e-b4a8-4cf9-ac1d-3d8fa3a4cc47-tmp-dir\") pod \"node-resolver-m98bf\" (UID: \"2cb0ae1e-b4a8-4cf9-ac1d-3d8fa3a4cc47\") " pod="openshift-dns/node-resolver-m98bf" Apr 20 07:03:03.273730 ip-10-0-130-105 kubenswrapper[2543]: I0420 
07:03:03.273664 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2cb0ae1e-b4a8-4cf9-ac1d-3d8fa3a4cc47-hosts-file\") pod \"node-resolver-m98bf\" (UID: \"2cb0ae1e-b4a8-4cf9-ac1d-3d8fa3a4cc47\") " pod="openshift-dns/node-resolver-m98bf" Apr 20 07:03:03.273730 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:03.273698 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n589r\" (UniqueName: \"kubernetes.io/projected/2cb0ae1e-b4a8-4cf9-ac1d-3d8fa3a4cc47-kube-api-access-n589r\") pod \"node-resolver-m98bf\" (UID: \"2cb0ae1e-b4a8-4cf9-ac1d-3d8fa3a4cc47\") " pod="openshift-dns/node-resolver-m98bf" Apr 20 07:03:03.273730 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:03.273722 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2cb0ae1e-b4a8-4cf9-ac1d-3d8fa3a4cc47-tmp-dir\") pod \"node-resolver-m98bf\" (UID: \"2cb0ae1e-b4a8-4cf9-ac1d-3d8fa3a4cc47\") " pod="openshift-dns/node-resolver-m98bf" Apr 20 07:03:03.273938 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:03.273787 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/2cb0ae1e-b4a8-4cf9-ac1d-3d8fa3a4cc47-hosts-file\") pod \"node-resolver-m98bf\" (UID: \"2cb0ae1e-b4a8-4cf9-ac1d-3d8fa3a4cc47\") " pod="openshift-dns/node-resolver-m98bf" Apr 20 07:03:03.273992 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:03.273978 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/2cb0ae1e-b4a8-4cf9-ac1d-3d8fa3a4cc47-tmp-dir\") pod \"node-resolver-m98bf\" (UID: \"2cb0ae1e-b4a8-4cf9-ac1d-3d8fa3a4cc47\") " pod="openshift-dns/node-resolver-m98bf" Apr 20 07:03:03.284828 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:03.284805 2543 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-n589r\" (UniqueName: \"kubernetes.io/projected/2cb0ae1e-b4a8-4cf9-ac1d-3d8fa3a4cc47-kube-api-access-n589r\") pod \"node-resolver-m98bf\" (UID: \"2cb0ae1e-b4a8-4cf9-ac1d-3d8fa3a4cc47\") " pod="openshift-dns/node-resolver-m98bf" Apr 20 07:03:03.374244 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:03.374212 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ffbcc28-a10c-467a-a9e7-e31e20e4975e-metrics-certs\") pod \"network-metrics-daemon-wgd5x\" (UID: \"4ffbcc28-a10c-467a-a9e7-e31e20e4975e\") " pod="openshift-multus/network-metrics-daemon-wgd5x" Apr 20 07:03:03.374410 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:03.374335 2543 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:03:03.374410 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:03.374399 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ffbcc28-a10c-467a-a9e7-e31e20e4975e-metrics-certs podName:4ffbcc28-a10c-467a-a9e7-e31e20e4975e nodeName:}" failed. No retries permitted until 2026-04-20 07:03:19.374380796 +0000 UTC m=+34.194449111 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4ffbcc28-a10c-467a-a9e7-e31e20e4975e-metrics-certs") pod "network-metrics-daemon-wgd5x" (UID: "4ffbcc28-a10c-467a-a9e7-e31e20e4975e") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 20 07:03:03.379207 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:03.379186 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-m98bf" Apr 20 07:03:03.576428 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:03.576357 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7bw9\" (UniqueName: \"kubernetes.io/projected/cb6de1e4-1d85-4764-a2ec-10a023567592-kube-api-access-b7bw9\") pod \"network-check-target-r9xht\" (UID: \"cb6de1e4-1d85-4764-a2ec-10a023567592\") " pod="openshift-network-diagnostics/network-check-target-r9xht" Apr 20 07:03:03.576591 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:03.576526 2543 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 20 07:03:03.576591 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:03.576547 2543 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 20 07:03:03.576591 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:03.576561 2543 projected.go:194] Error preparing data for projected volume kube-api-access-b7bw9 for pod openshift-network-diagnostics/network-check-target-r9xht: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:03:03.576765 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:03.576629 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cb6de1e4-1d85-4764-a2ec-10a023567592-kube-api-access-b7bw9 podName:cb6de1e4-1d85-4764-a2ec-10a023567592 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:19.576607999 +0000 UTC m=+34.396676331 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-b7bw9" (UniqueName: "kubernetes.io/projected/cb6de1e4-1d85-4764-a2ec-10a023567592-kube-api-access-b7bw9") pod "network-check-target-r9xht" (UID: "cb6de1e4-1d85-4764-a2ec-10a023567592") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 20 07:03:03.697409 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:03.697374 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgd5x" Apr 20 07:03:03.697604 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:03.697502 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v756" Apr 20 07:03:03.697604 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:03.697508 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgd5x" podUID="4ffbcc28-a10c-467a-a9e7-e31e20e4975e" Apr 20 07:03:03.697753 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:03.697665 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r9xht" Apr 20 07:03:03.697753 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:03.697657 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-9v756" podUID="58183227-95cc-42f1-95e7-dc40dc8bf51d" Apr 20 07:03:03.697841 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:03.697773 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r9xht" podUID="cb6de1e4-1d85-4764-a2ec-10a023567592" Apr 20 07:03:05.692022 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:05.691836 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/58183227-95cc-42f1-95e7-dc40dc8bf51d-original-pull-secret\") pod \"global-pull-secret-syncer-9v756\" (UID: \"58183227-95cc-42f1-95e7-dc40dc8bf51d\") " pod="kube-system/global-pull-secret-syncer-9v756" Apr 20 07:03:05.692607 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:05.691969 2543 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 20 07:03:05.692607 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:05.692100 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58183227-95cc-42f1-95e7-dc40dc8bf51d-original-pull-secret podName:58183227-95cc-42f1-95e7-dc40dc8bf51d nodeName:}" failed. No retries permitted until 2026-04-20 07:03:13.69208296 +0000 UTC m=+28.512151274 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/58183227-95cc-42f1-95e7-dc40dc8bf51d-original-pull-secret") pod "global-pull-secret-syncer-9v756" (UID: "58183227-95cc-42f1-95e7-dc40dc8bf51d") : object "kube-system"/"original-pull-secret" not registered Apr 20 07:03:05.697669 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:05.697631 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v756" Apr 20 07:03:05.697787 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:05.697741 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r9xht" Apr 20 07:03:05.697858 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:05.697745 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9v756" podUID="58183227-95cc-42f1-95e7-dc40dc8bf51d" Apr 20 07:03:05.697988 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:05.697858 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r9xht" podUID="cb6de1e4-1d85-4764-a2ec-10a023567592" Apr 20 07:03:05.697988 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:05.697876 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgd5x" Apr 20 07:03:05.698071 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:05.698010 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgd5x" podUID="4ffbcc28-a10c-467a-a9e7-e31e20e4975e" Apr 20 07:03:05.767186 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:05.767153 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" event={"ID":"ddbe431f-4738-4893-8014-96e9ed8fedd5","Type":"ContainerStarted","Data":"a873df509670aa18203f59c72fd212647ace3c92cf121926a09070442fb79f0f"} Apr 20 07:03:05.768605 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:05.768571 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-j9xjk" event={"ID":"a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf","Type":"ContainerStarted","Data":"cbb67b10ff9677f24b054cf6daeb343396cef9fcd1fe0787807f09e237ed1197"} Apr 20 07:03:05.770148 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:05.770124 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-m98bf" event={"ID":"2cb0ae1e-b4a8-4cf9-ac1d-3d8fa3a4cc47","Type":"ContainerStarted","Data":"d36bb801711d2bcbb8d3cea86d346674984ba7b6d1aa90b46bef49ab2e80add8"} Apr 20 07:03:05.770232 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:05.770160 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-m98bf" event={"ID":"2cb0ae1e-b4a8-4cf9-ac1d-3d8fa3a4cc47","Type":"ContainerStarted","Data":"04c69f9a0afa61d28201ef5a2a808e944d78807b2e1e2e7cb6626fae2ec6e024"} Apr 20 07:03:05.771735 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:05.771711 2543 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97" event={"ID":"d3e8e640-f571-4cab-a03a-d7cf7eead7e5","Type":"ContainerStarted","Data":"fe1e36b1be40e56eae209bac9504cbb552da1086551bdd327082502cd2bae5d8"} Apr 20 07:03:05.773053 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:05.773031 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-d7hcj" event={"ID":"dcda05ad-f9cc-4648-90c6-e224049a2518","Type":"ContainerStarted","Data":"a36e89e4d51efb38051ffeb435968499b2ae925638ce956ab86f95a285e3d439"} Apr 20 07:03:05.774735 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:05.774716 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/ovn-acl-logging/0.log" Apr 20 07:03:05.775056 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:05.775036 2543 generic.go:358] "Generic (PLEG): container finished" podID="13fb38e8-b7f0-4b6a-94e1-f8426903d19f" containerID="4f3cefbee33592ad676e113f7f7e73e9578d2c699e0486dd411c7697cd48af6d" exitCode=1 Apr 20 07:03:05.775116 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:05.775095 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" event={"ID":"13fb38e8-b7f0-4b6a-94e1-f8426903d19f","Type":"ContainerStarted","Data":"b3eea6d4946f627715a385d6ad96192e9024a3f5e27341da64d74b9da1bede76"} Apr 20 07:03:05.775155 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:05.775115 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" event={"ID":"13fb38e8-b7f0-4b6a-94e1-f8426903d19f","Type":"ContainerDied","Data":"4f3cefbee33592ad676e113f7f7e73e9578d2c699e0486dd411c7697cd48af6d"} Apr 20 07:03:05.775155 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:05.775130 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" 
event={"ID":"13fb38e8-b7f0-4b6a-94e1-f8426903d19f","Type":"ContainerStarted","Data":"ed493b372a6a9dd3170a73495f21ef508451b2dda64b4fc5b0c959a01926ab14"} Apr 20 07:03:05.776320 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:05.776293 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfcgq" event={"ID":"3b7c7e9a-ffad-4d5e-a28c-965555ef617c","Type":"ContainerStarted","Data":"9d9c7206260445892e440c85aeaec1c008b1dd6557a64dac678807f716bd0167"} Apr 20 07:03:05.777406 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:05.777388 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-cqsc9" event={"ID":"9b6dda66-f1f4-4fee-8eb5-1f6447d2d431","Type":"ContainerStarted","Data":"c96b3c56b521577b5a5e83bc8cf36fe567dce0d0879ec4b3e5732a9633f33017"} Apr 20 07:03:05.824913 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:05.824867 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-6mpqd" podStartSLOduration=4.057622299 podStartE2EDuration="20.824852813s" podCreationTimestamp="2026-04-20 07:02:45 +0000 UTC" firstStartedPulling="2026-04-20 07:02:48.356822979 +0000 UTC m=+3.176891307" lastFinishedPulling="2026-04-20 07:03:05.124053501 +0000 UTC m=+19.944121821" observedRunningTime="2026-04-20 07:03:05.795273245 +0000 UTC m=+20.615341582" watchObservedRunningTime="2026-04-20 07:03:05.824852813 +0000 UTC m=+20.644921129" Apr 20 07:03:05.845141 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:05.845096 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-d7hcj" podStartSLOduration=8.747602189 podStartE2EDuration="20.845080976s" podCreationTimestamp="2026-04-20 07:02:45 +0000 UTC" firstStartedPulling="2026-04-20 07:02:48.336253644 +0000 UTC m=+3.156321964" lastFinishedPulling="2026-04-20 07:03:00.43373242 +0000 UTC m=+15.253800751" observedRunningTime="2026-04-20 
07:03:05.844570889 +0000 UTC m=+20.664639226" watchObservedRunningTime="2026-04-20 07:03:05.845080976 +0000 UTC m=+20.665149355" Apr 20 07:03:05.867193 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:05.867148 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-m98bf" podStartSLOduration=3.867135622 podStartE2EDuration="3.867135622s" podCreationTimestamp="2026-04-20 07:03:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:03:05.866909627 +0000 UTC m=+20.686977972" watchObservedRunningTime="2026-04-20 07:03:05.867135622 +0000 UTC m=+20.687203958" Apr 20 07:03:05.896765 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:05.895800 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-cqsc9" podStartSLOduration=4.498394157 podStartE2EDuration="20.895782933s" podCreationTimestamp="2026-04-20 07:02:45 +0000 UTC" firstStartedPulling="2026-04-20 07:02:48.357000556 +0000 UTC m=+3.177068870" lastFinishedPulling="2026-04-20 07:03:04.754389331 +0000 UTC m=+19.574457646" observedRunningTime="2026-04-20 07:03:05.895115182 +0000 UTC m=+20.715183535" watchObservedRunningTime="2026-04-20 07:03:05.895782933 +0000 UTC m=+20.715851272" Apr 20 07:03:05.918088 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:05.918053 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-j9xjk" podStartSLOduration=4.105402357 podStartE2EDuration="20.918042308s" podCreationTimestamp="2026-04-20 07:02:45 +0000 UTC" firstStartedPulling="2026-04-20 07:02:48.333425393 +0000 UTC m=+3.153493710" lastFinishedPulling="2026-04-20 07:03:05.146065346 +0000 UTC m=+19.966133661" observedRunningTime="2026-04-20 07:03:05.917503919 +0000 UTC m=+20.737572256" watchObservedRunningTime="2026-04-20 07:03:05.918042308 +0000 UTC m=+20.738110644" Apr 20 07:03:06.591212 
ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:06.591188 2543 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 20 07:03:06.707966 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:06.707811 2543 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-20T07:03:06.591203562Z","UUID":"c3d3552c-6911-473e-a9c5-1f979da2ccd9","Handler":null,"Name":"","Endpoint":""} Apr 20 07:03:06.710432 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:06.710405 2543 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 20 07:03:06.710432 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:06.710433 2543 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 20 07:03:06.780066 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:06.780033 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-7zzm4" event={"ID":"8328f0e4-59dc-4020-be13-793917ef3ef0","Type":"ContainerStarted","Data":"d1ffa7698a5d19adbc1081f11f69a75f211e7a12b0789268c0c19064243d4672"} Apr 20 07:03:06.781597 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:06.781574 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97" event={"ID":"d3e8e640-f571-4cab-a03a-d7cf7eead7e5","Type":"ContainerStarted","Data":"3ff0aeb68e209040f99049d6758edd04d63591c356add35b56d4edc3d2c20eb3"} Apr 20 07:03:06.783628 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:06.783613 2543 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/ovn-acl-logging/0.log" Apr 20 07:03:06.783952 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:06.783928 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" event={"ID":"13fb38e8-b7f0-4b6a-94e1-f8426903d19f","Type":"ContainerStarted","Data":"4e84650b51d6141f0b78e919cea526cbfc0a5408a9d707e3c98c96daf27ad027"} Apr 20 07:03:06.783952 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:06.783951 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" event={"ID":"13fb38e8-b7f0-4b6a-94e1-f8426903d19f","Type":"ContainerStarted","Data":"a03906c419567e2ed91e7f376bb3fb0e2446dd49eedf887589c2479e6c6e17e4"} Apr 20 07:03:06.784069 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:06.783964 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" event={"ID":"13fb38e8-b7f0-4b6a-94e1-f8426903d19f","Type":"ContainerStarted","Data":"3a6210e4685b0e90cff462de89c904c6815c63aca2f1dba7f80ebde107a4c46c"} Apr 20 07:03:06.785073 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:06.785054 2543 generic.go:358] "Generic (PLEG): container finished" podID="3b7c7e9a-ffad-4d5e-a28c-965555ef617c" containerID="9d9c7206260445892e440c85aeaec1c008b1dd6557a64dac678807f716bd0167" exitCode=0 Apr 20 07:03:06.785170 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:06.785150 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfcgq" event={"ID":"3b7c7e9a-ffad-4d5e-a28c-965555ef617c","Type":"ContainerDied","Data":"9d9c7206260445892e440c85aeaec1c008b1dd6557a64dac678807f716bd0167"} Apr 20 07:03:06.803883 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:06.803842 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-7zzm4" 
podStartSLOduration=5.058104677 podStartE2EDuration="21.803830448s" podCreationTimestamp="2026-04-20 07:02:45 +0000 UTC" firstStartedPulling="2026-04-20 07:02:48.332042702 +0000 UTC m=+3.152111031" lastFinishedPulling="2026-04-20 07:03:05.077768487 +0000 UTC m=+19.897836802" observedRunningTime="2026-04-20 07:03:06.803750261 +0000 UTC m=+21.623818598" watchObservedRunningTime="2026-04-20 07:03:06.803830448 +0000 UTC m=+21.623898781" Apr 20 07:03:07.697711 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:07.697689 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgd5x" Apr 20 07:03:07.697854 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:07.697689 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r9xht" Apr 20 07:03:07.697854 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:07.697815 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgd5x" podUID="4ffbcc28-a10c-467a-a9e7-e31e20e4975e" Apr 20 07:03:07.697969 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:07.697880 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r9xht" podUID="cb6de1e4-1d85-4764-a2ec-10a023567592" Apr 20 07:03:07.697969 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:07.697689 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v756" Apr 20 07:03:07.698066 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:07.697977 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9v756" podUID="58183227-95cc-42f1-95e7-dc40dc8bf51d" Apr 20 07:03:08.791080 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:08.791039 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97" event={"ID":"d3e8e640-f571-4cab-a03a-d7cf7eead7e5","Type":"ContainerStarted","Data":"33e4e0d2d76eaaa93cfbd1cf7680bdc3debf6036c1551cb1ef9dc92577ffe578"} Apr 20 07:03:08.793951 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:08.793919 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/ovn-acl-logging/0.log" Apr 20 07:03:08.794353 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:08.794322 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" event={"ID":"13fb38e8-b7f0-4b6a-94e1-f8426903d19f","Type":"ContainerStarted","Data":"5833f5c74e5443ae37b4b6732873175e3b5fd3041ad5eac9e2b2a666074f4a33"} Apr 20 07:03:08.821194 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:08.821154 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-tpz97" podStartSLOduration=4.496715512 podStartE2EDuration="23.821139884s" podCreationTimestamp="2026-04-20 07:02:45 +0000 UTC" firstStartedPulling="2026-04-20 07:02:48.360009746 +0000 UTC m=+3.180078067" lastFinishedPulling="2026-04-20 07:03:07.684434109 +0000 UTC m=+22.504502439" 
observedRunningTime="2026-04-20 07:03:08.820665577 +0000 UTC m=+23.640733916" watchObservedRunningTime="2026-04-20 07:03:08.821139884 +0000 UTC m=+23.641208222" Apr 20 07:03:09.559051 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:09.559011 2543 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-cqsc9" Apr 20 07:03:09.559907 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:09.559880 2543 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-cqsc9" Apr 20 07:03:09.697298 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:09.697226 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r9xht" Apr 20 07:03:09.697298 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:09.697248 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgd5x" Apr 20 07:03:09.697532 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:09.697226 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v756" Apr 20 07:03:09.697532 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:09.697354 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-r9xht" podUID="cb6de1e4-1d85-4764-a2ec-10a023567592" Apr 20 07:03:09.697532 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:09.697458 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9v756" podUID="58183227-95cc-42f1-95e7-dc40dc8bf51d" Apr 20 07:03:09.697689 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:09.697539 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgd5x" podUID="4ffbcc28-a10c-467a-a9e7-e31e20e4975e" Apr 20 07:03:09.796548 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:09.796498 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-cqsc9" Apr 20 07:03:09.797026 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:09.796840 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-cqsc9" Apr 20 07:03:11.697585 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:11.697395 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgd5x" Apr 20 07:03:11.697954 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:11.697428 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r9xht" Apr 20 07:03:11.697954 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:11.697790 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r9xht" podUID="cb6de1e4-1d85-4764-a2ec-10a023567592" Apr 20 07:03:11.697954 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:11.697453 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v756" Apr 20 07:03:11.697954 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:11.697685 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgd5x" podUID="4ffbcc28-a10c-467a-a9e7-e31e20e4975e" Apr 20 07:03:11.697954 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:11.697882 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-9v756" podUID="58183227-95cc-42f1-95e7-dc40dc8bf51d" Apr 20 07:03:11.802143 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:11.802124 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/ovn-acl-logging/0.log" Apr 20 07:03:11.802458 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:11.802430 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" event={"ID":"13fb38e8-b7f0-4b6a-94e1-f8426903d19f","Type":"ContainerStarted","Data":"8ce94c7de6c24ac4cd49e73747b1506d7008aad8939b1fbec948cacd34c74095"} Apr 20 07:03:11.802694 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:11.802673 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:03:11.802804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:11.802768 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:03:11.802804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:11.802785 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:03:11.802913 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:11.802885 2543 scope.go:117] "RemoveContainer" containerID="4f3cefbee33592ad676e113f7f7e73e9578d2c699e0486dd411c7697cd48af6d" Apr 20 07:03:11.804653 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:11.804615 2543 generic.go:358] "Generic (PLEG): container finished" podID="3b7c7e9a-ffad-4d5e-a28c-965555ef617c" containerID="876003f814fa21bda4cfe1bfd3dccc1ea70a0e9da9f878528c6f7c342f256d5b" exitCode=0 Apr 20 07:03:11.804754 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:11.804673 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfcgq" 
event={"ID":"3b7c7e9a-ffad-4d5e-a28c-965555ef617c","Type":"ContainerDied","Data":"876003f814fa21bda4cfe1bfd3dccc1ea70a0e9da9f878528c6f7c342f256d5b"}
Apr 20 07:03:11.817355 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:11.817330 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd"
Apr 20 07:03:11.817683 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:11.817667 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd"
Apr 20 07:03:12.808334 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:12.808254 2543 generic.go:358] "Generic (PLEG): container finished" podID="3b7c7e9a-ffad-4d5e-a28c-965555ef617c" containerID="842fa1edcdbade8978411337271c748ed8d7f5677c7300a97b899b1d9812f49b" exitCode=0
Apr 20 07:03:12.808718 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:12.808340 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfcgq" event={"ID":"3b7c7e9a-ffad-4d5e-a28c-965555ef617c","Type":"ContainerDied","Data":"842fa1edcdbade8978411337271c748ed8d7f5677c7300a97b899b1d9812f49b"}
Apr 20 07:03:12.811714 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:12.811697 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/ovn-acl-logging/0.log"
Apr 20 07:03:12.812014 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:12.811995 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" event={"ID":"13fb38e8-b7f0-4b6a-94e1-f8426903d19f","Type":"ContainerStarted","Data":"b265292b4dd27b65f1c55bd61ade8b6a3c950d21dccadcfcaa48fc6cdb2e8442"}
Apr 20 07:03:13.387385 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:13.387094 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" podStartSLOduration=11.548552231 podStartE2EDuration="28.387072413s" podCreationTimestamp="2026-04-20 07:02:45 +0000 UTC" firstStartedPulling="2026-04-20 07:02:48.334941361 +0000 UTC m=+3.155009680" lastFinishedPulling="2026-04-20 07:03:05.173461536 +0000 UTC m=+19.993529862" observedRunningTime="2026-04-20 07:03:12.903583815 +0000 UTC m=+27.723652152" watchObservedRunningTime="2026-04-20 07:03:13.387072413 +0000 UTC m=+28.207140752"
Apr 20 07:03:13.387974 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:13.387948 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9v756"]
Apr 20 07:03:13.388093 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:13.388078 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v756"
Apr 20 07:03:13.388247 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:13.388218 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9v756" podUID="58183227-95cc-42f1-95e7-dc40dc8bf51d"
Apr 20 07:03:13.390415 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:13.390375 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-r9xht"]
Apr 20 07:03:13.390698 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:13.390677 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r9xht"
Apr 20 07:03:13.390806 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:13.390777 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r9xht" podUID="cb6de1e4-1d85-4764-a2ec-10a023567592"
Apr 20 07:03:13.391304 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:13.391075 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wgd5x"]
Apr 20 07:03:13.391304 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:13.391174 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgd5x"
Apr 20 07:03:13.391304 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:13.391264 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgd5x" podUID="4ffbcc28-a10c-467a-a9e7-e31e20e4975e"
Apr 20 07:03:13.753722 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:13.753683 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/58183227-95cc-42f1-95e7-dc40dc8bf51d-original-pull-secret\") pod \"global-pull-secret-syncer-9v756\" (UID: \"58183227-95cc-42f1-95e7-dc40dc8bf51d\") " pod="kube-system/global-pull-secret-syncer-9v756"
Apr 20 07:03:13.753891 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:13.753821 2543 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 20 07:03:13.753891 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:13.753878 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58183227-95cc-42f1-95e7-dc40dc8bf51d-original-pull-secret podName:58183227-95cc-42f1-95e7-dc40dc8bf51d nodeName:}" failed. No retries permitted until 2026-04-20 07:03:29.753861164 +0000 UTC m=+44.573929490 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/58183227-95cc-42f1-95e7-dc40dc8bf51d-original-pull-secret") pod "global-pull-secret-syncer-9v756" (UID: "58183227-95cc-42f1-95e7-dc40dc8bf51d") : object "kube-system"/"original-pull-secret" not registered
Apr 20 07:03:13.816171 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:13.816144 2543 generic.go:358] "Generic (PLEG): container finished" podID="3b7c7e9a-ffad-4d5e-a28c-965555ef617c" containerID="2f81bfe8959a82e6948edcdde690508d3daea548f5985b81d5d1c65099d0276b" exitCode=0
Apr 20 07:03:13.816611 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:13.816229 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfcgq" event={"ID":"3b7c7e9a-ffad-4d5e-a28c-965555ef617c","Type":"ContainerDied","Data":"2f81bfe8959a82e6948edcdde690508d3daea548f5985b81d5d1c65099d0276b"}
Apr 20 07:03:14.697967 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:14.697058 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r9xht"
Apr 20 07:03:14.697967 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:14.697180 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r9xht" podUID="cb6de1e4-1d85-4764-a2ec-10a023567592"
Apr 20 07:03:14.697967 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:14.697258 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgd5x"
Apr 20 07:03:14.697967 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:14.697358 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgd5x" podUID="4ffbcc28-a10c-467a-a9e7-e31e20e4975e"
Apr 20 07:03:15.700169 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:15.700129 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v756"
Apr 20 07:03:15.700848 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:15.700247 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9v756" podUID="58183227-95cc-42f1-95e7-dc40dc8bf51d"
Apr 20 07:03:16.697261 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:16.697228 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r9xht"
Apr 20 07:03:16.697424 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:16.697236 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgd5x"
Apr 20 07:03:16.697424 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:16.697343 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-r9xht" podUID="cb6de1e4-1d85-4764-a2ec-10a023567592"
Apr 20 07:03:16.697527 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:16.697444 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wgd5x" podUID="4ffbcc28-a10c-467a-a9e7-e31e20e4975e"
Apr 20 07:03:17.696802 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:17.696770 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v756"
Apr 20 07:03:17.697402 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:17.696893 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-9v756" podUID="58183227-95cc-42f1-95e7-dc40dc8bf51d"
Apr 20 07:03:18.493586 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.493555 2543 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-130-105.ec2.internal" event="NodeReady"
Apr 20 07:03:18.493866 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.493725 2543 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 20 07:03:18.555703 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.555673 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-f8bbf6698-rjf5f"]
Apr 20 07:03:18.558101 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.558077 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f"
Apr 20 07:03:18.567261 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.567239 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-w2xpz\""
Apr 20 07:03:18.567367 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.567305 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 20 07:03:18.568190 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.568174 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 20 07:03:18.577438 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.577415 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 20 07:03:18.579563 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.579543 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-f8bbf6698-rjf5f"]
Apr 20 07:03:18.582608 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.582590 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 20 07:03:18.586577 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.586557 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-5kxb2"]
Apr 20 07:03:18.588693 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.588673 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5kxb2"
Apr 20 07:03:18.595754 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.595711 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-sw57v"]
Apr 20 07:03:18.595887 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.595869 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 20 07:03:18.597101 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.597075 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 20 07:03:18.599712 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.597852 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sw57v"
Apr 20 07:03:18.605441 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.605410 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 20 07:03:18.609411 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.609391 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 20 07:03:18.609712 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.609694 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-75ldp\""
Apr 20 07:03:18.609804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.609785 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 20 07:03:18.610517 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.610498 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-vhcpq\""
Apr 20 07:03:18.616266 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.616233 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5kxb2"]
Apr 20 07:03:18.645556 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.645522 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sw57v"]
Apr 20 07:03:18.689948 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.689913 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff2f3e47-aed8-4328-8fce-7d39dbc0d939-metrics-tls\") pod \"dns-default-5kxb2\" (UID: \"ff2f3e47-aed8-4328-8fce-7d39dbc0d939\") " pod="openshift-dns/dns-default-5kxb2"
Apr 20 07:03:18.689948 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.689954 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-bound-sa-token\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f"
Apr 20 07:03:18.690223 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.689979 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjzwq\" (UniqueName: \"kubernetes.io/projected/ff2f3e47-aed8-4328-8fce-7d39dbc0d939-kube-api-access-mjzwq\") pod \"dns-default-5kxb2\" (UID: \"ff2f3e47-aed8-4328-8fce-7d39dbc0d939\") " pod="openshift-dns/dns-default-5kxb2"
Apr 20 07:03:18.690223 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.690046 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eb234c29-ad1a-4e18-b6ff-cb4363028abd-ca-trust-extracted\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f"
Apr 20 07:03:18.690223 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.690065 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ff2f3e47-aed8-4328-8fce-7d39dbc0d939-tmp-dir\") pod \"dns-default-5kxb2\" (UID: \"ff2f3e47-aed8-4328-8fce-7d39dbc0d939\") " pod="openshift-dns/dns-default-5kxb2"
Apr 20 07:03:18.690223 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.690131 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eb234c29-ad1a-4e18-b6ff-cb4363028abd-trusted-ca\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f"
Apr 20 07:03:18.690223 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.690166 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eb234c29-ad1a-4e18-b6ff-cb4363028abd-registry-certificates\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f"
Apr 20 07:03:18.690223 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.690189 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eb234c29-ad1a-4e18-b6ff-cb4363028abd-installation-pull-secrets\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f"
Apr 20 07:03:18.690223 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.690212 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddfjv\" (UniqueName: \"kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-kube-api-access-ddfjv\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f"
Apr 20 07:03:18.690515 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.690243 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-registry-tls\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f"
Apr 20 07:03:18.690515 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.690265 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a3a9194-e692-46ef-949c-604a76b49aad-cert\") pod \"ingress-canary-sw57v\" (UID: \"7a3a9194-e692-46ef-949c-604a76b49aad\") " pod="openshift-ingress-canary/ingress-canary-sw57v"
Apr 20 07:03:18.690515 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.690288 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cfr4\" (UniqueName: \"kubernetes.io/projected/7a3a9194-e692-46ef-949c-604a76b49aad-kube-api-access-9cfr4\") pod \"ingress-canary-sw57v\" (UID: \"7a3a9194-e692-46ef-949c-604a76b49aad\") " pod="openshift-ingress-canary/ingress-canary-sw57v"
Apr 20 07:03:18.690515 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.690352 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/eb234c29-ad1a-4e18-b6ff-cb4363028abd-image-registry-private-configuration\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f"
Apr 20 07:03:18.690515 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.690377 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ff2f3e47-aed8-4328-8fce-7d39dbc0d939-config-volume\") pod \"dns-default-5kxb2\" (UID: \"ff2f3e47-aed8-4328-8fce-7d39dbc0d939\") " pod="openshift-dns/dns-default-5kxb2"
Apr 20 07:03:18.696918 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.696896 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgd5x"
Apr 20 07:03:18.697304 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.696904 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r9xht"
Apr 20 07:03:18.700598 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.700576 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 20 07:03:18.700960 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.700941 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 20 07:03:18.701121 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.701088 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 20 07:03:18.701121 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.701115 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mf5sv\""
Apr 20 07:03:18.701263 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.701225 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-g4l9z\""
Apr 20 07:03:18.791269 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.791180 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eb234c29-ad1a-4e18-b6ff-cb4363028abd-registry-certificates\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f"
Apr 20 07:03:18.791269 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.791228 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eb234c29-ad1a-4e18-b6ff-cb4363028abd-installation-pull-secrets\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f"
Apr 20 07:03:18.791269 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.791254 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddfjv\" (UniqueName: \"kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-kube-api-access-ddfjv\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f"
Apr 20 07:03:18.791508 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.791285 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-registry-tls\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f"
Apr 20 07:03:18.791508 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:18.791384 2543 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 07:03:18.791508 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.791393 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a3a9194-e692-46ef-949c-604a76b49aad-cert\") pod \"ingress-canary-sw57v\" (UID: \"7a3a9194-e692-46ef-949c-604a76b49aad\") " pod="openshift-ingress-canary/ingress-canary-sw57v"
Apr 20 07:03:18.791508 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.791435 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9cfr4\" (UniqueName: \"kubernetes.io/projected/7a3a9194-e692-46ef-949c-604a76b49aad-kube-api-access-9cfr4\") pod \"ingress-canary-sw57v\" (UID: \"7a3a9194-e692-46ef-949c-604a76b49aad\") " pod="openshift-ingress-canary/ingress-canary-sw57v"
Apr 20 07:03:18.791508 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:18.791403 2543 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f8bbf6698-rjf5f: secret "image-registry-tls" not found
Apr 20 07:03:18.791508 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.791490 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/eb234c29-ad1a-4e18-b6ff-cb4363028abd-image-registry-private-configuration\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f"
Apr 20 07:03:18.791725 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:18.791522 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-registry-tls podName:eb234c29-ad1a-4e18-b6ff-cb4363028abd nodeName:}" failed. No retries permitted until 2026-04-20 07:03:19.291506166 +0000 UTC m=+34.111574481 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-registry-tls") pod "image-registry-f8bbf6698-rjf5f" (UID: "eb234c29-ad1a-4e18-b6ff-cb4363028abd") : secret "image-registry-tls" not found
Apr 20 07:03:18.791725 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.791537 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ff2f3e47-aed8-4328-8fce-7d39dbc0d939-config-volume\") pod \"dns-default-5kxb2\" (UID: \"ff2f3e47-aed8-4328-8fce-7d39dbc0d939\") " pod="openshift-dns/dns-default-5kxb2"
Apr 20 07:03:18.791725 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.791570 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff2f3e47-aed8-4328-8fce-7d39dbc0d939-metrics-tls\") pod \"dns-default-5kxb2\" (UID: \"ff2f3e47-aed8-4328-8fce-7d39dbc0d939\") " pod="openshift-dns/dns-default-5kxb2"
Apr 20 07:03:18.791725 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.791589 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-bound-sa-token\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f"
Apr 20 07:03:18.791725 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.791605 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mjzwq\" (UniqueName: \"kubernetes.io/projected/ff2f3e47-aed8-4328-8fce-7d39dbc0d939-kube-api-access-mjzwq\") pod \"dns-default-5kxb2\" (UID: \"ff2f3e47-aed8-4328-8fce-7d39dbc0d939\") " pod="openshift-dns/dns-default-5kxb2"
Apr 20 07:03:18.791725 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.791629 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eb234c29-ad1a-4e18-b6ff-cb4363028abd-ca-trust-extracted\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f"
Apr 20 07:03:18.791725 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.791661 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ff2f3e47-aed8-4328-8fce-7d39dbc0d939-tmp-dir\") pod \"dns-default-5kxb2\" (UID: \"ff2f3e47-aed8-4328-8fce-7d39dbc0d939\") " pod="openshift-dns/dns-default-5kxb2"
Apr 20 07:03:18.791725 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.791710 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eb234c29-ad1a-4e18-b6ff-cb4363028abd-trusted-ca\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f"
Apr 20 07:03:18.792190 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.792014 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eb234c29-ad1a-4e18-b6ff-cb4363028abd-registry-certificates\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f"
Apr 20 07:03:18.792190 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:18.791460 2543 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 07:03:18.792190 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:18.792108 2543 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 07:03:18.792190 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:18.792125 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a3a9194-e692-46ef-949c-604a76b49aad-cert podName:7a3a9194-e692-46ef-949c-604a76b49aad nodeName:}" failed. No retries permitted until 2026-04-20 07:03:19.292103249 +0000 UTC m=+34.112171579 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7a3a9194-e692-46ef-949c-604a76b49aad-cert") pod "ingress-canary-sw57v" (UID: "7a3a9194-e692-46ef-949c-604a76b49aad") : secret "canary-serving-cert" not found
Apr 20 07:03:18.792190 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.792136 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ff2f3e47-aed8-4328-8fce-7d39dbc0d939-config-volume\") pod \"dns-default-5kxb2\" (UID: \"ff2f3e47-aed8-4328-8fce-7d39dbc0d939\") " pod="openshift-dns/dns-default-5kxb2"
Apr 20 07:03:18.792190 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:18.792169 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff2f3e47-aed8-4328-8fce-7d39dbc0d939-metrics-tls podName:ff2f3e47-aed8-4328-8fce-7d39dbc0d939 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:19.292153156 +0000 UTC m=+34.112221486 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ff2f3e47-aed8-4328-8fce-7d39dbc0d939-metrics-tls") pod "dns-default-5kxb2" (UID: "ff2f3e47-aed8-4328-8fce-7d39dbc0d939") : secret "dns-default-metrics-tls" not found
Apr 20 07:03:18.792577 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.792360 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ff2f3e47-aed8-4328-8fce-7d39dbc0d939-tmp-dir\") pod \"dns-default-5kxb2\" (UID: \"ff2f3e47-aed8-4328-8fce-7d39dbc0d939\") " pod="openshift-dns/dns-default-5kxb2"
Apr 20 07:03:18.792577 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.792541 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eb234c29-ad1a-4e18-b6ff-cb4363028abd-ca-trust-extracted\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f"
Apr 20 07:03:18.792577 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.792546 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eb234c29-ad1a-4e18-b6ff-cb4363028abd-trusted-ca\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f"
Apr 20 07:03:18.796295 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.796172 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/eb234c29-ad1a-4e18-b6ff-cb4363028abd-image-registry-private-configuration\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f"
Apr 20 07:03:18.796403 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.796178 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eb234c29-ad1a-4e18-b6ff-cb4363028abd-installation-pull-secrets\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f"
Apr 20 07:03:18.806514 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.806491 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cfr4\" (UniqueName: \"kubernetes.io/projected/7a3a9194-e692-46ef-949c-604a76b49aad-kube-api-access-9cfr4\") pod \"ingress-canary-sw57v\" (UID: \"7a3a9194-e692-46ef-949c-604a76b49aad\") " pod="openshift-ingress-canary/ingress-canary-sw57v"
Apr 20 07:03:18.807981 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.807947 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-bound-sa-token\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f"
Apr 20 07:03:18.808151 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.808132 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjzwq\" (UniqueName: \"kubernetes.io/projected/ff2f3e47-aed8-4328-8fce-7d39dbc0d939-kube-api-access-mjzwq\") pod \"dns-default-5kxb2\" (UID: \"ff2f3e47-aed8-4328-8fce-7d39dbc0d939\") " pod="openshift-dns/dns-default-5kxb2"
Apr 20 07:03:18.812257 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:18.812225 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddfjv\" (UniqueName: \"kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-kube-api-access-ddfjv\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f"
Apr 20 07:03:19.295853 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:19.295807 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff2f3e47-aed8-4328-8fce-7d39dbc0d939-metrics-tls\") pod \"dns-default-5kxb2\" (UID: \"ff2f3e47-aed8-4328-8fce-7d39dbc0d939\") " pod="openshift-dns/dns-default-5kxb2"
Apr 20 07:03:19.296031 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:19.295918 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-registry-tls\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f"
Apr 20 07:03:19.296031 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:19.295946 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a3a9194-e692-46ef-949c-604a76b49aad-cert\") pod \"ingress-canary-sw57v\" (UID: \"7a3a9194-e692-46ef-949c-604a76b49aad\") " pod="openshift-ingress-canary/ingress-canary-sw57v"
Apr 20 07:03:19.296031 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:19.295963 2543 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 20 07:03:19.296159 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:19.296036 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff2f3e47-aed8-4328-8fce-7d39dbc0d939-metrics-tls podName:ff2f3e47-aed8-4328-8fce-7d39dbc0d939 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:20.296016231 +0000 UTC m=+35.116084554 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ff2f3e47-aed8-4328-8fce-7d39dbc0d939-metrics-tls") pod "dns-default-5kxb2" (UID: "ff2f3e47-aed8-4328-8fce-7d39dbc0d939") : secret "dns-default-metrics-tls" not found
Apr 20 07:03:19.296159 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:19.296062 2543 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 20 07:03:19.296159 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:19.296098 2543 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 20 07:03:19.296159 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:19.296122 2543 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f8bbf6698-rjf5f: secret "image-registry-tls" not found
Apr 20 07:03:19.296311 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:19.296111 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a3a9194-e692-46ef-949c-604a76b49aad-cert podName:7a3a9194-e692-46ef-949c-604a76b49aad nodeName:}" failed. No retries permitted until 2026-04-20 07:03:20.296097784 +0000 UTC m=+35.116166102 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7a3a9194-e692-46ef-949c-604a76b49aad-cert") pod "ingress-canary-sw57v" (UID: "7a3a9194-e692-46ef-949c-604a76b49aad") : secret "canary-serving-cert" not found
Apr 20 07:03:19.296311 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:19.296197 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-registry-tls podName:eb234c29-ad1a-4e18-b6ff-cb4363028abd nodeName:}" failed. No retries permitted until 2026-04-20 07:03:20.296173816 +0000 UTC m=+35.116242136 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-registry-tls") pod "image-registry-f8bbf6698-rjf5f" (UID: "eb234c29-ad1a-4e18-b6ff-cb4363028abd") : secret "image-registry-tls" not found Apr 20 07:03:19.397050 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:19.397024 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ffbcc28-a10c-467a-a9e7-e31e20e4975e-metrics-certs\") pod \"network-metrics-daemon-wgd5x\" (UID: \"4ffbcc28-a10c-467a-a9e7-e31e20e4975e\") " pod="openshift-multus/network-metrics-daemon-wgd5x" Apr 20 07:03:19.397236 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:19.397109 2543 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 20 07:03:19.397236 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:19.397168 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4ffbcc28-a10c-467a-a9e7-e31e20e4975e-metrics-certs podName:4ffbcc28-a10c-467a-a9e7-e31e20e4975e nodeName:}" failed. No retries permitted until 2026-04-20 07:03:51.397153562 +0000 UTC m=+66.217221877 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/4ffbcc28-a10c-467a-a9e7-e31e20e4975e-metrics-certs") pod "network-metrics-daemon-wgd5x" (UID: "4ffbcc28-a10c-467a-a9e7-e31e20e4975e") : secret "metrics-daemon-secret" not found Apr 20 07:03:19.598626 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:19.598542 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7bw9\" (UniqueName: \"kubernetes.io/projected/cb6de1e4-1d85-4764-a2ec-10a023567592-kube-api-access-b7bw9\") pod \"network-check-target-r9xht\" (UID: \"cb6de1e4-1d85-4764-a2ec-10a023567592\") " pod="openshift-network-diagnostics/network-check-target-r9xht" Apr 20 07:03:19.601556 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:19.601529 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7bw9\" (UniqueName: \"kubernetes.io/projected/cb6de1e4-1d85-4764-a2ec-10a023567592-kube-api-access-b7bw9\") pod \"network-check-target-r9xht\" (UID: \"cb6de1e4-1d85-4764-a2ec-10a023567592\") " pod="openshift-network-diagnostics/network-check-target-r9xht" Apr 20 07:03:19.614440 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:19.614413 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-r9xht" Apr 20 07:03:19.697498 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:19.697462 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v756" Apr 20 07:03:19.700448 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:19.700426 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 20 07:03:19.985130 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:19.985099 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-r9xht"] Apr 20 07:03:20.024176 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:03:20.024142 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb6de1e4_1d85_4764_a2ec_10a023567592.slice/crio-9af4a062040d77bbad0462be7b16b0732a0ef686141512cb98f30e1ff24232d5 WatchSource:0}: Error finding container 9af4a062040d77bbad0462be7b16b0732a0ef686141512cb98f30e1ff24232d5: Status 404 returned error can't find the container with id 9af4a062040d77bbad0462be7b16b0732a0ef686141512cb98f30e1ff24232d5 Apr 20 07:03:20.305783 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:20.305758 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff2f3e47-aed8-4328-8fce-7d39dbc0d939-metrics-tls\") pod \"dns-default-5kxb2\" (UID: \"ff2f3e47-aed8-4328-8fce-7d39dbc0d939\") " pod="openshift-dns/dns-default-5kxb2" Apr 20 07:03:20.305898 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:20.305845 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-registry-tls\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f" Apr 20 07:03:20.305898 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:20.305874 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a3a9194-e692-46ef-949c-604a76b49aad-cert\") pod \"ingress-canary-sw57v\" (UID: \"7a3a9194-e692-46ef-949c-604a76b49aad\") " pod="openshift-ingress-canary/ingress-canary-sw57v" Apr 20 07:03:20.306064 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:20.305917 2543 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:03:20.306064 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:20.305993 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff2f3e47-aed8-4328-8fce-7d39dbc0d939-metrics-tls podName:ff2f3e47-aed8-4328-8fce-7d39dbc0d939 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:22.305967949 +0000 UTC m=+37.126036279 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ff2f3e47-aed8-4328-8fce-7d39dbc0d939-metrics-tls") pod "dns-default-5kxb2" (UID: "ff2f3e47-aed8-4328-8fce-7d39dbc0d939") : secret "dns-default-metrics-tls" not found Apr 20 07:03:20.306064 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:20.306032 2543 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 07:03:20.306064 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:20.306063 2543 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f8bbf6698-rjf5f: secret "image-registry-tls" not found Apr 20 07:03:20.306226 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:20.306126 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-registry-tls podName:eb234c29-ad1a-4e18-b6ff-cb4363028abd nodeName:}" failed. No retries permitted until 2026-04-20 07:03:22.306104239 +0000 UTC m=+37.126172566 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-registry-tls") pod "image-registry-f8bbf6698-rjf5f" (UID: "eb234c29-ad1a-4e18-b6ff-cb4363028abd") : secret "image-registry-tls" not found Apr 20 07:03:20.306601 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:20.306585 2543 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:03:20.306795 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:20.306698 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a3a9194-e692-46ef-949c-604a76b49aad-cert podName:7a3a9194-e692-46ef-949c-604a76b49aad nodeName:}" failed. No retries permitted until 2026-04-20 07:03:22.3066248 +0000 UTC m=+37.126693121 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7a3a9194-e692-46ef-949c-604a76b49aad-cert") pod "ingress-canary-sw57v" (UID: "7a3a9194-e692-46ef-949c-604a76b49aad") : secret "canary-serving-cert" not found Apr 20 07:03:20.831387 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:20.831349 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-r9xht" event={"ID":"cb6de1e4-1d85-4764-a2ec-10a023567592","Type":"ContainerStarted","Data":"9af4a062040d77bbad0462be7b16b0732a0ef686141512cb98f30e1ff24232d5"} Apr 20 07:03:20.833963 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:20.833939 2543 generic.go:358] "Generic (PLEG): container finished" podID="3b7c7e9a-ffad-4d5e-a28c-965555ef617c" containerID="4885462688ec6af267514f972f865479642ac2ee5839d4210bc8b0e7b8bf2a4d" exitCode=0 Apr 20 07:03:20.834110 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:20.833995 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfcgq" 
event={"ID":"3b7c7e9a-ffad-4d5e-a28c-965555ef617c","Type":"ContainerDied","Data":"4885462688ec6af267514f972f865479642ac2ee5839d4210bc8b0e7b8bf2a4d"} Apr 20 07:03:21.839042 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:21.838808 2543 generic.go:358] "Generic (PLEG): container finished" podID="3b7c7e9a-ffad-4d5e-a28c-965555ef617c" containerID="f52d5066dda64dae3c3441383748c5bd86d7d77eda8474c54d789d9b1af88227" exitCode=0 Apr 20 07:03:21.839042 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:21.838888 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfcgq" event={"ID":"3b7c7e9a-ffad-4d5e-a28c-965555ef617c","Type":"ContainerDied","Data":"f52d5066dda64dae3c3441383748c5bd86d7d77eda8474c54d789d9b1af88227"} Apr 20 07:03:22.321688 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:22.321631 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-registry-tls\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f" Apr 20 07:03:22.321688 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:22.321694 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a3a9194-e692-46ef-949c-604a76b49aad-cert\") pod \"ingress-canary-sw57v\" (UID: \"7a3a9194-e692-46ef-949c-604a76b49aad\") " pod="openshift-ingress-canary/ingress-canary-sw57v" Apr 20 07:03:22.321931 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:22.321743 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff2f3e47-aed8-4328-8fce-7d39dbc0d939-metrics-tls\") pod \"dns-default-5kxb2\" (UID: \"ff2f3e47-aed8-4328-8fce-7d39dbc0d939\") " pod="openshift-dns/dns-default-5kxb2" Apr 20 07:03:22.321931 ip-10-0-130-105 
kubenswrapper[2543]: E0420 07:03:22.321790 2543 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 07:03:22.321931 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:22.321811 2543 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f8bbf6698-rjf5f: secret "image-registry-tls" not found Apr 20 07:03:22.321931 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:22.321864 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-registry-tls podName:eb234c29-ad1a-4e18-b6ff-cb4363028abd nodeName:}" failed. No retries permitted until 2026-04-20 07:03:26.321844984 +0000 UTC m=+41.141913301 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-registry-tls") pod "image-registry-f8bbf6698-rjf5f" (UID: "eb234c29-ad1a-4e18-b6ff-cb4363028abd") : secret "image-registry-tls" not found Apr 20 07:03:22.321931 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:22.321864 2543 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:03:22.321931 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:22.321872 2543 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:03:22.321931 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:22.321924 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a3a9194-e692-46ef-949c-604a76b49aad-cert podName:7a3a9194-e692-46ef-949c-604a76b49aad nodeName:}" failed. No retries permitted until 2026-04-20 07:03:26.321907706 +0000 UTC m=+41.141976026 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7a3a9194-e692-46ef-949c-604a76b49aad-cert") pod "ingress-canary-sw57v" (UID: "7a3a9194-e692-46ef-949c-604a76b49aad") : secret "canary-serving-cert" not found Apr 20 07:03:22.322197 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:22.321958 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff2f3e47-aed8-4328-8fce-7d39dbc0d939-metrics-tls podName:ff2f3e47-aed8-4328-8fce-7d39dbc0d939 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:26.32194403 +0000 UTC m=+41.142012350 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ff2f3e47-aed8-4328-8fce-7d39dbc0d939-metrics-tls") pod "dns-default-5kxb2" (UID: "ff2f3e47-aed8-4328-8fce-7d39dbc0d939") : secret "dns-default-metrics-tls" not found Apr 20 07:03:22.684840 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:22.684814 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-ftkkp"] Apr 20 07:03:22.712849 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:22.712826 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-ftkkp"] Apr 20 07:03:22.712972 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:22.712927 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ftkkp" Apr 20 07:03:22.727381 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:22.727356 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 20 07:03:22.727485 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:22.727356 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 20 07:03:22.728169 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:22.728152 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-rwlwq\"" Apr 20 07:03:22.826619 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:22.826595 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsh78\" (UniqueName: \"kubernetes.io/projected/72dd3d8d-7973-437e-b710-621b728bcecb-kube-api-access-jsh78\") pod \"migrator-74bb7799d9-ftkkp\" (UID: \"72dd3d8d-7973-437e-b710-621b728bcecb\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ftkkp" Apr 20 07:03:22.927528 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:22.927504 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jsh78\" (UniqueName: \"kubernetes.io/projected/72dd3d8d-7973-437e-b710-621b728bcecb-kube-api-access-jsh78\") pod \"migrator-74bb7799d9-ftkkp\" (UID: \"72dd3d8d-7973-437e-b710-621b728bcecb\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ftkkp" Apr 20 07:03:22.943326 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:22.943270 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsh78\" (UniqueName: 
\"kubernetes.io/projected/72dd3d8d-7973-437e-b710-621b728bcecb-kube-api-access-jsh78\") pod \"migrator-74bb7799d9-ftkkp\" (UID: \"72dd3d8d-7973-437e-b710-621b728bcecb\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ftkkp" Apr 20 07:03:23.021313 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:23.021278 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ftkkp" Apr 20 07:03:23.266875 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:23.266688 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-ftkkp"] Apr 20 07:03:23.270016 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:03:23.269988 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72dd3d8d_7973_437e_b710_621b728bcecb.slice/crio-6a80fb5ad948122f510c12b1c6d8a37956cda132c4431ec6048fa5eb9b9283ad WatchSource:0}: Error finding container 6a80fb5ad948122f510c12b1c6d8a37956cda132c4431ec6048fa5eb9b9283ad: Status 404 returned error can't find the container with id 6a80fb5ad948122f510c12b1c6d8a37956cda132c4431ec6048fa5eb9b9283ad Apr 20 07:03:23.843373 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:23.843338 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-r9xht" event={"ID":"cb6de1e4-1d85-4764-a2ec-10a023567592","Type":"ContainerStarted","Data":"0405d44f326257a232081d73fc813d005aa511b2fe2f929ed7473132677990ac"} Apr 20 07:03:23.843545 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:23.843469 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-r9xht" Apr 20 07:03:23.844329 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:23.844304 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ftkkp" event={"ID":"72dd3d8d-7973-437e-b710-621b728bcecb","Type":"ContainerStarted","Data":"6a80fb5ad948122f510c12b1c6d8a37956cda132c4431ec6048fa5eb9b9283ad"} Apr 20 07:03:23.846806 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:23.846784 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-nfcgq" event={"ID":"3b7c7e9a-ffad-4d5e-a28c-965555ef617c","Type":"ContainerStarted","Data":"e09b4121ae5950954a3f023dfc22a5edb1137d6bbcb6ab835f078112337a8352"} Apr 20 07:03:23.929451 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:23.929408 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-nfcgq" podStartSLOduration=7.236029584 podStartE2EDuration="38.929395599s" podCreationTimestamp="2026-04-20 07:02:45 +0000 UTC" firstStartedPulling="2026-04-20 07:02:48.356818295 +0000 UTC m=+3.176886613" lastFinishedPulling="2026-04-20 07:03:20.05018431 +0000 UTC m=+34.870252628" observedRunningTime="2026-04-20 07:03:23.929054775 +0000 UTC m=+38.749123111" watchObservedRunningTime="2026-04-20 07:03:23.929395599 +0000 UTC m=+38.749463937" Apr 20 07:03:23.930043 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:23.929513 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-r9xht" podStartSLOduration=35.856593726 podStartE2EDuration="38.92950844s" podCreationTimestamp="2026-04-20 07:02:45 +0000 UTC" firstStartedPulling="2026-04-20 07:03:20.029124533 +0000 UTC m=+34.849192849" lastFinishedPulling="2026-04-20 07:03:23.102039246 +0000 UTC m=+37.922107563" observedRunningTime="2026-04-20 07:03:23.877054369 +0000 UTC m=+38.697122706" watchObservedRunningTime="2026-04-20 07:03:23.92950844 +0000 UTC m=+38.749576777" Apr 20 07:03:24.029137 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:24.029100 2543 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-dns_node-resolver-m98bf_2cb0ae1e-b4a8-4cf9-ac1d-3d8fa3a4cc47/dns-node-resolver/0.log" Apr 20 07:03:25.016864 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:25.016840 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-d7hcj_dcda05ad-f9cc-4648-90c6-e224049a2518/node-ca/0.log" Apr 20 07:03:25.741659 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:25.741609 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-sdjsh"] Apr 20 07:03:25.759481 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:25.759454 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-sdjsh" Apr 20 07:03:25.764868 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:25.764847 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 20 07:03:25.775593 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:25.775572 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-sdjsh"] Apr 20 07:03:25.775739 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:25.775721 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 20 07:03:25.775919 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:25.775905 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-95tzg\"" Apr 20 07:03:25.775975 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:25.775902 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 20 07:03:25.775975 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:25.775947 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 20 07:03:25.846762 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:25.846734 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cb0ff652-51b6-4444-a6f3-cb702610472d-signing-key\") pod \"service-ca-865cb79987-sdjsh\" (UID: \"cb0ff652-51b6-4444-a6f3-cb702610472d\") " pod="openshift-service-ca/service-ca-865cb79987-sdjsh" Apr 20 07:03:25.846935 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:25.846784 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cb0ff652-51b6-4444-a6f3-cb702610472d-signing-cabundle\") pod \"service-ca-865cb79987-sdjsh\" (UID: \"cb0ff652-51b6-4444-a6f3-cb702610472d\") " pod="openshift-service-ca/service-ca-865cb79987-sdjsh" Apr 20 07:03:25.846935 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:25.846831 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67hz2\" (UniqueName: \"kubernetes.io/projected/cb0ff652-51b6-4444-a6f3-cb702610472d-kube-api-access-67hz2\") pod \"service-ca-865cb79987-sdjsh\" (UID: \"cb0ff652-51b6-4444-a6f3-cb702610472d\") " pod="openshift-service-ca/service-ca-865cb79987-sdjsh" Apr 20 07:03:25.852487 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:25.852458 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ftkkp" event={"ID":"72dd3d8d-7973-437e-b710-621b728bcecb","Type":"ContainerStarted","Data":"a53d86a4c3524bef08fecab448dbae29d4fa34787620f9630b364fddc872dfe9"} Apr 20 07:03:25.852601 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:25.852496 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ftkkp" 
event={"ID":"72dd3d8d-7973-437e-b710-621b728bcecb","Type":"ContainerStarted","Data":"e60c6fe08c0d7794e2d0e40dd254d2f15fae3fb8ef4197ca0a55000b2fea3569"} Apr 20 07:03:25.890919 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:25.890880 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-ftkkp" podStartSLOduration=2.052751432 podStartE2EDuration="3.890868363s" podCreationTimestamp="2026-04-20 07:03:22 +0000 UTC" firstStartedPulling="2026-04-20 07:03:23.271770854 +0000 UTC m=+38.091839169" lastFinishedPulling="2026-04-20 07:03:25.109887769 +0000 UTC m=+39.929956100" observedRunningTime="2026-04-20 07:03:25.889624919 +0000 UTC m=+40.709693256" watchObservedRunningTime="2026-04-20 07:03:25.890868363 +0000 UTC m=+40.710936700" Apr 20 07:03:25.947804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:25.947778 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cb0ff652-51b6-4444-a6f3-cb702610472d-signing-cabundle\") pod \"service-ca-865cb79987-sdjsh\" (UID: \"cb0ff652-51b6-4444-a6f3-cb702610472d\") " pod="openshift-service-ca/service-ca-865cb79987-sdjsh" Apr 20 07:03:25.947907 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:25.947845 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-67hz2\" (UniqueName: \"kubernetes.io/projected/cb0ff652-51b6-4444-a6f3-cb702610472d-kube-api-access-67hz2\") pod \"service-ca-865cb79987-sdjsh\" (UID: \"cb0ff652-51b6-4444-a6f3-cb702610472d\") " pod="openshift-service-ca/service-ca-865cb79987-sdjsh" Apr 20 07:03:25.947951 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:25.947916 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cb0ff652-51b6-4444-a6f3-cb702610472d-signing-key\") pod \"service-ca-865cb79987-sdjsh\" (UID: 
\"cb0ff652-51b6-4444-a6f3-cb702610472d\") " pod="openshift-service-ca/service-ca-865cb79987-sdjsh" Apr 20 07:03:25.951095 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:25.951075 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/cb0ff652-51b6-4444-a6f3-cb702610472d-signing-key\") pod \"service-ca-865cb79987-sdjsh\" (UID: \"cb0ff652-51b6-4444-a6f3-cb702610472d\") " pod="openshift-service-ca/service-ca-865cb79987-sdjsh" Apr 20 07:03:25.954387 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:25.954369 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/cb0ff652-51b6-4444-a6f3-cb702610472d-signing-cabundle\") pod \"service-ca-865cb79987-sdjsh\" (UID: \"cb0ff652-51b6-4444-a6f3-cb702610472d\") " pod="openshift-service-ca/service-ca-865cb79987-sdjsh" Apr 20 07:03:25.958983 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:25.958954 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-67hz2\" (UniqueName: \"kubernetes.io/projected/cb0ff652-51b6-4444-a6f3-cb702610472d-kube-api-access-67hz2\") pod \"service-ca-865cb79987-sdjsh\" (UID: \"cb0ff652-51b6-4444-a6f3-cb702610472d\") " pod="openshift-service-ca/service-ca-865cb79987-sdjsh" Apr 20 07:03:26.068512 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:26.068436 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-sdjsh" Apr 20 07:03:26.193689 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:26.193663 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-sdjsh"] Apr 20 07:03:26.204134 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:03:26.204107 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb0ff652_51b6_4444_a6f3_cb702610472d.slice/crio-b27d3b11474224b6976afeddad632b00159285b9389d3478c3ff2073cf01c2e6 WatchSource:0}: Error finding container b27d3b11474224b6976afeddad632b00159285b9389d3478c3ff2073cf01c2e6: Status 404 returned error can't find the container with id b27d3b11474224b6976afeddad632b00159285b9389d3478c3ff2073cf01c2e6 Apr 20 07:03:26.351827 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:26.351753 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-registry-tls\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f" Apr 20 07:03:26.351827 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:26.351789 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a3a9194-e692-46ef-949c-604a76b49aad-cert\") pod \"ingress-canary-sw57v\" (UID: \"7a3a9194-e692-46ef-949c-604a76b49aad\") " pod="openshift-ingress-canary/ingress-canary-sw57v" Apr 20 07:03:26.352001 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:26.351892 2543 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 20 07:03:26.352001 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:26.351900 2543 projected.go:264] Couldn't get secret 
openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 20 07:03:26.352001 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:26.351919 2543 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-f8bbf6698-rjf5f: secret "image-registry-tls" not found Apr 20 07:03:26.352001 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:26.351926 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff2f3e47-aed8-4328-8fce-7d39dbc0d939-metrics-tls\") pod \"dns-default-5kxb2\" (UID: \"ff2f3e47-aed8-4328-8fce-7d39dbc0d939\") " pod="openshift-dns/dns-default-5kxb2" Apr 20 07:03:26.352001 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:26.351938 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7a3a9194-e692-46ef-949c-604a76b49aad-cert podName:7a3a9194-e692-46ef-949c-604a76b49aad nodeName:}" failed. No retries permitted until 2026-04-20 07:03:34.351924644 +0000 UTC m=+49.171992964 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/7a3a9194-e692-46ef-949c-604a76b49aad-cert") pod "ingress-canary-sw57v" (UID: "7a3a9194-e692-46ef-949c-604a76b49aad") : secret "canary-serving-cert" not found Apr 20 07:03:26.352001 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:26.351972 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-registry-tls podName:eb234c29-ad1a-4e18-b6ff-cb4363028abd nodeName:}" failed. No retries permitted until 2026-04-20 07:03:34.351958162 +0000 UTC m=+49.172026477 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-registry-tls") pod "image-registry-f8bbf6698-rjf5f" (UID: "eb234c29-ad1a-4e18-b6ff-cb4363028abd") : secret "image-registry-tls" not found Apr 20 07:03:26.352199 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:26.352009 2543 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 20 07:03:26.352199 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:03:26.352043 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ff2f3e47-aed8-4328-8fce-7d39dbc0d939-metrics-tls podName:ff2f3e47-aed8-4328-8fce-7d39dbc0d939 nodeName:}" failed. No retries permitted until 2026-04-20 07:03:34.352033958 +0000 UTC m=+49.172102272 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/ff2f3e47-aed8-4328-8fce-7d39dbc0d939-metrics-tls") pod "dns-default-5kxb2" (UID: "ff2f3e47-aed8-4328-8fce-7d39dbc0d939") : secret "dns-default-metrics-tls" not found Apr 20 07:03:26.855966 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:26.855930 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-sdjsh" event={"ID":"cb0ff652-51b6-4444-a6f3-cb702610472d","Type":"ContainerStarted","Data":"b27d3b11474224b6976afeddad632b00159285b9389d3478c3ff2073cf01c2e6"} Apr 20 07:03:28.861822 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:28.861781 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-sdjsh" event={"ID":"cb0ff652-51b6-4444-a6f3-cb702610472d","Type":"ContainerStarted","Data":"1c79c6027481290d41df2373243bf403be5159dbf23fccb7aa1384f1311673e0"} Apr 20 07:03:29.776929 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:29.776832 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" 
(UniqueName: \"kubernetes.io/secret/58183227-95cc-42f1-95e7-dc40dc8bf51d-original-pull-secret\") pod \"global-pull-secret-syncer-9v756\" (UID: \"58183227-95cc-42f1-95e7-dc40dc8bf51d\") " pod="kube-system/global-pull-secret-syncer-9v756" Apr 20 07:03:29.779610 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:29.779578 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/58183227-95cc-42f1-95e7-dc40dc8bf51d-original-pull-secret\") pod \"global-pull-secret-syncer-9v756\" (UID: \"58183227-95cc-42f1-95e7-dc40dc8bf51d\") " pod="kube-system/global-pull-secret-syncer-9v756" Apr 20 07:03:29.906746 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:29.906711 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9v756" Apr 20 07:03:30.114023 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:30.113924 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-sdjsh" podStartSLOduration=3.286384874 podStartE2EDuration="5.113903731s" podCreationTimestamp="2026-04-20 07:03:25 +0000 UTC" firstStartedPulling="2026-04-20 07:03:26.205811154 +0000 UTC m=+41.025879469" lastFinishedPulling="2026-04-20 07:03:28.033330011 +0000 UTC m=+42.853398326" observedRunningTime="2026-04-20 07:03:28.876987341 +0000 UTC m=+43.697055677" watchObservedRunningTime="2026-04-20 07:03:30.113903731 +0000 UTC m=+44.933972073" Apr 20 07:03:30.114513 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:30.114497 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9v756"] Apr 20 07:03:30.116858 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:03:30.116832 2543 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod58183227_95cc_42f1_95e7_dc40dc8bf51d.slice/crio-2eb77885a55e9813d308c10dfee61a06bb6e2391eb7c2cfba3d3a64db1aa2693 WatchSource:0}: Error finding container 2eb77885a55e9813d308c10dfee61a06bb6e2391eb7c2cfba3d3a64db1aa2693: Status 404 returned error can't find the container with id 2eb77885a55e9813d308c10dfee61a06bb6e2391eb7c2cfba3d3a64db1aa2693 Apr 20 07:03:30.867343 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:30.867294 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9v756" event={"ID":"58183227-95cc-42f1-95e7-dc40dc8bf51d","Type":"ContainerStarted","Data":"2eb77885a55e9813d308c10dfee61a06bb6e2391eb7c2cfba3d3a64db1aa2693"} Apr 20 07:03:34.414538 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:34.414503 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-registry-tls\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f" Apr 20 07:03:34.414538 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:34.414541 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a3a9194-e692-46ef-949c-604a76b49aad-cert\") pod \"ingress-canary-sw57v\" (UID: \"7a3a9194-e692-46ef-949c-604a76b49aad\") " pod="openshift-ingress-canary/ingress-canary-sw57v" Apr 20 07:03:34.414948 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:34.414571 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff2f3e47-aed8-4328-8fce-7d39dbc0d939-metrics-tls\") pod \"dns-default-5kxb2\" (UID: \"ff2f3e47-aed8-4328-8fce-7d39dbc0d939\") " pod="openshift-dns/dns-default-5kxb2" Apr 20 07:03:34.416909 ip-10-0-130-105 kubenswrapper[2543]: 
I0420 07:03:34.416878 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/ff2f3e47-aed8-4328-8fce-7d39dbc0d939-metrics-tls\") pod \"dns-default-5kxb2\" (UID: \"ff2f3e47-aed8-4328-8fce-7d39dbc0d939\") " pod="openshift-dns/dns-default-5kxb2" Apr 20 07:03:34.417047 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:34.416987 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-registry-tls\") pod \"image-registry-f8bbf6698-rjf5f\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f" Apr 20 07:03:34.417089 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:34.417057 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/7a3a9194-e692-46ef-949c-604a76b49aad-cert\") pod \"ingress-canary-sw57v\" (UID: \"7a3a9194-e692-46ef-949c-604a76b49aad\") " pod="openshift-ingress-canary/ingress-canary-sw57v" Apr 20 07:03:34.468778 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:34.468395 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f" Apr 20 07:03:34.499015 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:34.498680 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-5kxb2" Apr 20 07:03:34.507288 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:34.507258 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-sw57v" Apr 20 07:03:34.666171 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:34.666101 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-f8bbf6698-rjf5f"] Apr 20 07:03:34.669165 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:03:34.669060 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb234c29_ad1a_4e18_b6ff_cb4363028abd.slice/crio-693d2d6b30aa5bb78e9033a3464e6bfa101365c1ad1943f79ea60f62a48c4376 WatchSource:0}: Error finding container 693d2d6b30aa5bb78e9033a3464e6bfa101365c1ad1943f79ea60f62a48c4376: Status 404 returned error can't find the container with id 693d2d6b30aa5bb78e9033a3464e6bfa101365c1ad1943f79ea60f62a48c4376 Apr 20 07:03:34.682052 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:34.682028 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-5kxb2"] Apr 20 07:03:34.686463 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:03:34.686440 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff2f3e47_aed8_4328_8fce_7d39dbc0d939.slice/crio-420f433aad94ad38036c35f086a5b86f677e6b689a2728f86c3b0cde45642bd5 WatchSource:0}: Error finding container 420f433aad94ad38036c35f086a5b86f677e6b689a2728f86c3b0cde45642bd5: Status 404 returned error can't find the container with id 420f433aad94ad38036c35f086a5b86f677e6b689a2728f86c3b0cde45642bd5 Apr 20 07:03:34.697928 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:34.697905 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-sw57v"] Apr 20 07:03:34.708048 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:03:34.708016 2543 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a3a9194_e692_46ef_949c_604a76b49aad.slice/crio-d9cf8872ad7a5d92279616633ee8f13b2bb1928141a1cbdb5b41b74ea1c785fb WatchSource:0}: Error finding container d9cf8872ad7a5d92279616633ee8f13b2bb1928141a1cbdb5b41b74ea1c785fb: Status 404 returned error can't find the container with id d9cf8872ad7a5d92279616633ee8f13b2bb1928141a1cbdb5b41b74ea1c785fb Apr 20 07:03:34.876472 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:34.876433 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sw57v" event={"ID":"7a3a9194-e692-46ef-949c-604a76b49aad","Type":"ContainerStarted","Data":"d9cf8872ad7a5d92279616633ee8f13b2bb1928141a1cbdb5b41b74ea1c785fb"} Apr 20 07:03:34.877450 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:34.877417 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5kxb2" event={"ID":"ff2f3e47-aed8-4328-8fce-7d39dbc0d939","Type":"ContainerStarted","Data":"420f433aad94ad38036c35f086a5b86f677e6b689a2728f86c3b0cde45642bd5"} Apr 20 07:03:34.878701 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:34.878670 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f" event={"ID":"eb234c29-ad1a-4e18-b6ff-cb4363028abd","Type":"ContainerStarted","Data":"af52bbf061e4d107761cedac3e564cfdd282ed32d4a2eab1010c7831a2ec52c4"} Apr 20 07:03:34.878793 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:34.878707 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f" event={"ID":"eb234c29-ad1a-4e18-b6ff-cb4363028abd","Type":"ContainerStarted","Data":"693d2d6b30aa5bb78e9033a3464e6bfa101365c1ad1943f79ea60f62a48c4376"} Apr 20 07:03:34.878793 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:34.878775 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f" Apr 20 07:03:34.879879 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:34.879859 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9v756" event={"ID":"58183227-95cc-42f1-95e7-dc40dc8bf51d","Type":"ContainerStarted","Data":"966b7a7bdd50424e2e657909267d5b1a16ea3f9a50245bdc191bdf234dbadbae"} Apr 20 07:03:34.924039 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:34.923963 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f" podStartSLOduration=44.923950188 podStartE2EDuration="44.923950188s" podCreationTimestamp="2026-04-20 07:02:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:03:34.923128339 +0000 UTC m=+49.743196677" watchObservedRunningTime="2026-04-20 07:03:34.923950188 +0000 UTC m=+49.744018525" Apr 20 07:03:34.963650 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:34.963587 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-9v756" podStartSLOduration=33.638090194 podStartE2EDuration="37.96356997s" podCreationTimestamp="2026-04-20 07:02:57 +0000 UTC" firstStartedPulling="2026-04-20 07:03:30.118670423 +0000 UTC m=+44.938738739" lastFinishedPulling="2026-04-20 07:03:34.444150197 +0000 UTC m=+49.264218515" observedRunningTime="2026-04-20 07:03:34.963134359 +0000 UTC m=+49.783202697" watchObservedRunningTime="2026-04-20 07:03:34.96356997 +0000 UTC m=+49.783638309" Apr 20 07:03:37.895677 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:37.895612 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5kxb2" event={"ID":"ff2f3e47-aed8-4328-8fce-7d39dbc0d939","Type":"ContainerStarted","Data":"36913417dbf9c25954309b028ac06330b4342c45d5cef2c030f767ea8ba67607"} Apr 20 07:03:37.898018 
ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:37.897920 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-sw57v" event={"ID":"7a3a9194-e692-46ef-949c-604a76b49aad","Type":"ContainerStarted","Data":"84497dfc1b39e6198de58d0e01cca297d88ec953700e6b3dd6d06fc0298aad85"} Apr 20 07:03:37.918443 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:37.918396 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-sw57v" podStartSLOduration=16.960160431 podStartE2EDuration="19.918384245s" podCreationTimestamp="2026-04-20 07:03:18 +0000 UTC" firstStartedPulling="2026-04-20 07:03:34.709891439 +0000 UTC m=+49.529959768" lastFinishedPulling="2026-04-20 07:03:37.668115267 +0000 UTC m=+52.488183582" observedRunningTime="2026-04-20 07:03:37.918089748 +0000 UTC m=+52.738158086" watchObservedRunningTime="2026-04-20 07:03:37.918384245 +0000 UTC m=+52.738452582" Apr 20 07:03:38.902073 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:38.902040 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-5kxb2" event={"ID":"ff2f3e47-aed8-4328-8fce-7d39dbc0d939","Type":"ContainerStarted","Data":"be816e07f323f9a5f516ba9ce5ccecde558251c9328b0e2b29ff6ad7dfac4234"} Apr 20 07:03:38.922482 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:38.922437 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-5kxb2" podStartSLOduration=17.946983171 podStartE2EDuration="20.922425054s" podCreationTimestamp="2026-04-20 07:03:18 +0000 UTC" firstStartedPulling="2026-04-20 07:03:34.688291171 +0000 UTC m=+49.508359499" lastFinishedPulling="2026-04-20 07:03:37.663733061 +0000 UTC m=+52.483801382" observedRunningTime="2026-04-20 07:03:38.922131001 +0000 UTC m=+53.742199336" watchObservedRunningTime="2026-04-20 07:03:38.922425054 +0000 UTC m=+53.742493391" Apr 20 07:03:39.907439 ip-10-0-130-105 kubenswrapper[2543]: I0420 
07:03:39.907399 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-5kxb2" Apr 20 07:03:43.827029 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:43.827003 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-5bfnd" Apr 20 07:03:49.875137 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:49.875106 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f496cd8b-5qwwt"] Apr 20 07:03:49.909832 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:49.909799 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj"] Apr 20 07:03:49.909974 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:49.909964 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f496cd8b-5qwwt" Apr 20 07:03:49.914038 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:49.914015 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 20 07:03:49.915942 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:49.915920 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 20 07:03:49.917911 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:49.917890 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 20 07:03:49.935879 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:49.935856 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 20 07:03:49.936780 ip-10-0-130-105 
kubenswrapper[2543]: I0420 07:03:49.936765 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-5kxb2" Apr 20 07:03:49.936848 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:49.936786 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj"] Apr 20 07:03:49.936913 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:49.936893 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj" Apr 20 07:03:49.943946 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:49.943927 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 20 07:03:49.944142 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:49.944129 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 20 07:03:49.944211 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:49.944196 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 20 07:03:49.944292 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:49.944279 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 20 07:03:49.953559 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:49.953526 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f496cd8b-5qwwt"] Apr 20 07:03:50.010910 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.010879 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1638568c-490a-48b5-8755-5858b13e5c9a-tmp\") pod \"klusterlet-addon-workmgr-f496cd8b-5qwwt\" (UID: \"1638568c-490a-48b5-8755-5858b13e5c9a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f496cd8b-5qwwt" Apr 20 07:03:50.010910 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.010909 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2thl4\" (UniqueName: \"kubernetes.io/projected/1638568c-490a-48b5-8755-5858b13e5c9a-kube-api-access-2thl4\") pod \"klusterlet-addon-workmgr-f496cd8b-5qwwt\" (UID: \"1638568c-490a-48b5-8755-5858b13e5c9a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f496cd8b-5qwwt" Apr 20 07:03:50.011108 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.010939 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/08532c11-93ba-4a45-9756-726c663f9c88-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-79566764d7-lsqhj\" (UID: \"08532c11-93ba-4a45-9756-726c663f9c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj" Apr 20 07:03:50.011108 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.010966 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/08532c11-93ba-4a45-9756-726c663f9c88-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-79566764d7-lsqhj\" (UID: \"08532c11-93ba-4a45-9756-726c663f9c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj" Apr 20 07:03:50.011108 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.011029 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: 
\"kubernetes.io/configmap/08532c11-93ba-4a45-9756-726c663f9c88-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-79566764d7-lsqhj\" (UID: \"08532c11-93ba-4a45-9756-726c663f9c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj" Apr 20 07:03:50.011206 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.011149 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/08532c11-93ba-4a45-9756-726c663f9c88-hub\") pod \"cluster-proxy-proxy-agent-79566764d7-lsqhj\" (UID: \"08532c11-93ba-4a45-9756-726c663f9c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj" Apr 20 07:03:50.011238 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.011203 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5f9h\" (UniqueName: \"kubernetes.io/projected/08532c11-93ba-4a45-9756-726c663f9c88-kube-api-access-f5f9h\") pod \"cluster-proxy-proxy-agent-79566764d7-lsqhj\" (UID: \"08532c11-93ba-4a45-9756-726c663f9c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj" Apr 20 07:03:50.011271 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.011247 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/1638568c-490a-48b5-8755-5858b13e5c9a-klusterlet-config\") pod \"klusterlet-addon-workmgr-f496cd8b-5qwwt\" (UID: \"1638568c-490a-48b5-8755-5858b13e5c9a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f496cd8b-5qwwt" Apr 20 07:03:50.011303 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.011288 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/08532c11-93ba-4a45-9756-726c663f9c88-ca\") pod 
\"cluster-proxy-proxy-agent-79566764d7-lsqhj\" (UID: \"08532c11-93ba-4a45-9756-726c663f9c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj" Apr 20 07:03:50.102062 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.102033 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-f8bbf6698-rjf5f"] Apr 20 07:03:50.106059 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.106026 2543 patch_prober.go:28] interesting pod/image-registry-f8bbf6698-rjf5f container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 20 07:03:50.106206 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.106074 2543 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f" podUID="eb234c29-ad1a-4e18-b6ff-cb4363028abd" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 20 07:03:50.111565 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.111541 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/08532c11-93ba-4a45-9756-726c663f9c88-hub\") pod \"cluster-proxy-proxy-agent-79566764d7-lsqhj\" (UID: \"08532c11-93ba-4a45-9756-726c663f9c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj" Apr 20 07:03:50.111702 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.111574 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5f9h\" (UniqueName: \"kubernetes.io/projected/08532c11-93ba-4a45-9756-726c663f9c88-kube-api-access-f5f9h\") pod \"cluster-proxy-proxy-agent-79566764d7-lsqhj\" (UID: \"08532c11-93ba-4a45-9756-726c663f9c88\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj" Apr 20 07:03:50.111702 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.111596 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/1638568c-490a-48b5-8755-5858b13e5c9a-klusterlet-config\") pod \"klusterlet-addon-workmgr-f496cd8b-5qwwt\" (UID: \"1638568c-490a-48b5-8755-5858b13e5c9a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f496cd8b-5qwwt" Apr 20 07:03:50.111702 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.111613 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/08532c11-93ba-4a45-9756-726c663f9c88-ca\") pod \"cluster-proxy-proxy-agent-79566764d7-lsqhj\" (UID: \"08532c11-93ba-4a45-9756-726c663f9c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj" Apr 20 07:03:50.111702 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.111677 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1638568c-490a-48b5-8755-5858b13e5c9a-tmp\") pod \"klusterlet-addon-workmgr-f496cd8b-5qwwt\" (UID: \"1638568c-490a-48b5-8755-5858b13e5c9a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f496cd8b-5qwwt" Apr 20 07:03:50.111907 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.111705 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2thl4\" (UniqueName: \"kubernetes.io/projected/1638568c-490a-48b5-8755-5858b13e5c9a-kube-api-access-2thl4\") pod \"klusterlet-addon-workmgr-f496cd8b-5qwwt\" (UID: \"1638568c-490a-48b5-8755-5858b13e5c9a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f496cd8b-5qwwt" Apr 20 07:03:50.111907 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.111735 2543 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/08532c11-93ba-4a45-9756-726c663f9c88-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-79566764d7-lsqhj\" (UID: \"08532c11-93ba-4a45-9756-726c663f9c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj" Apr 20 07:03:50.111907 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.111777 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/08532c11-93ba-4a45-9756-726c663f9c88-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-79566764d7-lsqhj\" (UID: \"08532c11-93ba-4a45-9756-726c663f9c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj" Apr 20 07:03:50.111907 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.111811 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/08532c11-93ba-4a45-9756-726c663f9c88-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-79566764d7-lsqhj\" (UID: \"08532c11-93ba-4a45-9756-726c663f9c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj" Apr 20 07:03:50.112413 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.112393 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1638568c-490a-48b5-8755-5858b13e5c9a-tmp\") pod \"klusterlet-addon-workmgr-f496cd8b-5qwwt\" (UID: \"1638568c-490a-48b5-8755-5858b13e5c9a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f496cd8b-5qwwt" Apr 20 07:03:50.112691 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.112635 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/08532c11-93ba-4a45-9756-726c663f9c88-ocpservice-ca\") pod 
\"cluster-proxy-proxy-agent-79566764d7-lsqhj\" (UID: \"08532c11-93ba-4a45-9756-726c663f9c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj" Apr 20 07:03:50.114551 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.114527 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/08532c11-93ba-4a45-9756-726c663f9c88-hub\") pod \"cluster-proxy-proxy-agent-79566764d7-lsqhj\" (UID: \"08532c11-93ba-4a45-9756-726c663f9c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj" Apr 20 07:03:50.114712 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.114685 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/1638568c-490a-48b5-8755-5858b13e5c9a-klusterlet-config\") pod \"klusterlet-addon-workmgr-f496cd8b-5qwwt\" (UID: \"1638568c-490a-48b5-8755-5858b13e5c9a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f496cd8b-5qwwt" Apr 20 07:03:50.116950 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.116930 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-fd69r"] Apr 20 07:03:50.118373 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.118353 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/08532c11-93ba-4a45-9756-726c663f9c88-ca\") pod \"cluster-proxy-proxy-agent-79566764d7-lsqhj\" (UID: \"08532c11-93ba-4a45-9756-726c663f9c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj" Apr 20 07:03:50.118373 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.118363 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/08532c11-93ba-4a45-9756-726c663f9c88-service-proxy-server-cert\") pod 
\"cluster-proxy-proxy-agent-79566764d7-lsqhj\" (UID: \"08532c11-93ba-4a45-9756-726c663f9c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj" Apr 20 07:03:50.118589 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.118556 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/08532c11-93ba-4a45-9756-726c663f9c88-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-79566764d7-lsqhj\" (UID: \"08532c11-93ba-4a45-9756-726c663f9c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj" Apr 20 07:03:50.139514 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.139444 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fd69r" Apr 20 07:03:50.142842 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.142822 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 20 07:03:50.143001 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.142988 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-nfwjb\"" Apr 20 07:03:50.144822 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.144802 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 20 07:03:50.148829 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.148812 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 20 07:03:50.148978 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.148961 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 20 07:03:50.166302 ip-10-0-130-105 kubenswrapper[2543]: I0420 
07:03:50.166274 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2thl4\" (UniqueName: \"kubernetes.io/projected/1638568c-490a-48b5-8755-5858b13e5c9a-kube-api-access-2thl4\") pod \"klusterlet-addon-workmgr-f496cd8b-5qwwt\" (UID: \"1638568c-490a-48b5-8755-5858b13e5c9a\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f496cd8b-5qwwt" Apr 20 07:03:50.208876 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.208851 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5f9h\" (UniqueName: \"kubernetes.io/projected/08532c11-93ba-4a45-9756-726c663f9c88-kube-api-access-f5f9h\") pod \"cluster-proxy-proxy-agent-79566764d7-lsqhj\" (UID: \"08532c11-93ba-4a45-9756-726c663f9c88\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj" Apr 20 07:03:50.212913 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.212883 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/768e3c1e-465b-460a-a6b0-4f956e56e270-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fd69r\" (UID: \"768e3c1e-465b-460a-a6b0-4f956e56e270\") " pod="openshift-insights/insights-runtime-extractor-fd69r" Apr 20 07:03:50.213064 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.212969 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/768e3c1e-465b-460a-a6b0-4f956e56e270-crio-socket\") pod \"insights-runtime-extractor-fd69r\" (UID: \"768e3c1e-465b-460a-a6b0-4f956e56e270\") " pod="openshift-insights/insights-runtime-extractor-fd69r" Apr 20 07:03:50.213064 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.212998 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/768e3c1e-465b-460a-a6b0-4f956e56e270-data-volume\") pod \"insights-runtime-extractor-fd69r\" (UID: \"768e3c1e-465b-460a-a6b0-4f956e56e270\") " pod="openshift-insights/insights-runtime-extractor-fd69r" Apr 20 07:03:50.213064 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.213040 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tk2bf\" (UniqueName: \"kubernetes.io/projected/768e3c1e-465b-460a-a6b0-4f956e56e270-kube-api-access-tk2bf\") pod \"insights-runtime-extractor-fd69r\" (UID: \"768e3c1e-465b-460a-a6b0-4f956e56e270\") " pod="openshift-insights/insights-runtime-extractor-fd69r" Apr 20 07:03:50.213172 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.213092 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/768e3c1e-465b-460a-a6b0-4f956e56e270-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fd69r\" (UID: \"768e3c1e-465b-460a-a6b0-4f956e56e270\") " pod="openshift-insights/insights-runtime-extractor-fd69r" Apr 20 07:03:50.218795 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.218780 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f496cd8b-5qwwt" Apr 20 07:03:50.258811 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.258774 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj" Apr 20 07:03:50.316305 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.314882 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tk2bf\" (UniqueName: \"kubernetes.io/projected/768e3c1e-465b-460a-a6b0-4f956e56e270-kube-api-access-tk2bf\") pod \"insights-runtime-extractor-fd69r\" (UID: \"768e3c1e-465b-460a-a6b0-4f956e56e270\") " pod="openshift-insights/insights-runtime-extractor-fd69r" Apr 20 07:03:50.316305 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.314928 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/768e3c1e-465b-460a-a6b0-4f956e56e270-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fd69r\" (UID: \"768e3c1e-465b-460a-a6b0-4f956e56e270\") " pod="openshift-insights/insights-runtime-extractor-fd69r" Apr 20 07:03:50.316305 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.314961 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/768e3c1e-465b-460a-a6b0-4f956e56e270-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fd69r\" (UID: \"768e3c1e-465b-460a-a6b0-4f956e56e270\") " pod="openshift-insights/insights-runtime-extractor-fd69r" Apr 20 07:03:50.316305 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.315031 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/768e3c1e-465b-460a-a6b0-4f956e56e270-crio-socket\") pod \"insights-runtime-extractor-fd69r\" (UID: \"768e3c1e-465b-460a-a6b0-4f956e56e270\") " pod="openshift-insights/insights-runtime-extractor-fd69r" Apr 20 07:03:50.316305 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.315055 2543 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/768e3c1e-465b-460a-a6b0-4f956e56e270-data-volume\") pod \"insights-runtime-extractor-fd69r\" (UID: \"768e3c1e-465b-460a-a6b0-4f956e56e270\") " pod="openshift-insights/insights-runtime-extractor-fd69r" Apr 20 07:03:50.316305 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.315418 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/768e3c1e-465b-460a-a6b0-4f956e56e270-data-volume\") pod \"insights-runtime-extractor-fd69r\" (UID: \"768e3c1e-465b-460a-a6b0-4f956e56e270\") " pod="openshift-insights/insights-runtime-extractor-fd69r" Apr 20 07:03:50.316305 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.315741 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/768e3c1e-465b-460a-a6b0-4f956e56e270-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-fd69r\" (UID: \"768e3c1e-465b-460a-a6b0-4f956e56e270\") " pod="openshift-insights/insights-runtime-extractor-fd69r" Apr 20 07:03:50.316305 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.316064 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/768e3c1e-465b-460a-a6b0-4f956e56e270-crio-socket\") pod \"insights-runtime-extractor-fd69r\" (UID: \"768e3c1e-465b-460a-a6b0-4f956e56e270\") " pod="openshift-insights/insights-runtime-extractor-fd69r" Apr 20 07:03:50.319205 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.319177 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/768e3c1e-465b-460a-a6b0-4f956e56e270-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-fd69r\" (UID: \"768e3c1e-465b-460a-a6b0-4f956e56e270\") " pod="openshift-insights/insights-runtime-extractor-fd69r" Apr 20 07:03:50.323085 
ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.323055 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fd69r"] Apr 20 07:03:50.373603 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.373576 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tk2bf\" (UniqueName: \"kubernetes.io/projected/768e3c1e-465b-460a-a6b0-4f956e56e270-kube-api-access-tk2bf\") pod \"insights-runtime-extractor-fd69r\" (UID: \"768e3c1e-465b-460a-a6b0-4f956e56e270\") " pod="openshift-insights/insights-runtime-extractor-fd69r" Apr 20 07:03:50.441319 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:03:50.441292 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1638568c_490a_48b5_8755_5858b13e5c9a.slice/crio-f009b7551d304a27ca2b69eb45ce459e84e9cb8dc0b3207cd61f296b8bd73f4d WatchSource:0}: Error finding container f009b7551d304a27ca2b69eb45ce459e84e9cb8dc0b3207cd61f296b8bd73f4d: Status 404 returned error can't find the container with id f009b7551d304a27ca2b69eb45ce459e84e9cb8dc0b3207cd61f296b8bd73f4d Apr 20 07:03:50.448535 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.448509 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-fd69r" Apr 20 07:03:50.452382 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.452358 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f496cd8b-5qwwt"] Apr 20 07:03:50.524744 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:03:50.524711 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod08532c11_93ba_4a45_9756_726c663f9c88.slice/crio-f35130fc39a01483922a643c3b079640ce468c505dab728502d9d468fb1d4dd2 WatchSource:0}: Error finding container f35130fc39a01483922a643c3b079640ce468c505dab728502d9d468fb1d4dd2: Status 404 returned error can't find the container with id f35130fc39a01483922a643c3b079640ce468c505dab728502d9d468fb1d4dd2 Apr 20 07:03:50.549913 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.549884 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj"] Apr 20 07:03:50.667147 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.667111 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-86468fb89f-ptbl4"] Apr 20 07:03:50.688118 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.688094 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" Apr 20 07:03:50.717548 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.717525 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4346ef67-5625-4eac-bc03-69ca61915d3a-bound-sa-token\") pod \"image-registry-86468fb89f-ptbl4\" (UID: \"4346ef67-5625-4eac-bc03-69ca61915d3a\") " pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" Apr 20 07:03:50.717672 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.717560 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4346ef67-5625-4eac-bc03-69ca61915d3a-trusted-ca\") pod \"image-registry-86468fb89f-ptbl4\" (UID: \"4346ef67-5625-4eac-bc03-69ca61915d3a\") " pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" Apr 20 07:03:50.717672 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.717585 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4346ef67-5625-4eac-bc03-69ca61915d3a-image-registry-private-configuration\") pod \"image-registry-86468fb89f-ptbl4\" (UID: \"4346ef67-5625-4eac-bc03-69ca61915d3a\") " pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" Apr 20 07:03:50.717753 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.717694 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4346ef67-5625-4eac-bc03-69ca61915d3a-registry-certificates\") pod \"image-registry-86468fb89f-ptbl4\" (UID: \"4346ef67-5625-4eac-bc03-69ca61915d3a\") " pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" Apr 20 07:03:50.717753 ip-10-0-130-105 kubenswrapper[2543]: I0420 
07:03:50.717724 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcbq8\" (UniqueName: \"kubernetes.io/projected/4346ef67-5625-4eac-bc03-69ca61915d3a-kube-api-access-lcbq8\") pod \"image-registry-86468fb89f-ptbl4\" (UID: \"4346ef67-5625-4eac-bc03-69ca61915d3a\") " pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" Apr 20 07:03:50.717814 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.717756 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4346ef67-5625-4eac-bc03-69ca61915d3a-installation-pull-secrets\") pod \"image-registry-86468fb89f-ptbl4\" (UID: \"4346ef67-5625-4eac-bc03-69ca61915d3a\") " pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" Apr 20 07:03:50.717814 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.717781 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4346ef67-5625-4eac-bc03-69ca61915d3a-ca-trust-extracted\") pod \"image-registry-86468fb89f-ptbl4\" (UID: \"4346ef67-5625-4eac-bc03-69ca61915d3a\") " pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" Apr 20 07:03:50.717814 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.717800 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4346ef67-5625-4eac-bc03-69ca61915d3a-registry-tls\") pod \"image-registry-86468fb89f-ptbl4\" (UID: \"4346ef67-5625-4eac-bc03-69ca61915d3a\") " pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" Apr 20 07:03:50.800166 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.800136 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-86468fb89f-ptbl4"] Apr 20 07:03:50.818738 
ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.818704 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lcbq8\" (UniqueName: \"kubernetes.io/projected/4346ef67-5625-4eac-bc03-69ca61915d3a-kube-api-access-lcbq8\") pod \"image-registry-86468fb89f-ptbl4\" (UID: \"4346ef67-5625-4eac-bc03-69ca61915d3a\") " pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" Apr 20 07:03:50.818869 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.818748 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4346ef67-5625-4eac-bc03-69ca61915d3a-installation-pull-secrets\") pod \"image-registry-86468fb89f-ptbl4\" (UID: \"4346ef67-5625-4eac-bc03-69ca61915d3a\") " pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" Apr 20 07:03:50.818869 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.818775 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4346ef67-5625-4eac-bc03-69ca61915d3a-ca-trust-extracted\") pod \"image-registry-86468fb89f-ptbl4\" (UID: \"4346ef67-5625-4eac-bc03-69ca61915d3a\") " pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" Apr 20 07:03:50.818869 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.818802 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4346ef67-5625-4eac-bc03-69ca61915d3a-registry-tls\") pod \"image-registry-86468fb89f-ptbl4\" (UID: \"4346ef67-5625-4eac-bc03-69ca61915d3a\") " pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" Apr 20 07:03:50.819029 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.818899 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/4346ef67-5625-4eac-bc03-69ca61915d3a-bound-sa-token\") pod \"image-registry-86468fb89f-ptbl4\" (UID: \"4346ef67-5625-4eac-bc03-69ca61915d3a\") " pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" Apr 20 07:03:50.819029 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.818932 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4346ef67-5625-4eac-bc03-69ca61915d3a-trusted-ca\") pod \"image-registry-86468fb89f-ptbl4\" (UID: \"4346ef67-5625-4eac-bc03-69ca61915d3a\") " pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" Apr 20 07:03:50.819029 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.818968 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4346ef67-5625-4eac-bc03-69ca61915d3a-image-registry-private-configuration\") pod \"image-registry-86468fb89f-ptbl4\" (UID: \"4346ef67-5625-4eac-bc03-69ca61915d3a\") " pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" Apr 20 07:03:50.819232 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.819206 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4346ef67-5625-4eac-bc03-69ca61915d3a-registry-certificates\") pod \"image-registry-86468fb89f-ptbl4\" (UID: \"4346ef67-5625-4eac-bc03-69ca61915d3a\") " pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" Apr 20 07:03:50.819315 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.819237 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4346ef67-5625-4eac-bc03-69ca61915d3a-ca-trust-extracted\") pod \"image-registry-86468fb89f-ptbl4\" (UID: \"4346ef67-5625-4eac-bc03-69ca61915d3a\") " pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" 
Apr 20 07:03:50.819879 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.819859 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4346ef67-5625-4eac-bc03-69ca61915d3a-trusted-ca\") pod \"image-registry-86468fb89f-ptbl4\" (UID: \"4346ef67-5625-4eac-bc03-69ca61915d3a\") " pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" Apr 20 07:03:50.819997 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.819943 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4346ef67-5625-4eac-bc03-69ca61915d3a-registry-certificates\") pod \"image-registry-86468fb89f-ptbl4\" (UID: \"4346ef67-5625-4eac-bc03-69ca61915d3a\") " pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" Apr 20 07:03:50.821256 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.821239 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4346ef67-5625-4eac-bc03-69ca61915d3a-image-registry-private-configuration\") pod \"image-registry-86468fb89f-ptbl4\" (UID: \"4346ef67-5625-4eac-bc03-69ca61915d3a\") " pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" Apr 20 07:03:50.821541 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.821523 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4346ef67-5625-4eac-bc03-69ca61915d3a-installation-pull-secrets\") pod \"image-registry-86468fb89f-ptbl4\" (UID: \"4346ef67-5625-4eac-bc03-69ca61915d3a\") " pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" Apr 20 07:03:50.821743 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.821717 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/4346ef67-5625-4eac-bc03-69ca61915d3a-registry-tls\") pod \"image-registry-86468fb89f-ptbl4\" (UID: \"4346ef67-5625-4eac-bc03-69ca61915d3a\") " pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" Apr 20 07:03:50.850433 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.850405 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4346ef67-5625-4eac-bc03-69ca61915d3a-bound-sa-token\") pod \"image-registry-86468fb89f-ptbl4\" (UID: \"4346ef67-5625-4eac-bc03-69ca61915d3a\") " pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" Apr 20 07:03:50.853617 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.853595 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcbq8\" (UniqueName: \"kubernetes.io/projected/4346ef67-5625-4eac-bc03-69ca61915d3a-kube-api-access-lcbq8\") pod \"image-registry-86468fb89f-ptbl4\" (UID: \"4346ef67-5625-4eac-bc03-69ca61915d3a\") " pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" Apr 20 07:03:50.934700 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.934672 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj" event={"ID":"08532c11-93ba-4a45-9756-726c663f9c88","Type":"ContainerStarted","Data":"f35130fc39a01483922a643c3b079640ce468c505dab728502d9d468fb1d4dd2"} Apr 20 07:03:50.935587 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.935565 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f496cd8b-5qwwt" event={"ID":"1638568c-490a-48b5-8755-5858b13e5c9a","Type":"ContainerStarted","Data":"f009b7551d304a27ca2b69eb45ce459e84e9cb8dc0b3207cd61f296b8bd73f4d"} Apr 20 07:03:50.996712 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:50.996653 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" Apr 20 07:03:51.020153 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:51.020120 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-fd69r"] Apr 20 07:03:51.029991 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:03:51.029958 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod768e3c1e_465b_460a_a6b0_4f956e56e270.slice/crio-81034b5721174ff27a0a884b980852703b53577dd6a8a4819ed60d1be4899206 WatchSource:0}: Error finding container 81034b5721174ff27a0a884b980852703b53577dd6a8a4819ed60d1be4899206: Status 404 returned error can't find the container with id 81034b5721174ff27a0a884b980852703b53577dd6a8a4819ed60d1be4899206 Apr 20 07:03:51.210852 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:51.210802 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-86468fb89f-ptbl4"] Apr 20 07:03:51.230399 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:03:51.230363 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4346ef67_5625_4eac_bc03_69ca61915d3a.slice/crio-31a6797d5955e562cc4d115b0fd91346d996f8f45ec1c129e8bc2f303e761131 WatchSource:0}: Error finding container 31a6797d5955e562cc4d115b0fd91346d996f8f45ec1c129e8bc2f303e761131: Status 404 returned error can't find the container with id 31a6797d5955e562cc4d115b0fd91346d996f8f45ec1c129e8bc2f303e761131 Apr 20 07:03:51.438047 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:51.438012 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ffbcc28-a10c-467a-a9e7-e31e20e4975e-metrics-certs\") pod \"network-metrics-daemon-wgd5x\" (UID: \"4ffbcc28-a10c-467a-a9e7-e31e20e4975e\") " 
pod="openshift-multus/network-metrics-daemon-wgd5x" Apr 20 07:03:51.442594 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:51.442539 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/4ffbcc28-a10c-467a-a9e7-e31e20e4975e-metrics-certs\") pod \"network-metrics-daemon-wgd5x\" (UID: \"4ffbcc28-a10c-467a-a9e7-e31e20e4975e\") " pod="openshift-multus/network-metrics-daemon-wgd5x" Apr 20 07:03:51.715165 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:51.714903 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-mf5sv\"" Apr 20 07:03:51.719463 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:51.719077 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wgd5x" Apr 20 07:03:51.948182 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:51.944295 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wgd5x"] Apr 20 07:03:51.967046 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:51.966173 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" event={"ID":"4346ef67-5625-4eac-bc03-69ca61915d3a","Type":"ContainerStarted","Data":"ff0ff34f581aa94aed88199016a93c56f1b8c289e7b2e74fc1b791312669b13c"} Apr 20 07:03:51.967046 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:51.966214 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" event={"ID":"4346ef67-5625-4eac-bc03-69ca61915d3a","Type":"ContainerStarted","Data":"31a6797d5955e562cc4d115b0fd91346d996f8f45ec1c129e8bc2f303e761131"} Apr 20 07:03:51.967046 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:51.966968 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" Apr 20 07:03:51.970502 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:51.970439 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fd69r" event={"ID":"768e3c1e-465b-460a-a6b0-4f956e56e270","Type":"ContainerStarted","Data":"134b92ebd402db572b5d76de69bfeaadcdc83699dcb5a5f3cf6f303553411887"} Apr 20 07:03:51.970502 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:51.970469 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fd69r" event={"ID":"768e3c1e-465b-460a-a6b0-4f956e56e270","Type":"ContainerStarted","Data":"81034b5721174ff27a0a884b980852703b53577dd6a8a4819ed60d1be4899206"} Apr 20 07:03:52.021683 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:52.021174 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" podStartSLOduration=2.021153376 podStartE2EDuration="2.021153376s" podCreationTimestamp="2026-04-20 07:03:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:03:52.020159961 +0000 UTC m=+66.840228301" watchObservedRunningTime="2026-04-20 07:03:52.021153376 +0000 UTC m=+66.841221714" Apr 20 07:03:52.974376 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:52.974338 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wgd5x" event={"ID":"4ffbcc28-a10c-467a-a9e7-e31e20e4975e","Type":"ContainerStarted","Data":"06f7912bf26cab7cb1438c21c6e5ddf67210892820f77b3241b59b2cc48aaf4b"} Apr 20 07:03:52.976356 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:52.976329 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fd69r" 
event={"ID":"768e3c1e-465b-460a-a6b0-4f956e56e270","Type":"ContainerStarted","Data":"16a8b06f4a83c6d90d8561c5008671a36e9522b34d7634be5761ebeeb7741d8c"} Apr 20 07:03:54.852235 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:54.852191 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-r9xht" Apr 20 07:03:55.987831 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:55.987785 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj" event={"ID":"08532c11-93ba-4a45-9756-726c663f9c88","Type":"ContainerStarted","Data":"f3395395194e7b251b6c1019e269ebc9edb9d709e136366b3cd3709b62cf49fc"} Apr 20 07:03:55.989352 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:55.989321 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f496cd8b-5qwwt" event={"ID":"1638568c-490a-48b5-8755-5858b13e5c9a","Type":"ContainerStarted","Data":"bba7395b83576ef9309bb7c871467599fbea77dd84216c3ab81ad7f814b20742"} Apr 20 07:03:55.989569 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:55.989544 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f496cd8b-5qwwt" Apr 20 07:03:55.991161 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:55.991133 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wgd5x" event={"ID":"4ffbcc28-a10c-467a-a9e7-e31e20e4975e","Type":"ContainerStarted","Data":"042f29185a196f4ce8f2ae16d051027c60215819e5540bb249d4c93b33d876e9"} Apr 20 07:03:55.991273 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:55.991181 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wgd5x" 
event={"ID":"4ffbcc28-a10c-467a-a9e7-e31e20e4975e","Type":"ContainerStarted","Data":"57b9f078cffd50f3ab85549abcdd898f0f00357fd057bcf479b74ccabcb11364"} Apr 20 07:03:55.991684 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:55.991654 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f496cd8b-5qwwt" Apr 20 07:03:56.040575 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:56.040521 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f496cd8b-5qwwt" podStartSLOduration=2.190601932 podStartE2EDuration="7.040501461s" podCreationTimestamp="2026-04-20 07:03:49 +0000 UTC" firstStartedPulling="2026-04-20 07:03:50.443053759 +0000 UTC m=+65.263122078" lastFinishedPulling="2026-04-20 07:03:55.292953286 +0000 UTC m=+70.113021607" observedRunningTime="2026-04-20 07:03:56.038417512 +0000 UTC m=+70.858485864" watchObservedRunningTime="2026-04-20 07:03:56.040501461 +0000 UTC m=+70.860569799" Apr 20 07:03:56.154688 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:56.154612 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wgd5x" podStartSLOduration=67.810979592 podStartE2EDuration="1m11.15459149s" podCreationTimestamp="2026-04-20 07:02:45 +0000 UTC" firstStartedPulling="2026-04-20 07:03:51.9523518 +0000 UTC m=+66.772420119" lastFinishedPulling="2026-04-20 07:03:55.295963701 +0000 UTC m=+70.116032017" observedRunningTime="2026-04-20 07:03:56.102783049 +0000 UTC m=+70.922851383" watchObservedRunningTime="2026-04-20 07:03:56.15459149 +0000 UTC m=+70.974659828" Apr 20 07:03:56.996128 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:56.996092 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-fd69r" 
event={"ID":"768e3c1e-465b-460a-a6b0-4f956e56e270","Type":"ContainerStarted","Data":"749ff152e6c1832f102a2b7bfe73e1bc9dd9eabc74fb6a900dcbc5e6af567a2a"} Apr 20 07:03:57.047140 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:57.047076 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-fd69r" podStartSLOduration=2.034923748 podStartE2EDuration="7.047056003s" podCreationTimestamp="2026-04-20 07:03:50 +0000 UTC" firstStartedPulling="2026-04-20 07:03:51.145368579 +0000 UTC m=+65.965436908" lastFinishedPulling="2026-04-20 07:03:56.157500832 +0000 UTC m=+70.977569163" observedRunningTime="2026-04-20 07:03:57.04561207 +0000 UTC m=+71.865680409" watchObservedRunningTime="2026-04-20 07:03:57.047056003 +0000 UTC m=+71.867124341" Apr 20 07:03:59.003282 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:59.003248 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj" event={"ID":"08532c11-93ba-4a45-9756-726c663f9c88","Type":"ContainerStarted","Data":"5cd948c625401afdbb78f5b8537018eaaa99681375533575cdb411680c1f7f37"} Apr 20 07:03:59.003282 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:59.003288 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj" event={"ID":"08532c11-93ba-4a45-9756-726c663f9c88","Type":"ContainerStarted","Data":"af3ae154b2d445aae0d7e0f52a3b332866048f09e3a422352d6ab6673927ff55"} Apr 20 07:03:59.091426 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:03:59.091372 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79566764d7-lsqhj" podStartSLOduration=2.700552847 podStartE2EDuration="10.091356876s" podCreationTimestamp="2026-04-20 07:03:49 +0000 UTC" firstStartedPulling="2026-04-20 07:03:50.527366594 +0000 UTC m=+65.347434927" 
lastFinishedPulling="2026-04-20 07:03:57.918170627 +0000 UTC m=+72.738238956" observedRunningTime="2026-04-20 07:03:59.084582285 +0000 UTC m=+73.904650622" watchObservedRunningTime="2026-04-20 07:03:59.091356876 +0000 UTC m=+73.911425358" Apr 20 07:04:00.106263 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:00.106231 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f" Apr 20 07:04:02.735946 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.735912 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-2qnqf"] Apr 20 07:04:02.739667 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.739626 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:02.753150 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.753129 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 20 07:04:02.753271 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.753129 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 20 07:04:02.753743 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.753724 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 20 07:04:02.753743 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.753739 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 20 07:04:02.753884 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.753741 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 20 07:04:02.757123 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:04:02.757104 
2543 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"node-exporter-dockercfg-f8v22\" is forbidden: User \"system:node:ip-10-0-130-105.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'ip-10-0-130-105.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-f8v22\"" type="*v1.Secret" Apr 20 07:04:02.757179 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:04:02.757161 2543 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"node-exporter-accelerators-collector-config\" is forbidden: User \"system:node:ip-10-0-130-105.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'ip-10-0-130-105.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" type="*v1.ConfigMap" Apr 20 07:04:02.826548 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.826514 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b1c3d808-af41-473f-beb5-7fb97e343128-root\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:02.826765 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.826565 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b1c3d808-af41-473f-beb5-7fb97e343128-node-exporter-wtmp\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:02.826765 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.826587 2543 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b1c3d808-af41-473f-beb5-7fb97e343128-node-exporter-accelerators-collector-config\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:02.826765 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.826678 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b1c3d808-af41-473f-beb5-7fb97e343128-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:02.826765 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.826714 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b1c3d808-af41-473f-beb5-7fb97e343128-node-exporter-tls\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:02.826765 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.826742 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gg8f7\" (UniqueName: \"kubernetes.io/projected/b1c3d808-af41-473f-beb5-7fb97e343128-kube-api-access-gg8f7\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:02.826956 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.826768 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/b1c3d808-af41-473f-beb5-7fb97e343128-metrics-client-ca\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:02.826956 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.826802 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b1c3d808-af41-473f-beb5-7fb97e343128-node-exporter-textfile\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:02.826956 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.826828 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b1c3d808-af41-473f-beb5-7fb97e343128-sys\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:02.928123 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.928082 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b1c3d808-af41-473f-beb5-7fb97e343128-node-exporter-textfile\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:02.928123 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.928128 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b1c3d808-af41-473f-beb5-7fb97e343128-sys\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:02.928366 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.928156 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"root\" (UniqueName: \"kubernetes.io/host-path/b1c3d808-af41-473f-beb5-7fb97e343128-root\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:02.928366 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.928206 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b1c3d808-af41-473f-beb5-7fb97e343128-node-exporter-wtmp\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:02.928366 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.928232 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b1c3d808-af41-473f-beb5-7fb97e343128-node-exporter-accelerators-collector-config\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:02.928366 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.928237 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/b1c3d808-af41-473f-beb5-7fb97e343128-root\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:02.928366 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.928204 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b1c3d808-af41-473f-beb5-7fb97e343128-sys\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:02.928366 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.928266 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b1c3d808-af41-473f-beb5-7fb97e343128-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:02.928366 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.928324 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b1c3d808-af41-473f-beb5-7fb97e343128-node-exporter-tls\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:02.928366 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.928351 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/b1c3d808-af41-473f-beb5-7fb97e343128-node-exporter-wtmp\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:02.928366 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.928363 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gg8f7\" (UniqueName: \"kubernetes.io/projected/b1c3d808-af41-473f-beb5-7fb97e343128-kube-api-access-gg8f7\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:02.928745 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.928409 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1c3d808-af41-473f-beb5-7fb97e343128-metrics-client-ca\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:02.928745 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.928478 2543 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/b1c3d808-af41-473f-beb5-7fb97e343128-node-exporter-textfile\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:02.929410 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.929390 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b1c3d808-af41-473f-beb5-7fb97e343128-metrics-client-ca\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:02.930548 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.930530 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/b1c3d808-af41-473f-beb5-7fb97e343128-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:02.930682 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.930665 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/b1c3d808-af41-473f-beb5-7fb97e343128-node-exporter-tls\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:02.987632 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:02.987556 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gg8f7\" (UniqueName: \"kubernetes.io/projected/b1c3d808-af41-473f-beb5-7fb97e343128-kube-api-access-gg8f7\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:03.593923 ip-10-0-130-105 
kubenswrapper[2543]: I0420 07:04:03.593891 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-f8v22\"" Apr 20 07:04:03.928781 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:04:03.928691 2543 configmap.go:193] Couldn't get configMap openshift-monitoring/node-exporter-accelerators-collector-config: failed to sync configmap cache: timed out waiting for the condition Apr 20 07:04:03.929136 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:04:03.928793 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b1c3d808-af41-473f-beb5-7fb97e343128-node-exporter-accelerators-collector-config podName:b1c3d808-af41-473f-beb5-7fb97e343128 nodeName:}" failed. No retries permitted until 2026-04-20 07:04:04.428775403 +0000 UTC m=+79.248843731 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-accelerators-collector-config" (UniqueName: "kubernetes.io/configmap/b1c3d808-af41-473f-beb5-7fb97e343128-node-exporter-accelerators-collector-config") pod "node-exporter-2qnqf" (UID: "b1c3d808-af41-473f-beb5-7fb97e343128") : failed to sync configmap cache: timed out waiting for the condition Apr 20 07:04:04.207005 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:04.206975 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 20 07:04:04.439513 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:04.439469 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b1c3d808-af41-473f-beb5-7fb97e343128-node-exporter-accelerators-collector-config\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:04.440096 ip-10-0-130-105 kubenswrapper[2543]: I0420 
07:04:04.440076 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/b1c3d808-af41-473f-beb5-7fb97e343128-node-exporter-accelerators-collector-config\") pod \"node-exporter-2qnqf\" (UID: \"b1c3d808-af41-473f-beb5-7fb97e343128\") " pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:04.548418 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:04.548347 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-2qnqf" Apr 20 07:04:04.556873 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:04:04.556841 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1c3d808_af41_473f_beb5_7fb97e343128.slice/crio-222025c090ad9672961a1cbda977de1fc3b1d46a79662d631288337c8d4c4712 WatchSource:0}: Error finding container 222025c090ad9672961a1cbda977de1fc3b1d46a79662d631288337c8d4c4712: Status 404 returned error can't find the container with id 222025c090ad9672961a1cbda977de1fc3b1d46a79662d631288337c8d4c4712 Apr 20 07:04:05.022881 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:05.022835 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2qnqf" event={"ID":"b1c3d808-af41-473f-beb5-7fb97e343128","Type":"ContainerStarted","Data":"222025c090ad9672961a1cbda977de1fc3b1d46a79662d631288337c8d4c4712"} Apr 20 07:04:06.026893 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:06.026817 2543 generic.go:358] "Generic (PLEG): container finished" podID="b1c3d808-af41-473f-beb5-7fb97e343128" containerID="3e2975ea1ff47d2a5ddd791e430bd3531496e284ab9d220b8fdd7259214a1d0c" exitCode=0 Apr 20 07:04:06.026893 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:06.026870 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2qnqf" 
event={"ID":"b1c3d808-af41-473f-beb5-7fb97e343128","Type":"ContainerDied","Data":"3e2975ea1ff47d2a5ddd791e430bd3531496e284ab9d220b8fdd7259214a1d0c"} Apr 20 07:04:07.031559 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:07.031522 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2qnqf" event={"ID":"b1c3d808-af41-473f-beb5-7fb97e343128","Type":"ContainerStarted","Data":"2c7571c0d3d7fedd61db898326432afebc697ea6b08c0e8be3811e638591afc3"} Apr 20 07:04:07.031559 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:07.031562 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-2qnqf" event={"ID":"b1c3d808-af41-473f-beb5-7fb97e343128","Type":"ContainerStarted","Data":"f43449bb16519f8a11e51c351286c72f5755602ecf6d71d988b73fef67df01c2"} Apr 20 07:04:07.066676 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:07.066610 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-2qnqf" podStartSLOduration=3.927259671 podStartE2EDuration="5.066595675s" podCreationTimestamp="2026-04-20 07:04:02 +0000 UTC" firstStartedPulling="2026-04-20 07:04:04.558607226 +0000 UTC m=+79.378675542" lastFinishedPulling="2026-04-20 07:04:05.697943231 +0000 UTC m=+80.518011546" observedRunningTime="2026-04-20 07:04:07.06656319 +0000 UTC m=+81.886631537" watchObservedRunningTime="2026-04-20 07:04:07.066595675 +0000 UTC m=+81.886663994" Apr 20 07:04:13.983576 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:13.983549 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-86468fb89f-ptbl4" Apr 20 07:04:15.122619 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:15.122547 2543 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f" podUID="eb234c29-ad1a-4e18-b6ff-cb4363028abd" containerName="registry" 
containerID="cri-o://af52bbf061e4d107761cedac3e564cfdd282ed32d4a2eab1010c7831a2ec52c4" gracePeriod=30 Apr 20 07:04:15.353671 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:15.353630 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f" Apr 20 07:04:15.423312 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:15.423245 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eb234c29-ad1a-4e18-b6ff-cb4363028abd-ca-trust-extracted\") pod \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " Apr 20 07:04:15.423312 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:15.423290 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/eb234c29-ad1a-4e18-b6ff-cb4363028abd-image-registry-private-configuration\") pod \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " Apr 20 07:04:15.423312 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:15.423312 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eb234c29-ad1a-4e18-b6ff-cb4363028abd-installation-pull-secrets\") pod \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " Apr 20 07:04:15.423540 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:15.423338 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-bound-sa-token\") pod \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " Apr 20 07:04:15.423540 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:15.423360 2543 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eb234c29-ad1a-4e18-b6ff-cb4363028abd-registry-certificates\") pod \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " Apr 20 07:04:15.423540 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:15.423380 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddfjv\" (UniqueName: \"kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-kube-api-access-ddfjv\") pod \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " Apr 20 07:04:15.423540 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:15.423395 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-registry-tls\") pod \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " Apr 20 07:04:15.423540 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:15.423442 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eb234c29-ad1a-4e18-b6ff-cb4363028abd-trusted-ca\") pod \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\" (UID: \"eb234c29-ad1a-4e18-b6ff-cb4363028abd\") " Apr 20 07:04:15.423975 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:15.423950 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb234c29-ad1a-4e18-b6ff-cb4363028abd-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "eb234c29-ad1a-4e18-b6ff-cb4363028abd" (UID: "eb234c29-ad1a-4e18-b6ff-cb4363028abd"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:04:15.424081 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:15.423947 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb234c29-ad1a-4e18-b6ff-cb4363028abd-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "eb234c29-ad1a-4e18-b6ff-cb4363028abd" (UID: "eb234c29-ad1a-4e18-b6ff-cb4363028abd"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:04:15.426014 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:15.425985 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "eb234c29-ad1a-4e18-b6ff-cb4363028abd" (UID: "eb234c29-ad1a-4e18-b6ff-cb4363028abd"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:04:15.426116 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:15.426064 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb234c29-ad1a-4e18-b6ff-cb4363028abd-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "eb234c29-ad1a-4e18-b6ff-cb4363028abd" (UID: "eb234c29-ad1a-4e18-b6ff-cb4363028abd"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:04:15.426175 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:15.426134 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "eb234c29-ad1a-4e18-b6ff-cb4363028abd" (UID: "eb234c29-ad1a-4e18-b6ff-cb4363028abd"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:04:15.426175 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:15.426158 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb234c29-ad1a-4e18-b6ff-cb4363028abd-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "eb234c29-ad1a-4e18-b6ff-cb4363028abd" (UID: "eb234c29-ad1a-4e18-b6ff-cb4363028abd"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:04:15.426281 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:15.426232 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-kube-api-access-ddfjv" (OuterVolumeSpecName: "kube-api-access-ddfjv") pod "eb234c29-ad1a-4e18-b6ff-cb4363028abd" (UID: "eb234c29-ad1a-4e18-b6ff-cb4363028abd"). InnerVolumeSpecName "kube-api-access-ddfjv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:04:15.431752 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:15.431718 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eb234c29-ad1a-4e18-b6ff-cb4363028abd-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "eb234c29-ad1a-4e18-b6ff-cb4363028abd" (UID: "eb234c29-ad1a-4e18-b6ff-cb4363028abd"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 20 07:04:15.524821 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:15.524784 2543 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eb234c29-ad1a-4e18-b6ff-cb4363028abd-ca-trust-extracted\") on node \"ip-10-0-130-105.ec2.internal\" DevicePath \"\""
Apr 20 07:04:15.524821 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:15.524819 2543 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/eb234c29-ad1a-4e18-b6ff-cb4363028abd-image-registry-private-configuration\") on node \"ip-10-0-130-105.ec2.internal\" DevicePath \"\""
Apr 20 07:04:15.524821 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:15.524830 2543 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eb234c29-ad1a-4e18-b6ff-cb4363028abd-installation-pull-secrets\") on node \"ip-10-0-130-105.ec2.internal\" DevicePath \"\""
Apr 20 07:04:15.525004 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:15.524840 2543 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-bound-sa-token\") on node \"ip-10-0-130-105.ec2.internal\" DevicePath \"\""
Apr 20 07:04:15.525004 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:15.524850 2543 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eb234c29-ad1a-4e18-b6ff-cb4363028abd-registry-certificates\") on node \"ip-10-0-130-105.ec2.internal\" DevicePath \"\""
Apr 20 07:04:15.525004 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:15.524859 2543 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ddfjv\" (UniqueName: \"kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-kube-api-access-ddfjv\") on node \"ip-10-0-130-105.ec2.internal\" DevicePath \"\""
Apr 20 07:04:15.525004 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:15.524867 2543 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eb234c29-ad1a-4e18-b6ff-cb4363028abd-registry-tls\") on node \"ip-10-0-130-105.ec2.internal\" DevicePath \"\""
Apr 20 07:04:15.525004 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:15.524875 2543 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eb234c29-ad1a-4e18-b6ff-cb4363028abd-trusted-ca\") on node \"ip-10-0-130-105.ec2.internal\" DevicePath \"\""
Apr 20 07:04:16.058685 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:16.058579 2543 generic.go:358] "Generic (PLEG): container finished" podID="eb234c29-ad1a-4e18-b6ff-cb4363028abd" containerID="af52bbf061e4d107761cedac3e564cfdd282ed32d4a2eab1010c7831a2ec52c4" exitCode=0
Apr 20 07:04:16.058685 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:16.058619 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f" event={"ID":"eb234c29-ad1a-4e18-b6ff-cb4363028abd","Type":"ContainerDied","Data":"af52bbf061e4d107761cedac3e564cfdd282ed32d4a2eab1010c7831a2ec52c4"}
Apr 20 07:04:16.058685 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:16.058661 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f" event={"ID":"eb234c29-ad1a-4e18-b6ff-cb4363028abd","Type":"ContainerDied","Data":"693d2d6b30aa5bb78e9033a3464e6bfa101365c1ad1943f79ea60f62a48c4376"}
Apr 20 07:04:16.058685 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:16.058672 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-f8bbf6698-rjf5f"
Apr 20 07:04:16.058995 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:16.058676 2543 scope.go:117] "RemoveContainer" containerID="af52bbf061e4d107761cedac3e564cfdd282ed32d4a2eab1010c7831a2ec52c4"
Apr 20 07:04:16.066203 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:16.066185 2543 scope.go:117] "RemoveContainer" containerID="af52bbf061e4d107761cedac3e564cfdd282ed32d4a2eab1010c7831a2ec52c4"
Apr 20 07:04:16.066470 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:04:16.066448 2543 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af52bbf061e4d107761cedac3e564cfdd282ed32d4a2eab1010c7831a2ec52c4\": container with ID starting with af52bbf061e4d107761cedac3e564cfdd282ed32d4a2eab1010c7831a2ec52c4 not found: ID does not exist" containerID="af52bbf061e4d107761cedac3e564cfdd282ed32d4a2eab1010c7831a2ec52c4"
Apr 20 07:04:16.066526 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:16.066478 2543 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af52bbf061e4d107761cedac3e564cfdd282ed32d4a2eab1010c7831a2ec52c4"} err="failed to get container status \"af52bbf061e4d107761cedac3e564cfdd282ed32d4a2eab1010c7831a2ec52c4\": rpc error: code = NotFound desc = could not find container \"af52bbf061e4d107761cedac3e564cfdd282ed32d4a2eab1010c7831a2ec52c4\": container with ID starting with af52bbf061e4d107761cedac3e564cfdd282ed32d4a2eab1010c7831a2ec52c4 not found: ID does not exist"
Apr 20 07:04:16.117374 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:16.117342 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-f8bbf6698-rjf5f"]
Apr 20 07:04:16.122927 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:16.122899 2543 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-f8bbf6698-rjf5f"]
Apr 20 07:04:17.700585 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:04:17.700551 2543 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb234c29-ad1a-4e18-b6ff-cb4363028abd" path="/var/lib/kubelet/pods/eb234c29-ad1a-4e18-b6ff-cb4363028abd/volumes"
Apr 20 07:07:45.648689 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:07:45.648657 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/ovn-acl-logging/0.log"
Apr 20 07:07:45.649205 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:07:45.648796 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/ovn-acl-logging/0.log"
Apr 20 07:07:45.652071 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:07:45.652049 2543 kubelet.go:1628] "Image garbage collection succeeded"
Apr 20 07:12:25.983974 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:25.983939 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-c56jg"]
Apr 20 07:12:25.984422 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:25.984168 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eb234c29-ad1a-4e18-b6ff-cb4363028abd" containerName="registry"
Apr 20 07:12:25.984422 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:25.984178 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb234c29-ad1a-4e18-b6ff-cb4363028abd" containerName="registry"
Apr 20 07:12:25.984422 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:25.984225 2543 memory_manager.go:356] "RemoveStaleState removing state" podUID="eb234c29-ad1a-4e18-b6ff-cb4363028abd" containerName="registry"
Apr 20 07:12:25.986793 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:25.986776 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-c56jg"
Apr 20 07:12:25.990352 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:25.990328 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 20 07:12:25.990480 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:25.990354 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 20 07:12:25.991221 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:25.991206 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-nrhfx\""
Apr 20 07:12:25.997201 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:25.997181 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-c56jg"]
Apr 20 07:12:26.085391 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:26.085360 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f4747ea-ce0d-4dca-b325-2f451d77497c-bound-sa-token\") pod \"cert-manager-759f64656b-c56jg\" (UID: \"2f4747ea-ce0d-4dca-b325-2f451d77497c\") " pod="cert-manager/cert-manager-759f64656b-c56jg"
Apr 20 07:12:26.085535 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:26.085397 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbcxp\" (UniqueName: \"kubernetes.io/projected/2f4747ea-ce0d-4dca-b325-2f451d77497c-kube-api-access-fbcxp\") pod \"cert-manager-759f64656b-c56jg\" (UID: \"2f4747ea-ce0d-4dca-b325-2f451d77497c\") " pod="cert-manager/cert-manager-759f64656b-c56jg"
Apr 20 07:12:26.186787 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:26.186748 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f4747ea-ce0d-4dca-b325-2f451d77497c-bound-sa-token\") pod \"cert-manager-759f64656b-c56jg\" (UID: \"2f4747ea-ce0d-4dca-b325-2f451d77497c\") " pod="cert-manager/cert-manager-759f64656b-c56jg"
Apr 20 07:12:26.186787 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:26.186793 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fbcxp\" (UniqueName: \"kubernetes.io/projected/2f4747ea-ce0d-4dca-b325-2f451d77497c-kube-api-access-fbcxp\") pod \"cert-manager-759f64656b-c56jg\" (UID: \"2f4747ea-ce0d-4dca-b325-2f451d77497c\") " pod="cert-manager/cert-manager-759f64656b-c56jg"
Apr 20 07:12:26.195459 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:26.195430 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/2f4747ea-ce0d-4dca-b325-2f451d77497c-bound-sa-token\") pod \"cert-manager-759f64656b-c56jg\" (UID: \"2f4747ea-ce0d-4dca-b325-2f451d77497c\") " pod="cert-manager/cert-manager-759f64656b-c56jg"
Apr 20 07:12:26.195671 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:26.195631 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbcxp\" (UniqueName: \"kubernetes.io/projected/2f4747ea-ce0d-4dca-b325-2f451d77497c-kube-api-access-fbcxp\") pod \"cert-manager-759f64656b-c56jg\" (UID: \"2f4747ea-ce0d-4dca-b325-2f451d77497c\") " pod="cert-manager/cert-manager-759f64656b-c56jg"
Apr 20 07:12:26.295187 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:26.295096 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-c56jg"
Apr 20 07:12:26.410299 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:26.410268 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-c56jg"]
Apr 20 07:12:26.413567 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:12:26.413534 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2f4747ea_ce0d_4dca_b325_2f451d77497c.slice/crio-9814d6fd5e4ca6ac84bdb25627f402560155a8802992321836c4dde4b1a7c921 WatchSource:0}: Error finding container 9814d6fd5e4ca6ac84bdb25627f402560155a8802992321836c4dde4b1a7c921: Status 404 returned error can't find the container with id 9814d6fd5e4ca6ac84bdb25627f402560155a8802992321836c4dde4b1a7c921
Apr 20 07:12:26.415350 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:26.415331 2543 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 20 07:12:27.281206 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:27.281166 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-c56jg" event={"ID":"2f4747ea-ce0d-4dca-b325-2f451d77497c","Type":"ContainerStarted","Data":"9814d6fd5e4ca6ac84bdb25627f402560155a8802992321836c4dde4b1a7c921"}
Apr 20 07:12:30.290876 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:30.290838 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-c56jg" event={"ID":"2f4747ea-ce0d-4dca-b325-2f451d77497c","Type":"ContainerStarted","Data":"143d8dc1a230ae14dca67ad0ccf57b84cead7af21ab6cf1f6105841a950e7d58"}
Apr 20 07:12:30.309634 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:30.309584 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-c56jg" podStartSLOduration=2.21167407 podStartE2EDuration="5.30957093s" podCreationTimestamp="2026-04-20 07:12:25 +0000 UTC" firstStartedPulling="2026-04-20 07:12:26.415459908 +0000 UTC m=+581.235528227" lastFinishedPulling="2026-04-20 07:12:29.513356768 +0000 UTC m=+584.333425087" observedRunningTime="2026-04-20 07:12:30.308490397 +0000 UTC m=+585.128558734" watchObservedRunningTime="2026-04-20 07:12:30.30957093 +0000 UTC m=+585.129639266"
Apr 20 07:12:34.699491 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:34.699455 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-7fd89bcbc4-vqr2p"]
Apr 20 07:12:34.702435 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:34.702419 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-vqr2p"
Apr 20 07:12:34.705620 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:34.705601 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 20 07:12:34.706836 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:34.706816 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 20 07:12:34.706938 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:34.706846 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-nn82n\""
Apr 20 07:12:34.708898 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:34.708855 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 20 07:12:34.709438 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:34.709418 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 20 07:12:34.709934 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:34.709919 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 20 07:12:34.729920 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:34.729895 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7fd89bcbc4-vqr2p"]
Apr 20 07:12:34.852606 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:34.852573 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba26dfdd-dc3c-4027-9922-6ef89186d95d-cert\") pod \"lws-controller-manager-7fd89bcbc4-vqr2p\" (UID: \"ba26dfdd-dc3c-4027-9922-6ef89186d95d\") " pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-vqr2p"
Apr 20 07:12:34.852606 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:34.852610 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/ba26dfdd-dc3c-4027-9922-6ef89186d95d-manager-config\") pod \"lws-controller-manager-7fd89bcbc4-vqr2p\" (UID: \"ba26dfdd-dc3c-4027-9922-6ef89186d95d\") " pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-vqr2p"
Apr 20 07:12:34.852857 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:34.852634 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8v6r7\" (UniqueName: \"kubernetes.io/projected/ba26dfdd-dc3c-4027-9922-6ef89186d95d-kube-api-access-8v6r7\") pod \"lws-controller-manager-7fd89bcbc4-vqr2p\" (UID: \"ba26dfdd-dc3c-4027-9922-6ef89186d95d\") " pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-vqr2p"
Apr 20 07:12:34.852857 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:34.852716 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ba26dfdd-dc3c-4027-9922-6ef89186d95d-metrics-cert\") pod \"lws-controller-manager-7fd89bcbc4-vqr2p\" (UID: \"ba26dfdd-dc3c-4027-9922-6ef89186d95d\") " pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-vqr2p"
Apr 20 07:12:34.953994 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:34.953923 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba26dfdd-dc3c-4027-9922-6ef89186d95d-cert\") pod \"lws-controller-manager-7fd89bcbc4-vqr2p\" (UID: \"ba26dfdd-dc3c-4027-9922-6ef89186d95d\") " pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-vqr2p"
Apr 20 07:12:34.953994 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:34.953956 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/ba26dfdd-dc3c-4027-9922-6ef89186d95d-manager-config\") pod \"lws-controller-manager-7fd89bcbc4-vqr2p\" (UID: \"ba26dfdd-dc3c-4027-9922-6ef89186d95d\") " pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-vqr2p"
Apr 20 07:12:34.953994 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:34.953979 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8v6r7\" (UniqueName: \"kubernetes.io/projected/ba26dfdd-dc3c-4027-9922-6ef89186d95d-kube-api-access-8v6r7\") pod \"lws-controller-manager-7fd89bcbc4-vqr2p\" (UID: \"ba26dfdd-dc3c-4027-9922-6ef89186d95d\") " pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-vqr2p"
Apr 20 07:12:34.954243 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:34.953999 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ba26dfdd-dc3c-4027-9922-6ef89186d95d-metrics-cert\") pod \"lws-controller-manager-7fd89bcbc4-vqr2p\" (UID: \"ba26dfdd-dc3c-4027-9922-6ef89186d95d\") " pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-vqr2p"
Apr 20 07:12:34.954718 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:34.954698 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/ba26dfdd-dc3c-4027-9922-6ef89186d95d-manager-config\") pod \"lws-controller-manager-7fd89bcbc4-vqr2p\" (UID: \"ba26dfdd-dc3c-4027-9922-6ef89186d95d\") " pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-vqr2p"
Apr 20 07:12:34.956372 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:34.956354 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ba26dfdd-dc3c-4027-9922-6ef89186d95d-cert\") pod \"lws-controller-manager-7fd89bcbc4-vqr2p\" (UID: \"ba26dfdd-dc3c-4027-9922-6ef89186d95d\") " pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-vqr2p"
Apr 20 07:12:34.956458 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:34.956381 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/ba26dfdd-dc3c-4027-9922-6ef89186d95d-metrics-cert\") pod \"lws-controller-manager-7fd89bcbc4-vqr2p\" (UID: \"ba26dfdd-dc3c-4027-9922-6ef89186d95d\") " pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-vqr2p"
Apr 20 07:12:34.964403 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:34.964381 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8v6r7\" (UniqueName: \"kubernetes.io/projected/ba26dfdd-dc3c-4027-9922-6ef89186d95d-kube-api-access-8v6r7\") pod \"lws-controller-manager-7fd89bcbc4-vqr2p\" (UID: \"ba26dfdd-dc3c-4027-9922-6ef89186d95d\") " pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-vqr2p"
Apr 20 07:12:35.011441 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:35.011406 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-vqr2p"
Apr 20 07:12:35.138256 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:35.138225 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-7fd89bcbc4-vqr2p"]
Apr 20 07:12:35.141212 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:12:35.141182 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba26dfdd_dc3c_4027_9922_6ef89186d95d.slice/crio-9eccc0f8d5e93fc773f3efdcb90556b4c4b8aa8a9ea7ee4c51c06a68c21ea6e2 WatchSource:0}: Error finding container 9eccc0f8d5e93fc773f3efdcb90556b4c4b8aa8a9ea7ee4c51c06a68c21ea6e2: Status 404 returned error can't find the container with id 9eccc0f8d5e93fc773f3efdcb90556b4c4b8aa8a9ea7ee4c51c06a68c21ea6e2
Apr 20 07:12:35.304617 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:35.304532 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-vqr2p" event={"ID":"ba26dfdd-dc3c-4027-9922-6ef89186d95d","Type":"ContainerStarted","Data":"9eccc0f8d5e93fc773f3efdcb90556b4c4b8aa8a9ea7ee4c51c06a68c21ea6e2"}
Apr 20 07:12:38.312743 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:38.312707 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-vqr2p" event={"ID":"ba26dfdd-dc3c-4027-9922-6ef89186d95d","Type":"ContainerStarted","Data":"fb79f899d73f91788f7a44c0b122c8035d1d8c3f11b24dbdc58f9ac85f4086d4"}
Apr 20 07:12:38.313122 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:38.312826 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-vqr2p"
Apr 20 07:12:38.335198 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:38.335155 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-vqr2p" podStartSLOduration=2.04140029 podStartE2EDuration="4.335141204s" podCreationTimestamp="2026-04-20 07:12:34 +0000 UTC" firstStartedPulling="2026-04-20 07:12:35.142943235 +0000 UTC m=+589.963011553" lastFinishedPulling="2026-04-20 07:12:37.436684146 +0000 UTC m=+592.256752467" observedRunningTime="2026-04-20 07:12:38.333510505 +0000 UTC m=+593.153578843" watchObservedRunningTime="2026-04-20 07:12:38.335141204 +0000 UTC m=+593.155209594"
Apr 20 07:12:43.984112 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:43.984074 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6d65d76454-l6j6d"]
Apr 20 07:12:43.987979 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:43.987897 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-l6j6d"
Apr 20 07:12:43.991406 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:43.991383 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 20 07:12:43.991525 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:43.991469 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 20 07:12:43.992106 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:43.992088 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 20 07:12:43.992201 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:43.992119 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 20 07:12:43.992273 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:43.992256 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-tcmrx\""
Apr 20 07:12:44.017080 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:44.017053 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6d65d76454-l6j6d"]
Apr 20 07:12:44.119658 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:44.119598 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/35209c86-3a05-4072-8e51-0acbba6419bd-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6d65d76454-l6j6d\" (UID: \"35209c86-3a05-4072-8e51-0acbba6419bd\") " pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-l6j6d"
Apr 20 07:12:44.119827 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:44.119710 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/35209c86-3a05-4072-8e51-0acbba6419bd-webhook-cert\") pod \"opendatahub-operator-controller-manager-6d65d76454-l6j6d\" (UID: \"35209c86-3a05-4072-8e51-0acbba6419bd\") " pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-l6j6d"
Apr 20 07:12:44.119827 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:44.119745 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jrcbk\" (UniqueName: \"kubernetes.io/projected/35209c86-3a05-4072-8e51-0acbba6419bd-kube-api-access-jrcbk\") pod \"opendatahub-operator-controller-manager-6d65d76454-l6j6d\" (UID: \"35209c86-3a05-4072-8e51-0acbba6419bd\") " pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-l6j6d"
Apr 20 07:12:44.220754 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:44.220717 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/35209c86-3a05-4072-8e51-0acbba6419bd-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6d65d76454-l6j6d\" (UID: \"35209c86-3a05-4072-8e51-0acbba6419bd\") " pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-l6j6d"
Apr 20 07:12:44.220922 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:44.220782 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/35209c86-3a05-4072-8e51-0acbba6419bd-webhook-cert\") pod \"opendatahub-operator-controller-manager-6d65d76454-l6j6d\" (UID: \"35209c86-3a05-4072-8e51-0acbba6419bd\") " pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-l6j6d"
Apr 20 07:12:44.220922 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:44.220830 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jrcbk\" (UniqueName: \"kubernetes.io/projected/35209c86-3a05-4072-8e51-0acbba6419bd-kube-api-access-jrcbk\") pod \"opendatahub-operator-controller-manager-6d65d76454-l6j6d\" (UID: \"35209c86-3a05-4072-8e51-0acbba6419bd\") " pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-l6j6d"
Apr 20 07:12:44.223269 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:44.223236 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/35209c86-3a05-4072-8e51-0acbba6419bd-webhook-cert\") pod \"opendatahub-operator-controller-manager-6d65d76454-l6j6d\" (UID: \"35209c86-3a05-4072-8e51-0acbba6419bd\") " pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-l6j6d"
Apr 20 07:12:44.223390 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:44.223341 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/35209c86-3a05-4072-8e51-0acbba6419bd-apiservice-cert\") pod \"opendatahub-operator-controller-manager-6d65d76454-l6j6d\" (UID: \"35209c86-3a05-4072-8e51-0acbba6419bd\") " pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-l6j6d"
Apr 20 07:12:44.230442 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:44.230417 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jrcbk\" (UniqueName: \"kubernetes.io/projected/35209c86-3a05-4072-8e51-0acbba6419bd-kube-api-access-jrcbk\") pod \"opendatahub-operator-controller-manager-6d65d76454-l6j6d\" (UID: \"35209c86-3a05-4072-8e51-0acbba6419bd\") " pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-l6j6d"
Apr 20 07:12:44.298366 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:44.298277 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-l6j6d"
Apr 20 07:12:44.421254 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:44.421220 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-6d65d76454-l6j6d"]
Apr 20 07:12:44.424616 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:12:44.424588 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod35209c86_3a05_4072_8e51_0acbba6419bd.slice/crio-8755a3d7e6befad6ee3ef545504480a9fc6ae4405c184b27ca16d5eb05f12f9e WatchSource:0}: Error finding container 8755a3d7e6befad6ee3ef545504480a9fc6ae4405c184b27ca16d5eb05f12f9e: Status 404 returned error can't find the container with id 8755a3d7e6befad6ee3ef545504480a9fc6ae4405c184b27ca16d5eb05f12f9e
Apr 20 07:12:45.333984 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:45.333946 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-l6j6d" event={"ID":"35209c86-3a05-4072-8e51-0acbba6419bd","Type":"ContainerStarted","Data":"8755a3d7e6befad6ee3ef545504480a9fc6ae4405c184b27ca16d5eb05f12f9e"}
Apr 20 07:12:46.850971 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:46.850947 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/ovn-acl-logging/0.log"
Apr 20 07:12:46.851378 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:46.850982 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/ovn-acl-logging/0.log"
Apr 20 07:12:47.340469 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:47.340430 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-l6j6d" event={"ID":"35209c86-3a05-4072-8e51-0acbba6419bd","Type":"ContainerStarted","Data":"ed8cb3929648eccfe9e9cf4c299ab0945202abb32f6ac9939c8bbc0551f31d16"}
Apr 20 07:12:47.340798 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:47.340779 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-l6j6d"
Apr 20 07:12:47.364202 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:47.364146 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-l6j6d" podStartSLOduration=1.910163169 podStartE2EDuration="4.364128796s" podCreationTimestamp="2026-04-20 07:12:43 +0000 UTC" firstStartedPulling="2026-04-20 07:12:44.426188419 +0000 UTC m=+599.246256734" lastFinishedPulling="2026-04-20 07:12:46.880154046 +0000 UTC m=+601.700222361" observedRunningTime="2026-04-20 07:12:47.363542536 +0000 UTC m=+602.183610875" watchObservedRunningTime="2026-04-20 07:12:47.364128796 +0000 UTC m=+602.184197136"
Apr 20 07:12:49.317593 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:49.317559 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-7fd89bcbc4-vqr2p"
Apr 20 07:12:58.345111 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:12:58.345031 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-6d65d76454-l6j6d"
Apr 20 07:13:38.606131 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.606095 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf"]
Apr 20 07:13:38.613832 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.613801 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf"
Apr 20 07:13:38.616410 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.616385 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 20 07:13:38.616547 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.616415 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 20 07:13:38.616547 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.616528 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 20 07:13:38.616657 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.616621 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"data-science-gateway-data-science-gateway-class-dockercfg-dkm9v\""
Apr 20 07:13:38.624425 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.624402 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf"]
Apr 20 07:13:38.720245 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.720216 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fj4rwf\" (UID: \"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf"
Apr 20 07:13:38.720245 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.720249 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fj4rwf\" (UID: \"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf"
Apr 20 07:13:38.720480 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.720283 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fj4rwf\" (UID: \"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf"
Apr 20 07:13:38.720480 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.720319 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fj4rwf\" (UID: \"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf"
Apr 20 07:13:38.720480 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.720351 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fj4rwf\" (UID: \"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf"
Apr 20 07:13:38.720480 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.720377 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fj4rwf\" (UID: \"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf"
Apr 20 07:13:38.720480 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.720459 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fj4rwf\" (UID: \"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf"
Apr 20 07:13:38.720722 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.720502 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pglb7\" (UniqueName: \"kubernetes.io/projected/f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed-kube-api-access-pglb7\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fj4rwf\" (UID: \"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf"
Apr 20 07:13:38.720722 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.720594 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fj4rwf\" (UID: \"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf"
Apr 20 07:13:38.821224 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.821190 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fj4rwf\" (UID: \"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf"
Apr 20 07:13:38.821380 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.821240 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fj4rwf\" (UID: \"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf"
Apr 20 07:13:38.821380 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.821282 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fj4rwf\" (UID: \"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf"
Apr 20 07:13:38.821380 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.821311 2543 reconciler_common.go:224] "operationExecutor.MountVolume
started for volume \"kube-api-access-pglb7\" (UniqueName: \"kubernetes.io/projected/f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed-kube-api-access-pglb7\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fj4rwf\" (UID: \"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf" Apr 20 07:13:38.821547 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.821383 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fj4rwf\" (UID: \"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf" Apr 20 07:13:38.821547 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.821454 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fj4rwf\" (UID: \"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf" Apr 20 07:13:38.821547 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.821482 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fj4rwf\" (UID: \"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf" Apr 20 07:13:38.821547 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.821519 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fj4rwf\" (UID: \"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf" Apr 20 07:13:38.821765 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.821545 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed-workload-certs\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fj4rwf\" (UID: \"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf" Apr 20 07:13:38.821765 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.821598 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed-credential-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fj4rwf\" (UID: \"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf" Apr 20 07:13:38.821891 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.821863 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed-istio-data\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fj4rwf\" (UID: \"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf" Apr 20 07:13:38.821947 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.821891 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed-workload-certs\") 
pod \"data-science-gateway-data-science-gateway-class-55cc67557fj4rwf\" (UID: \"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf" Apr 20 07:13:38.822150 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.822127 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed-istiod-ca-cert\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fj4rwf\" (UID: \"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf" Apr 20 07:13:38.822230 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.822186 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed-workload-socket\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fj4rwf\" (UID: \"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf" Apr 20 07:13:38.824248 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.824223 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed-istio-envoy\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fj4rwf\" (UID: \"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf" Apr 20 07:13:38.824455 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.824435 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed-istio-podinfo\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fj4rwf\" (UID: 
\"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf" Apr 20 07:13:38.830752 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.830728 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed-istio-token\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fj4rwf\" (UID: \"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf" Apr 20 07:13:38.830862 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.830829 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pglb7\" (UniqueName: \"kubernetes.io/projected/f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed-kube-api-access-pglb7\") pod \"data-science-gateway-data-science-gateway-class-55cc67557fj4rwf\" (UID: \"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed\") " pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf" Apr 20 07:13:38.926405 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:38.926319 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf" Apr 20 07:13:39.094678 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:39.094623 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf"] Apr 20 07:13:39.097925 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:13:39.097897 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf98f78e6_c615_4e4b_8ec6_0c9361e9f6ed.slice/crio-7f67cff39dd17349b3b7e7893077d950b5493b3968c471e91e60f0c4eb724bce WatchSource:0}: Error finding container 7f67cff39dd17349b3b7e7893077d950b5493b3968c471e91e60f0c4eb724bce: Status 404 returned error can't find the container with id 7f67cff39dd17349b3b7e7893077d950b5493b3968c471e91e60f0c4eb724bce Apr 20 07:13:39.476431 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:39.476397 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf" event={"ID":"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed","Type":"ContainerStarted","Data":"7f67cff39dd17349b3b7e7893077d950b5493b3968c471e91e60f0c4eb724bce"} Apr 20 07:13:41.520112 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:41.519444 2543 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"} Apr 20 07:13:41.520112 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:41.519516 2543 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"} Apr 20 07:13:41.520112 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:41.519545 2543 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"} Apr 20 07:13:42.488859 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:42.488825 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf" event={"ID":"f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed","Type":"ContainerStarted","Data":"68ede06df5623099aaef55aa13c4f0a7a8ecf99551a5f166fd2e0eb339eb21b2"} Apr 20 07:13:42.509853 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:42.509802 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf" podStartSLOduration=2.090408558 podStartE2EDuration="4.509787464s" podCreationTimestamp="2026-04-20 07:13:38 +0000 UTC" firstStartedPulling="2026-04-20 07:13:39.099824624 +0000 UTC m=+653.919892943" lastFinishedPulling="2026-04-20 07:13:41.519203531 +0000 UTC m=+656.339271849" observedRunningTime="2026-04-20 07:13:42.507863316 +0000 UTC m=+657.327931653" watchObservedRunningTime="2026-04-20 07:13:42.509787464 +0000 UTC m=+657.329855800" Apr 20 07:13:42.926858 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:42.926771 2543 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf" Apr 20 07:13:42.931321 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:42.931299 2543 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf" Apr 20 07:13:43.492128 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:13:43.492100 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf" Apr 20 07:13:43.493009 ip-10-0-130-105 kubenswrapper[2543]: 
I0420 07:13:43.492987 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/data-science-gateway-data-science-gateway-class-55cc67557fj4rwf" Apr 20 07:14:02.578612 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:02.578579 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bs5nc"] Apr 20 07:14:02.581562 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:02.581544 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-bs5nc" Apr 20 07:14:02.584183 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:02.584161 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 20 07:14:02.585072 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:02.585054 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-catalog-dockercfg-dk9sl\"" Apr 20 07:14:02.585148 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:02.585070 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 20 07:14:02.591705 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:02.591683 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bs5nc"] Apr 20 07:14:02.711222 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:02.711186 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt9vr\" (UniqueName: \"kubernetes.io/projected/1b9da49f-1722-40fa-a8ce-422b534b220f-kube-api-access-qt9vr\") pod \"kuadrant-operator-catalog-bs5nc\" (UID: \"1b9da49f-1722-40fa-a8ce-422b534b220f\") " pod="kuadrant-system/kuadrant-operator-catalog-bs5nc" Apr 20 07:14:02.812089 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:02.812035 2543 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-qt9vr\" (UniqueName: \"kubernetes.io/projected/1b9da49f-1722-40fa-a8ce-422b534b220f-kube-api-access-qt9vr\") pod \"kuadrant-operator-catalog-bs5nc\" (UID: \"1b9da49f-1722-40fa-a8ce-422b534b220f\") " pod="kuadrant-system/kuadrant-operator-catalog-bs5nc" Apr 20 07:14:02.820782 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:02.820757 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qt9vr\" (UniqueName: \"kubernetes.io/projected/1b9da49f-1722-40fa-a8ce-422b534b220f-kube-api-access-qt9vr\") pod \"kuadrant-operator-catalog-bs5nc\" (UID: \"1b9da49f-1722-40fa-a8ce-422b534b220f\") " pod="kuadrant-system/kuadrant-operator-catalog-bs5nc" Apr 20 07:14:02.890969 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:02.890904 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-bs5nc" Apr 20 07:14:02.933303 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:02.933272 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bs5nc"] Apr 20 07:14:03.004886 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:03.004849 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bs5nc"] Apr 20 07:14:03.007843 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:14:03.007813 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b9da49f_1722_40fa_a8ce_422b534b220f.slice/crio-ea44705f61e8270b8175fe3fbb627a47b8e31274622e73b5aa91521324142fd7 WatchSource:0}: Error finding container ea44705f61e8270b8175fe3fbb627a47b8e31274622e73b5aa91521324142fd7: Status 404 returned error can't find the container with id ea44705f61e8270b8175fe3fbb627a47b8e31274622e73b5aa91521324142fd7 Apr 20 07:14:03.145579 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:03.143475 2543 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-h6lrs"] Apr 20 07:14:03.148965 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:03.148936 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-h6lrs" Apr 20 07:14:03.153067 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:03.153040 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-h6lrs"] Apr 20 07:14:03.316465 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:03.316424 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbqgv\" (UniqueName: \"kubernetes.io/projected/ab57ebb1-9bf6-42c6-b548-9656356e12f3-kube-api-access-vbqgv\") pod \"kuadrant-operator-catalog-h6lrs\" (UID: \"ab57ebb1-9bf6-42c6-b548-9656356e12f3\") " pod="kuadrant-system/kuadrant-operator-catalog-h6lrs" Apr 20 07:14:03.417755 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:03.417664 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vbqgv\" (UniqueName: \"kubernetes.io/projected/ab57ebb1-9bf6-42c6-b548-9656356e12f3-kube-api-access-vbqgv\") pod \"kuadrant-operator-catalog-h6lrs\" (UID: \"ab57ebb1-9bf6-42c6-b548-9656356e12f3\") " pod="kuadrant-system/kuadrant-operator-catalog-h6lrs" Apr 20 07:14:03.426330 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:03.426309 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbqgv\" (UniqueName: \"kubernetes.io/projected/ab57ebb1-9bf6-42c6-b548-9656356e12f3-kube-api-access-vbqgv\") pod \"kuadrant-operator-catalog-h6lrs\" (UID: \"ab57ebb1-9bf6-42c6-b548-9656356e12f3\") " pod="kuadrant-system/kuadrant-operator-catalog-h6lrs" Apr 20 07:14:03.459233 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:03.459202 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-h6lrs" Apr 20 07:14:03.546058 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:03.546022 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-bs5nc" event={"ID":"1b9da49f-1722-40fa-a8ce-422b534b220f","Type":"ContainerStarted","Data":"ea44705f61e8270b8175fe3fbb627a47b8e31274622e73b5aa91521324142fd7"} Apr 20 07:14:03.577031 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:03.577002 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-h6lrs"] Apr 20 07:14:03.625788 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:14:03.625752 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab57ebb1_9bf6_42c6_b548_9656356e12f3.slice/crio-5841662e7e881188e86470002eaa27cd4ff04cd4b59ed23e9a781567fbdac39b WatchSource:0}: Error finding container 5841662e7e881188e86470002eaa27cd4ff04cd4b59ed23e9a781567fbdac39b: Status 404 returned error can't find the container with id 5841662e7e881188e86470002eaa27cd4ff04cd4b59ed23e9a781567fbdac39b Apr 20 07:14:04.551382 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:04.551334 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-h6lrs" event={"ID":"ab57ebb1-9bf6-42c6-b548-9656356e12f3","Type":"ContainerStarted","Data":"5841662e7e881188e86470002eaa27cd4ff04cd4b59ed23e9a781567fbdac39b"} Apr 20 07:14:05.557713 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:05.557679 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-bs5nc" event={"ID":"1b9da49f-1722-40fa-a8ce-422b534b220f","Type":"ContainerStarted","Data":"78c714673fc9f5f8db2bb7b25a137e1bd22514fda15d13de5753ed8c6850b4ec"} Apr 20 07:14:05.558168 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:05.557804 2543 kuberuntime_container.go:864] "Killing container with 
a grace period" pod="kuadrant-system/kuadrant-operator-catalog-bs5nc" podUID="1b9da49f-1722-40fa-a8ce-422b534b220f" containerName="registry-server" containerID="cri-o://78c714673fc9f5f8db2bb7b25a137e1bd22514fda15d13de5753ed8c6850b4ec" gracePeriod=2 Apr 20 07:14:05.559060 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:05.559023 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-h6lrs" event={"ID":"ab57ebb1-9bf6-42c6-b548-9656356e12f3","Type":"ContainerStarted","Data":"3571e25fed84de7bfc372b84c6156cde73531b1b974eeabcbc819745521c3d13"} Apr 20 07:14:05.573196 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:05.573147 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-bs5nc" podStartSLOduration=1.439410243 podStartE2EDuration="3.573130003s" podCreationTimestamp="2026-04-20 07:14:02 +0000 UTC" firstStartedPulling="2026-04-20 07:14:03.009195719 +0000 UTC m=+677.829264048" lastFinishedPulling="2026-04-20 07:14:05.142915493 +0000 UTC m=+679.962983808" observedRunningTime="2026-04-20 07:14:05.572493571 +0000 UTC m=+680.392561907" watchObservedRunningTime="2026-04-20 07:14:05.573130003 +0000 UTC m=+680.393198341" Apr 20 07:14:05.799331 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:05.799310 2543 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-bs5nc" Apr 20 07:14:05.814852 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:05.814762 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-catalog-h6lrs" podStartSLOduration=1.296370722 podStartE2EDuration="2.814744431s" podCreationTimestamp="2026-04-20 07:14:03 +0000 UTC" firstStartedPulling="2026-04-20 07:14:03.627131017 +0000 UTC m=+678.447199332" lastFinishedPulling="2026-04-20 07:14:05.145504727 +0000 UTC m=+679.965573041" observedRunningTime="2026-04-20 07:14:05.588543982 +0000 UTC m=+680.408612318" watchObservedRunningTime="2026-04-20 07:14:05.814744431 +0000 UTC m=+680.634812768" Apr 20 07:14:05.838808 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:05.838785 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qt9vr\" (UniqueName: \"kubernetes.io/projected/1b9da49f-1722-40fa-a8ce-422b534b220f-kube-api-access-qt9vr\") pod \"1b9da49f-1722-40fa-a8ce-422b534b220f\" (UID: \"1b9da49f-1722-40fa-a8ce-422b534b220f\") " Apr 20 07:14:05.840806 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:05.840782 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b9da49f-1722-40fa-a8ce-422b534b220f-kube-api-access-qt9vr" (OuterVolumeSpecName: "kube-api-access-qt9vr") pod "1b9da49f-1722-40fa-a8ce-422b534b220f" (UID: "1b9da49f-1722-40fa-a8ce-422b534b220f"). InnerVolumeSpecName "kube-api-access-qt9vr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:14:05.939551 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:05.939519 2543 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qt9vr\" (UniqueName: \"kubernetes.io/projected/1b9da49f-1722-40fa-a8ce-422b534b220f-kube-api-access-qt9vr\") on node \"ip-10-0-130-105.ec2.internal\" DevicePath \"\"" Apr 20 07:14:06.562970 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:06.562938 2543 generic.go:358] "Generic (PLEG): container finished" podID="1b9da49f-1722-40fa-a8ce-422b534b220f" containerID="78c714673fc9f5f8db2bb7b25a137e1bd22514fda15d13de5753ed8c6850b4ec" exitCode=0 Apr 20 07:14:06.563429 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:06.562998 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-catalog-bs5nc" Apr 20 07:14:06.563429 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:06.563026 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-bs5nc" event={"ID":"1b9da49f-1722-40fa-a8ce-422b534b220f","Type":"ContainerDied","Data":"78c714673fc9f5f8db2bb7b25a137e1bd22514fda15d13de5753ed8c6850b4ec"} Apr 20 07:14:06.563429 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:06.563062 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-catalog-bs5nc" event={"ID":"1b9da49f-1722-40fa-a8ce-422b534b220f","Type":"ContainerDied","Data":"ea44705f61e8270b8175fe3fbb627a47b8e31274622e73b5aa91521324142fd7"} Apr 20 07:14:06.563429 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:06.563077 2543 scope.go:117] "RemoveContainer" containerID="78c714673fc9f5f8db2bb7b25a137e1bd22514fda15d13de5753ed8c6850b4ec" Apr 20 07:14:06.571333 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:06.571320 2543 scope.go:117] "RemoveContainer" containerID="78c714673fc9f5f8db2bb7b25a137e1bd22514fda15d13de5753ed8c6850b4ec" Apr 20 07:14:06.571563 ip-10-0-130-105 
kubenswrapper[2543]: E0420 07:14:06.571545 2543 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78c714673fc9f5f8db2bb7b25a137e1bd22514fda15d13de5753ed8c6850b4ec\": container with ID starting with 78c714673fc9f5f8db2bb7b25a137e1bd22514fda15d13de5753ed8c6850b4ec not found: ID does not exist" containerID="78c714673fc9f5f8db2bb7b25a137e1bd22514fda15d13de5753ed8c6850b4ec" Apr 20 07:14:06.571609 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:06.571572 2543 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78c714673fc9f5f8db2bb7b25a137e1bd22514fda15d13de5753ed8c6850b4ec"} err="failed to get container status \"78c714673fc9f5f8db2bb7b25a137e1bd22514fda15d13de5753ed8c6850b4ec\": rpc error: code = NotFound desc = could not find container \"78c714673fc9f5f8db2bb7b25a137e1bd22514fda15d13de5753ed8c6850b4ec\": container with ID starting with 78c714673fc9f5f8db2bb7b25a137e1bd22514fda15d13de5753ed8c6850b4ec not found: ID does not exist" Apr 20 07:14:06.583157 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:06.583136 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bs5nc"] Apr 20 07:14:06.587110 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:06.587092 2543 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-catalog-bs5nc"] Apr 20 07:14:07.704372 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:07.704338 2543 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b9da49f-1722-40fa-a8ce-422b534b220f" path="/var/lib/kubelet/pods/1b9da49f-1722-40fa-a8ce-422b534b220f/volumes" Apr 20 07:14:13.459585 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:13.459551 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-catalog-h6lrs" Apr 20 07:14:13.460097 ip-10-0-130-105 kubenswrapper[2543]: I0420 
07:14:13.459697 2543 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kuadrant-system/kuadrant-operator-catalog-h6lrs"
Apr 20 07:14:13.480823 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:13.480797 2543 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kuadrant-system/kuadrant-operator-catalog-h6lrs"
Apr 20 07:14:13.604249 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:13.604221 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-catalog-h6lrs"
Apr 20 07:14:33.158748 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:33.158665 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-ktkdj"]
Apr 20 07:14:33.159113 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:33.158937 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b9da49f-1722-40fa-a8ce-422b534b220f" containerName="registry-server"
Apr 20 07:14:33.159113 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:33.158947 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b9da49f-1722-40fa-a8ce-422b534b220f" containerName="registry-server"
Apr 20 07:14:33.159113 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:33.158991 2543 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b9da49f-1722-40fa-a8ce-422b534b220f" containerName="registry-server"
Apr 20 07:14:33.161610 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:33.161594 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-ktkdj"
Apr 20 07:14:33.164038 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:33.164016 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-5khw6\""
Apr 20 07:14:33.175447 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:33.175426 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-ktkdj"]
Apr 20 07:14:33.243893 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:33.243864 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n92xp\" (UniqueName: \"kubernetes.io/projected/9d93355e-6deb-4c79-9e36-3804fd5e950a-kube-api-access-n92xp\") pod \"authorino-operator-657f44b778-ktkdj\" (UID: \"9d93355e-6deb-4c79-9e36-3804fd5e950a\") " pod="kuadrant-system/authorino-operator-657f44b778-ktkdj"
Apr 20 07:14:33.344654 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:33.344604 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n92xp\" (UniqueName: \"kubernetes.io/projected/9d93355e-6deb-4c79-9e36-3804fd5e950a-kube-api-access-n92xp\") pod \"authorino-operator-657f44b778-ktkdj\" (UID: \"9d93355e-6deb-4c79-9e36-3804fd5e950a\") " pod="kuadrant-system/authorino-operator-657f44b778-ktkdj"
Apr 20 07:14:33.354812 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:33.354776 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n92xp\" (UniqueName: \"kubernetes.io/projected/9d93355e-6deb-4c79-9e36-3804fd5e950a-kube-api-access-n92xp\") pod \"authorino-operator-657f44b778-ktkdj\" (UID: \"9d93355e-6deb-4c79-9e36-3804fd5e950a\") " pod="kuadrant-system/authorino-operator-657f44b778-ktkdj"
Apr 20 07:14:33.471053 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:33.471026 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-ktkdj"
Apr 20 07:14:33.601749 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:33.601716 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-ktkdj"]
Apr 20 07:14:33.605115 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:14:33.605088 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d93355e_6deb_4c79_9e36_3804fd5e950a.slice/crio-920ff3643b4134ddc80fd7a95f38adb86b009c23bc9e042134ca05739ee33600 WatchSource:0}: Error finding container 920ff3643b4134ddc80fd7a95f38adb86b009c23bc9e042134ca05739ee33600: Status 404 returned error can't find the container with id 920ff3643b4134ddc80fd7a95f38adb86b009c23bc9e042134ca05739ee33600
Apr 20 07:14:33.641708 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:33.641672 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-ktkdj" event={"ID":"9d93355e-6deb-4c79-9e36-3804fd5e950a","Type":"ContainerStarted","Data":"920ff3643b4134ddc80fd7a95f38adb86b009c23bc9e042134ca05739ee33600"}
Apr 20 07:14:36.653943 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:36.653902 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-ktkdj" event={"ID":"9d93355e-6deb-4c79-9e36-3804fd5e950a","Type":"ContainerStarted","Data":"0e860456c8c8431e577223c2d25ff58452a3feb612b14202a7328ef0736df239"}
Apr 20 07:14:36.654324 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:36.654003 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-ktkdj"
Apr 20 07:14:36.675988 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:36.675945 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-ktkdj" podStartSLOduration=1.57818912 podStartE2EDuration="3.675929569s" podCreationTimestamp="2026-04-20 07:14:33 +0000 UTC" firstStartedPulling="2026-04-20 07:14:33.60706029 +0000 UTC m=+708.427128605" lastFinishedPulling="2026-04-20 07:14:35.704800736 +0000 UTC m=+710.524869054" observedRunningTime="2026-04-20 07:14:36.673703551 +0000 UTC m=+711.493771889" watchObservedRunningTime="2026-04-20 07:14:36.675929569 +0000 UTC m=+711.495997906"
Apr 20 07:14:47.659071 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:14:47.659036 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-ktkdj"
Apr 20 07:15:19.832712 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:19.832632 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"]
Apr 20 07:15:19.839539 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:19.839514 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:19.844467 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:19.844444 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"maas-default-gateway-openshift-default-dockercfg-2xjt2\""
Apr 20 07:15:19.849158 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:19.849135 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"]
Apr 20 07:15:19.913602 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:19.913572 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/73906400-114f-406d-aa5d-b27617fa0457-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wvtr4\" (UID: \"73906400-114f-406d-aa5d-b27617fa0457\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:19.913750 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:19.913607 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/73906400-114f-406d-aa5d-b27617fa0457-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wvtr4\" (UID: \"73906400-114f-406d-aa5d-b27617fa0457\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:19.913750 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:19.913664 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/73906400-114f-406d-aa5d-b27617fa0457-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wvtr4\" (UID: \"73906400-114f-406d-aa5d-b27617fa0457\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:19.913750 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:19.913691 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/73906400-114f-406d-aa5d-b27617fa0457-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wvtr4\" (UID: \"73906400-114f-406d-aa5d-b27617fa0457\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:19.913750 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:19.913709 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/73906400-114f-406d-aa5d-b27617fa0457-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wvtr4\" (UID: \"73906400-114f-406d-aa5d-b27617fa0457\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:19.913888 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:19.913766 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/73906400-114f-406d-aa5d-b27617fa0457-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wvtr4\" (UID: \"73906400-114f-406d-aa5d-b27617fa0457\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:19.913888 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:19.913798 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/73906400-114f-406d-aa5d-b27617fa0457-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wvtr4\" (UID: \"73906400-114f-406d-aa5d-b27617fa0457\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:19.913888 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:19.913819 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/73906400-114f-406d-aa5d-b27617fa0457-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wvtr4\" (UID: \"73906400-114f-406d-aa5d-b27617fa0457\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:19.913888 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:19.913834 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92k75\" (UniqueName: \"kubernetes.io/projected/73906400-114f-406d-aa5d-b27617fa0457-kube-api-access-92k75\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wvtr4\" (UID: \"73906400-114f-406d-aa5d-b27617fa0457\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:20.014776 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:20.014720 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/73906400-114f-406d-aa5d-b27617fa0457-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wvtr4\" (UID: \"73906400-114f-406d-aa5d-b27617fa0457\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:20.014930 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:20.014795 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/73906400-114f-406d-aa5d-b27617fa0457-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wvtr4\" (UID: \"73906400-114f-406d-aa5d-b27617fa0457\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:20.014930 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:20.014818 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/73906400-114f-406d-aa5d-b27617fa0457-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wvtr4\" (UID: \"73906400-114f-406d-aa5d-b27617fa0457\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:20.014930 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:20.014845 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/73906400-114f-406d-aa5d-b27617fa0457-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wvtr4\" (UID: \"73906400-114f-406d-aa5d-b27617fa0457\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:20.014930 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:20.014868 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/73906400-114f-406d-aa5d-b27617fa0457-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wvtr4\" (UID: \"73906400-114f-406d-aa5d-b27617fa0457\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:20.015084 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:20.014997 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/73906400-114f-406d-aa5d-b27617fa0457-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wvtr4\" (UID: \"73906400-114f-406d-aa5d-b27617fa0457\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:20.015084 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:20.015042 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-92k75\" (UniqueName: \"kubernetes.io/projected/73906400-114f-406d-aa5d-b27617fa0457-kube-api-access-92k75\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wvtr4\" (UID: \"73906400-114f-406d-aa5d-b27617fa0457\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:20.015084 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:20.015076 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/73906400-114f-406d-aa5d-b27617fa0457-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wvtr4\" (UID: \"73906400-114f-406d-aa5d-b27617fa0457\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:20.015230 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:20.015109 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/73906400-114f-406d-aa5d-b27617fa0457-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wvtr4\" (UID: \"73906400-114f-406d-aa5d-b27617fa0457\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:20.015230 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:20.015172 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/73906400-114f-406d-aa5d-b27617fa0457-credential-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wvtr4\" (UID: \"73906400-114f-406d-aa5d-b27617fa0457\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:20.015449 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:20.015426 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/73906400-114f-406d-aa5d-b27617fa0457-istio-data\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wvtr4\" (UID: \"73906400-114f-406d-aa5d-b27617fa0457\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:20.015520 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:20.015446 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/73906400-114f-406d-aa5d-b27617fa0457-workload-certs\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wvtr4\" (UID: \"73906400-114f-406d-aa5d-b27617fa0457\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:20.015520 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:20.015474 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/73906400-114f-406d-aa5d-b27617fa0457-workload-socket\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wvtr4\" (UID: \"73906400-114f-406d-aa5d-b27617fa0457\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:20.015816 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:20.015801 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/73906400-114f-406d-aa5d-b27617fa0457-istiod-ca-cert\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wvtr4\" (UID: \"73906400-114f-406d-aa5d-b27617fa0457\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:20.017093 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:20.017072 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/73906400-114f-406d-aa5d-b27617fa0457-istio-envoy\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wvtr4\" (UID: \"73906400-114f-406d-aa5d-b27617fa0457\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:20.017402 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:20.017383 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/73906400-114f-406d-aa5d-b27617fa0457-istio-podinfo\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wvtr4\" (UID: \"73906400-114f-406d-aa5d-b27617fa0457\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:20.025320 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:20.025297 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-92k75\" (UniqueName: \"kubernetes.io/projected/73906400-114f-406d-aa5d-b27617fa0457-kube-api-access-92k75\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wvtr4\" (UID: \"73906400-114f-406d-aa5d-b27617fa0457\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:20.025475 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:20.025458 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/73906400-114f-406d-aa5d-b27617fa0457-istio-token\") pod \"maas-default-gateway-openshift-default-845c6b4b48-wvtr4\" (UID: \"73906400-114f-406d-aa5d-b27617fa0457\") " pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:20.151832 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:20.151745 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:20.279378 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:20.279354 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"]
Apr 20 07:15:20.281146 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:15:20.281122 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73906400_114f_406d_aa5d_b27617fa0457.slice/crio-6412dc0ffb121dbf55d8cc0d5c094c009c89835b1e0c6a0329a0216522d2cd01 WatchSource:0}: Error finding container 6412dc0ffb121dbf55d8cc0d5c094c009c89835b1e0c6a0329a0216522d2cd01: Status 404 returned error can't find the container with id 6412dc0ffb121dbf55d8cc0d5c094c009c89835b1e0c6a0329a0216522d2cd01
Apr 20 07:15:20.283096 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:20.283062 2543 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"}
Apr 20 07:15:20.283186 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:20.283136 2543 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"}
Apr 20 07:15:20.283248 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:20.283183 2543 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236232Ki","pods":"250"}
Apr 20 07:15:20.781450 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:20.781416 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4" event={"ID":"73906400-114f-406d-aa5d-b27617fa0457","Type":"ContainerStarted","Data":"9d805ef243091d29da714f5fd851c7c58f785420d4c8b3d5832137ed8c30cbce"}
Apr 20 07:15:20.781450 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:20.781449 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4" event={"ID":"73906400-114f-406d-aa5d-b27617fa0457","Type":"ContainerStarted","Data":"6412dc0ffb121dbf55d8cc0d5c094c009c89835b1e0c6a0329a0216522d2cd01"}
Apr 20 07:15:20.802742 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:20.802683 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4" podStartSLOduration=1.802667182 podStartE2EDuration="1.802667182s" podCreationTimestamp="2026-04-20 07:15:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:15:20.801330596 +0000 UTC m=+755.621398934" watchObservedRunningTime="2026-04-20 07:15:20.802667182 +0000 UTC m=+755.622735519"
Apr 20 07:15:21.152243 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:21.152154 2543 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:21.156979 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:21.156957 2543 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:21.784860 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:21.784832 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:21.785885 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:21.785861 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/maas-default-gateway-openshift-default-845c6b4b48-wvtr4"
Apr 20 07:15:24.061143 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:24.061105 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-wvlwd"]
Apr 20 07:15:24.064312 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:24.064297 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-wvlwd"
Apr 20 07:15:24.066840 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:24.066820 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 20 07:15:24.067052 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:24.067037 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-4sdjw\""
Apr 20 07:15:24.079872 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:24.079850 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-wvlwd"]
Apr 20 07:15:24.150298 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:24.150269 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/6bde2c79-5887-4d10-962b-babcdc483305-config-file\") pod \"limitador-limitador-7d549b5b-wvlwd\" (UID: \"6bde2c79-5887-4d10-962b-babcdc483305\") " pod="kuadrant-system/limitador-limitador-7d549b5b-wvlwd"
Apr 20 07:15:24.150474 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:24.150325 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96jd6\" (UniqueName: \"kubernetes.io/projected/6bde2c79-5887-4d10-962b-babcdc483305-kube-api-access-96jd6\") pod \"limitador-limitador-7d549b5b-wvlwd\" (UID: \"6bde2c79-5887-4d10-962b-babcdc483305\") " pod="kuadrant-system/limitador-limitador-7d549b5b-wvlwd"
Apr 20 07:15:24.163896 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:24.163861 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-wvlwd"]
Apr 20 07:15:24.250967 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:24.250936 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-96jd6\" (UniqueName: \"kubernetes.io/projected/6bde2c79-5887-4d10-962b-babcdc483305-kube-api-access-96jd6\") pod \"limitador-limitador-7d549b5b-wvlwd\" (UID: \"6bde2c79-5887-4d10-962b-babcdc483305\") " pod="kuadrant-system/limitador-limitador-7d549b5b-wvlwd"
Apr 20 07:15:24.251125 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:24.251007 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/6bde2c79-5887-4d10-962b-babcdc483305-config-file\") pod \"limitador-limitador-7d549b5b-wvlwd\" (UID: \"6bde2c79-5887-4d10-962b-babcdc483305\") " pod="kuadrant-system/limitador-limitador-7d549b5b-wvlwd"
Apr 20 07:15:24.251566 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:24.251549 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/6bde2c79-5887-4d10-962b-babcdc483305-config-file\") pod \"limitador-limitador-7d549b5b-wvlwd\" (UID: \"6bde2c79-5887-4d10-962b-babcdc483305\") " pod="kuadrant-system/limitador-limitador-7d549b5b-wvlwd"
Apr 20 07:15:24.260716 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:24.260691 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-96jd6\" (UniqueName: \"kubernetes.io/projected/6bde2c79-5887-4d10-962b-babcdc483305-kube-api-access-96jd6\") pod \"limitador-limitador-7d549b5b-wvlwd\" (UID: \"6bde2c79-5887-4d10-962b-babcdc483305\") " pod="kuadrant-system/limitador-limitador-7d549b5b-wvlwd"
Apr 20 07:15:24.374663 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:24.374552 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-wvlwd"
Apr 20 07:15:24.512772 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:24.512590 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-wvlwd"]
Apr 20 07:15:24.515375 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:15:24.515349 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bde2c79_5887_4d10_962b_babcdc483305.slice/crio-cf8423c73e953e6b7db883588ce00c482ebf596b5d92006964988637ab2db3dd WatchSource:0}: Error finding container cf8423c73e953e6b7db883588ce00c482ebf596b5d92006964988637ab2db3dd: Status 404 returned error can't find the container with id cf8423c73e953e6b7db883588ce00c482ebf596b5d92006964988637ab2db3dd
Apr 20 07:15:24.794939 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:24.794906 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-wvlwd" event={"ID":"6bde2c79-5887-4d10-962b-babcdc483305","Type":"ContainerStarted","Data":"cf8423c73e953e6b7db883588ce00c482ebf596b5d92006964988637ab2db3dd"}
Apr 20 07:15:24.926726 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:24.926694 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-8jmcp"]
Apr 20 07:15:24.931160 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:24.931143 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-8jmcp"
Apr 20 07:15:24.934056 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:24.934032 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-6clll\""
Apr 20 07:15:24.941379 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:24.941357 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-8jmcp"]
Apr 20 07:15:24.956652 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:24.956606 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdq8b\" (UniqueName: \"kubernetes.io/projected/762faaf5-5386-4c3b-9730-c721c0fe3429-kube-api-access-pdq8b\") pod \"authorino-f99f4b5cd-8jmcp\" (UID: \"762faaf5-5386-4c3b-9730-c721c0fe3429\") " pod="kuadrant-system/authorino-f99f4b5cd-8jmcp"
Apr 20 07:15:25.058036 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:25.057945 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pdq8b\" (UniqueName: \"kubernetes.io/projected/762faaf5-5386-4c3b-9730-c721c0fe3429-kube-api-access-pdq8b\") pod \"authorino-f99f4b5cd-8jmcp\" (UID: \"762faaf5-5386-4c3b-9730-c721c0fe3429\") " pod="kuadrant-system/authorino-f99f4b5cd-8jmcp"
Apr 20 07:15:25.068938 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:25.068907 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdq8b\" (UniqueName: \"kubernetes.io/projected/762faaf5-5386-4c3b-9730-c721c0fe3429-kube-api-access-pdq8b\") pod \"authorino-f99f4b5cd-8jmcp\" (UID: \"762faaf5-5386-4c3b-9730-c721c0fe3429\") " pod="kuadrant-system/authorino-f99f4b5cd-8jmcp"
Apr 20 07:15:25.240270 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:25.240234 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-8jmcp"
Apr 20 07:15:25.406500 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:25.406425 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-8jmcp"]
Apr 20 07:15:25.413203 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:15:25.412321 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod762faaf5_5386_4c3b_9730_c721c0fe3429.slice/crio-aad0bca06e3497f22338b1c826188b4818a6609747db922b49167dac7dc17037 WatchSource:0}: Error finding container aad0bca06e3497f22338b1c826188b4818a6609747db922b49167dac7dc17037: Status 404 returned error can't find the container with id aad0bca06e3497f22338b1c826188b4818a6609747db922b49167dac7dc17037
Apr 20 07:15:25.799956 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:25.799920 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-8jmcp" event={"ID":"762faaf5-5386-4c3b-9730-c721c0fe3429","Type":"ContainerStarted","Data":"aad0bca06e3497f22338b1c826188b4818a6609747db922b49167dac7dc17037"}
Apr 20 07:15:28.810312 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:28.810274 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-wvlwd" event={"ID":"6bde2c79-5887-4d10-962b-babcdc483305","Type":"ContainerStarted","Data":"c1459359629cc80999780a886b301024de5a6cd8caf624efa3bde810e33508e9"}
Apr 20 07:15:28.810752 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:28.810418 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-wvlwd"
Apr 20 07:15:28.811567 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:28.811540 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-8jmcp" event={"ID":"762faaf5-5386-4c3b-9730-c721c0fe3429","Type":"ContainerStarted","Data":"0eea1fe0ea6d38c29283938a400e2c121b37ef6f2773156e0521de58a05500c2"}
Apr 20 07:15:28.835436 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:28.835397 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-wvlwd" podStartSLOduration=1.054054863 podStartE2EDuration="4.835386157s" podCreationTimestamp="2026-04-20 07:15:24 +0000 UTC" firstStartedPulling="2026-04-20 07:15:24.517799479 +0000 UTC m=+759.337867812" lastFinishedPulling="2026-04-20 07:15:28.299130776 +0000 UTC m=+763.119199106" observedRunningTime="2026-04-20 07:15:28.834853765 +0000 UTC m=+763.654922101" watchObservedRunningTime="2026-04-20 07:15:28.835386157 +0000 UTC m=+763.655454494"
Apr 20 07:15:28.852518 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:28.852484 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-f99f4b5cd-8jmcp" podStartSLOduration=1.964108665 podStartE2EDuration="4.852472755s" podCreationTimestamp="2026-04-20 07:15:24 +0000 UTC" firstStartedPulling="2026-04-20 07:15:25.414286055 +0000 UTC m=+760.234354374" lastFinishedPulling="2026-04-20 07:15:28.302650136 +0000 UTC m=+763.122718464" observedRunningTime="2026-04-20 07:15:28.851780539 +0000 UTC m=+763.671848876" watchObservedRunningTime="2026-04-20 07:15:28.852472755 +0000 UTC m=+763.672541075"
Apr 20 07:15:29.224122 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:29.224087 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-8jmcp"]
Apr 20 07:15:30.817446 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:30.817406 2543 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-f99f4b5cd-8jmcp" podUID="762faaf5-5386-4c3b-9730-c721c0fe3429" containerName="authorino" containerID="cri-o://0eea1fe0ea6d38c29283938a400e2c121b37ef6f2773156e0521de58a05500c2" gracePeriod=30
Apr 20 07:15:31.059419 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:31.059396 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-8jmcp"
Apr 20 07:15:31.108405 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:31.108334 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdq8b\" (UniqueName: \"kubernetes.io/projected/762faaf5-5386-4c3b-9730-c721c0fe3429-kube-api-access-pdq8b\") pod \"762faaf5-5386-4c3b-9730-c721c0fe3429\" (UID: \"762faaf5-5386-4c3b-9730-c721c0fe3429\") "
Apr 20 07:15:31.110343 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:31.110310 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/762faaf5-5386-4c3b-9730-c721c0fe3429-kube-api-access-pdq8b" (OuterVolumeSpecName: "kube-api-access-pdq8b") pod "762faaf5-5386-4c3b-9730-c721c0fe3429" (UID: "762faaf5-5386-4c3b-9730-c721c0fe3429"). InnerVolumeSpecName "kube-api-access-pdq8b". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 07:15:31.209333 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:31.209302 2543 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pdq8b\" (UniqueName: \"kubernetes.io/projected/762faaf5-5386-4c3b-9730-c721c0fe3429-kube-api-access-pdq8b\") on node \"ip-10-0-130-105.ec2.internal\" DevicePath \"\""
Apr 20 07:15:31.820738 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:31.820706 2543 generic.go:358] "Generic (PLEG): container finished" podID="762faaf5-5386-4c3b-9730-c721c0fe3429" containerID="0eea1fe0ea6d38c29283938a400e2c121b37ef6f2773156e0521de58a05500c2" exitCode=0
Apr 20 07:15:31.821167 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:31.820756 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-f99f4b5cd-8jmcp"
Apr 20 07:15:31.821167 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:31.820784 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-8jmcp" event={"ID":"762faaf5-5386-4c3b-9730-c721c0fe3429","Type":"ContainerDied","Data":"0eea1fe0ea6d38c29283938a400e2c121b37ef6f2773156e0521de58a05500c2"}
Apr 20 07:15:31.821167 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:31.820821 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-f99f4b5cd-8jmcp" event={"ID":"762faaf5-5386-4c3b-9730-c721c0fe3429","Type":"ContainerDied","Data":"aad0bca06e3497f22338b1c826188b4818a6609747db922b49167dac7dc17037"}
Apr 20 07:15:31.821167 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:31.820836 2543 scope.go:117] "RemoveContainer" containerID="0eea1fe0ea6d38c29283938a400e2c121b37ef6f2773156e0521de58a05500c2"
Apr 20 07:15:31.828604 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:31.828563 2543 scope.go:117] "RemoveContainer" containerID="0eea1fe0ea6d38c29283938a400e2c121b37ef6f2773156e0521de58a05500c2"
Apr 20 07:15:31.828834 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:15:31.828816 2543 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0eea1fe0ea6d38c29283938a400e2c121b37ef6f2773156e0521de58a05500c2\": container with ID starting with 0eea1fe0ea6d38c29283938a400e2c121b37ef6f2773156e0521de58a05500c2 not found: ID does not exist" containerID="0eea1fe0ea6d38c29283938a400e2c121b37ef6f2773156e0521de58a05500c2"
Apr 20 07:15:31.828885 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:31.828844 2543 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0eea1fe0ea6d38c29283938a400e2c121b37ef6f2773156e0521de58a05500c2"} err="failed to get container status \"0eea1fe0ea6d38c29283938a400e2c121b37ef6f2773156e0521de58a05500c2\": rpc error: code = NotFound desc = could not find container \"0eea1fe0ea6d38c29283938a400e2c121b37ef6f2773156e0521de58a05500c2\": container with ID starting with 0eea1fe0ea6d38c29283938a400e2c121b37ef6f2773156e0521de58a05500c2 not found: ID does not exist"
Apr 20 07:15:31.839750 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:31.839725 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-8jmcp"]
Apr 20 07:15:31.843480 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:31.843457 2543 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-f99f4b5cd-8jmcp"]
Apr 20 07:15:33.701595 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:33.701564 2543 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="762faaf5-5386-4c3b-9730-c721c0fe3429" path="/var/lib/kubelet/pods/762faaf5-5386-4c3b-9730-c721c0fe3429/volumes"
Apr 20 07:15:39.815790 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:39.815759 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-wvlwd"
Apr 20 07:15:40.163548 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:40.163463 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-wvlwd"]
Apr 20 07:15:40.163727 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:40.163699 2543 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-wvlwd" podUID="6bde2c79-5887-4d10-962b-babcdc483305" containerName="limitador" containerID="cri-o://c1459359629cc80999780a886b301024de5a6cd8caf624efa3bde810e33508e9" gracePeriod=30
Apr 20 07:15:40.850524 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:40.850443 2543 generic.go:358] "Generic (PLEG): container finished" podID="6bde2c79-5887-4d10-962b-babcdc483305" containerID="c1459359629cc80999780a886b301024de5a6cd8caf624efa3bde810e33508e9" exitCode=0
Apr 20 07:15:40.850524 ip-10-0-130-105
kubenswrapper[2543]: I0420 07:15:40.850495 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-wvlwd" event={"ID":"6bde2c79-5887-4d10-962b-babcdc483305","Type":"ContainerDied","Data":"c1459359629cc80999780a886b301024de5a6cd8caf624efa3bde810e33508e9"} Apr 20 07:15:41.096378 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:41.096355 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-wvlwd" Apr 20 07:15:41.189762 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:41.189732 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-96jd6\" (UniqueName: \"kubernetes.io/projected/6bde2c79-5887-4d10-962b-babcdc483305-kube-api-access-96jd6\") pod \"6bde2c79-5887-4d10-962b-babcdc483305\" (UID: \"6bde2c79-5887-4d10-962b-babcdc483305\") " Apr 20 07:15:41.189903 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:41.189782 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/6bde2c79-5887-4d10-962b-babcdc483305-config-file\") pod \"6bde2c79-5887-4d10-962b-babcdc483305\" (UID: \"6bde2c79-5887-4d10-962b-babcdc483305\") " Apr 20 07:15:41.190146 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:41.190124 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bde2c79-5887-4d10-962b-babcdc483305-config-file" (OuterVolumeSpecName: "config-file") pod "6bde2c79-5887-4d10-962b-babcdc483305" (UID: "6bde2c79-5887-4d10-962b-babcdc483305"). InnerVolumeSpecName "config-file". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 20 07:15:41.191819 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:41.191798 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bde2c79-5887-4d10-962b-babcdc483305-kube-api-access-96jd6" (OuterVolumeSpecName: "kube-api-access-96jd6") pod "6bde2c79-5887-4d10-962b-babcdc483305" (UID: "6bde2c79-5887-4d10-962b-babcdc483305"). InnerVolumeSpecName "kube-api-access-96jd6". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:15:41.290297 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:41.290263 2543 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-96jd6\" (UniqueName: \"kubernetes.io/projected/6bde2c79-5887-4d10-962b-babcdc483305-kube-api-access-96jd6\") on node \"ip-10-0-130-105.ec2.internal\" DevicePath \"\"" Apr 20 07:15:41.290297 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:41.290294 2543 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/6bde2c79-5887-4d10-962b-babcdc483305-config-file\") on node \"ip-10-0-130-105.ec2.internal\" DevicePath \"\"" Apr 20 07:15:41.854507 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:41.854472 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-wvlwd" event={"ID":"6bde2c79-5887-4d10-962b-babcdc483305","Type":"ContainerDied","Data":"cf8423c73e953e6b7db883588ce00c482ebf596b5d92006964988637ab2db3dd"} Apr 20 07:15:41.854507 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:41.854509 2543 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-wvlwd" Apr 20 07:15:41.854976 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:41.854523 2543 scope.go:117] "RemoveContainer" containerID="c1459359629cc80999780a886b301024de5a6cd8caf624efa3bde810e33508e9" Apr 20 07:15:41.872680 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:41.872654 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-wvlwd"] Apr 20 07:15:41.875124 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:41.875104 2543 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-wvlwd"] Apr 20 07:15:43.701876 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:43.701841 2543 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bde2c79-5887-4d10-962b-babcdc483305" path="/var/lib/kubelet/pods/6bde2c79-5887-4d10-962b-babcdc483305/volumes" Apr 20 07:15:58.338028 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.337944 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-g58sd"] Apr 20 07:15:58.338587 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.338402 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6bde2c79-5887-4d10-962b-babcdc483305" containerName="limitador" Apr 20 07:15:58.338587 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.338422 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bde2c79-5887-4d10-962b-babcdc483305" containerName="limitador" Apr 20 07:15:58.338587 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.338451 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="762faaf5-5386-4c3b-9730-c721c0fe3429" containerName="authorino" Apr 20 07:15:58.338587 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.338460 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="762faaf5-5386-4c3b-9730-c721c0fe3429" containerName="authorino" Apr 
20 07:15:58.338827 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.338610 2543 memory_manager.go:356] "RemoveStaleState removing state" podUID="6bde2c79-5887-4d10-962b-babcdc483305" containerName="limitador" Apr 20 07:15:58.338827 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.338624 2543 memory_manager.go:356] "RemoveStaleState removing state" podUID="762faaf5-5386-4c3b-9730-c721c0fe3429" containerName="authorino" Apr 20 07:15:58.346440 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.346390 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-g58sd" Apr 20 07:15:58.347994 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.347968 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-g58sd"] Apr 20 07:15:58.349030 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.349010 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-6clll\"" Apr 20 07:15:58.429205 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.429163 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nkbn\" (UniqueName: \"kubernetes.io/projected/60baacc8-8d03-4d52-a463-24c10ba668f9-kube-api-access-2nkbn\") pod \"authorino-8b475cf9f-g58sd\" (UID: \"60baacc8-8d03-4d52-a463-24c10ba668f9\") " pod="kuadrant-system/authorino-8b475cf9f-g58sd" Apr 20 07:15:58.530132 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.530101 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2nkbn\" (UniqueName: \"kubernetes.io/projected/60baacc8-8d03-4d52-a463-24c10ba668f9-kube-api-access-2nkbn\") pod \"authorino-8b475cf9f-g58sd\" (UID: \"60baacc8-8d03-4d52-a463-24c10ba668f9\") " pod="kuadrant-system/authorino-8b475cf9f-g58sd" Apr 20 07:15:58.530296 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.530191 2543 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-g58sd"] Apr 20 07:15:58.530403 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:15:58.530381 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-2nkbn], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-8b475cf9f-g58sd" podUID="60baacc8-8d03-4d52-a463-24c10ba668f9" Apr 20 07:15:58.550929 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.550896 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nkbn\" (UniqueName: \"kubernetes.io/projected/60baacc8-8d03-4d52-a463-24c10ba668f9-kube-api-access-2nkbn\") pod \"authorino-8b475cf9f-g58sd\" (UID: \"60baacc8-8d03-4d52-a463-24c10ba668f9\") " pod="kuadrant-system/authorino-8b475cf9f-g58sd" Apr 20 07:15:58.586592 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.586556 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-56fdd757f5-kwjnc"] Apr 20 07:15:58.590075 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.590026 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-kwjnc" Apr 20 07:15:58.593279 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.593259 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 20 07:15:58.615614 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.615584 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-kwjnc"] Apr 20 07:15:58.721870 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.721836 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-kwjnc"] Apr 20 07:15:58.722072 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:15:58.722053 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-vmbcn tls-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="kuadrant-system/authorino-56fdd757f5-kwjnc" podUID="1626e385-c672-4d9f-938b-6c50498d2060" Apr 20 07:15:58.731461 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.731434 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmbcn\" (UniqueName: \"kubernetes.io/projected/1626e385-c672-4d9f-938b-6c50498d2060-kube-api-access-vmbcn\") pod \"authorino-56fdd757f5-kwjnc\" (UID: \"1626e385-c672-4d9f-938b-6c50498d2060\") " pod="kuadrant-system/authorino-56fdd757f5-kwjnc" Apr 20 07:15:58.731565 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.731543 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1626e385-c672-4d9f-938b-6c50498d2060-tls-cert\") pod \"authorino-56fdd757f5-kwjnc\" (UID: \"1626e385-c672-4d9f-938b-6c50498d2060\") " pod="kuadrant-system/authorino-56fdd757f5-kwjnc" Apr 20 07:15:58.756910 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.756887 2543 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["kuadrant-system/authorino-745b8f576-lkbm6"] Apr 20 07:15:58.760213 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.760199 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-745b8f576-lkbm6" Apr 20 07:15:58.782301 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.782273 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-745b8f576-lkbm6"] Apr 20 07:15:58.832764 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.832732 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/c8cdec68-d7cf-492f-ae68-ed8c1f894f8a-tls-cert\") pod \"authorino-745b8f576-lkbm6\" (UID: \"c8cdec68-d7cf-492f-ae68-ed8c1f894f8a\") " pod="kuadrant-system/authorino-745b8f576-lkbm6" Apr 20 07:15:58.832870 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.832772 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1626e385-c672-4d9f-938b-6c50498d2060-tls-cert\") pod \"authorino-56fdd757f5-kwjnc\" (UID: \"1626e385-c672-4d9f-938b-6c50498d2060\") " pod="kuadrant-system/authorino-56fdd757f5-kwjnc" Apr 20 07:15:58.832870 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.832793 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tc2k\" (UniqueName: \"kubernetes.io/projected/c8cdec68-d7cf-492f-ae68-ed8c1f894f8a-kube-api-access-7tc2k\") pod \"authorino-745b8f576-lkbm6\" (UID: \"c8cdec68-d7cf-492f-ae68-ed8c1f894f8a\") " pod="kuadrant-system/authorino-745b8f576-lkbm6" Apr 20 07:15:58.832945 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.832864 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vmbcn\" (UniqueName: \"kubernetes.io/projected/1626e385-c672-4d9f-938b-6c50498d2060-kube-api-access-vmbcn\") pod 
\"authorino-56fdd757f5-kwjnc\" (UID: \"1626e385-c672-4d9f-938b-6c50498d2060\") " pod="kuadrant-system/authorino-56fdd757f5-kwjnc" Apr 20 07:15:58.835117 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.835096 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1626e385-c672-4d9f-938b-6c50498d2060-tls-cert\") pod \"authorino-56fdd757f5-kwjnc\" (UID: \"1626e385-c672-4d9f-938b-6c50498d2060\") " pod="kuadrant-system/authorino-56fdd757f5-kwjnc" Apr 20 07:15:58.846190 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.846138 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmbcn\" (UniqueName: \"kubernetes.io/projected/1626e385-c672-4d9f-938b-6c50498d2060-kube-api-access-vmbcn\") pod \"authorino-56fdd757f5-kwjnc\" (UID: \"1626e385-c672-4d9f-938b-6c50498d2060\") " pod="kuadrant-system/authorino-56fdd757f5-kwjnc" Apr 20 07:15:58.903938 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.903914 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-g58sd" Apr 20 07:15:58.904041 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.903918 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-kwjnc" Apr 20 07:15:58.908538 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.908515 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-g58sd" Apr 20 07:15:58.911810 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.911792 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-kwjnc" Apr 20 07:15:58.933773 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.933751 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/c8cdec68-d7cf-492f-ae68-ed8c1f894f8a-tls-cert\") pod \"authorino-745b8f576-lkbm6\" (UID: \"c8cdec68-d7cf-492f-ae68-ed8c1f894f8a\") " pod="kuadrant-system/authorino-745b8f576-lkbm6" Apr 20 07:15:58.933866 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.933790 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tc2k\" (UniqueName: \"kubernetes.io/projected/c8cdec68-d7cf-492f-ae68-ed8c1f894f8a-kube-api-access-7tc2k\") pod \"authorino-745b8f576-lkbm6\" (UID: \"c8cdec68-d7cf-492f-ae68-ed8c1f894f8a\") " pod="kuadrant-system/authorino-745b8f576-lkbm6" Apr 20 07:15:58.936097 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.936075 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/c8cdec68-d7cf-492f-ae68-ed8c1f894f8a-tls-cert\") pod \"authorino-745b8f576-lkbm6\" (UID: \"c8cdec68-d7cf-492f-ae68-ed8c1f894f8a\") " pod="kuadrant-system/authorino-745b8f576-lkbm6" Apr 20 07:15:58.942384 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:58.942359 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tc2k\" (UniqueName: \"kubernetes.io/projected/c8cdec68-d7cf-492f-ae68-ed8c1f894f8a-kube-api-access-7tc2k\") pod \"authorino-745b8f576-lkbm6\" (UID: \"c8cdec68-d7cf-492f-ae68-ed8c1f894f8a\") " pod="kuadrant-system/authorino-745b8f576-lkbm6" Apr 20 07:15:59.035036 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:59.035008 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmbcn\" (UniqueName: \"kubernetes.io/projected/1626e385-c672-4d9f-938b-6c50498d2060-kube-api-access-vmbcn\") pod 
\"1626e385-c672-4d9f-938b-6c50498d2060\" (UID: \"1626e385-c672-4d9f-938b-6c50498d2060\") " Apr 20 07:15:59.035177 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:59.035048 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1626e385-c672-4d9f-938b-6c50498d2060-tls-cert\") pod \"1626e385-c672-4d9f-938b-6c50498d2060\" (UID: \"1626e385-c672-4d9f-938b-6c50498d2060\") " Apr 20 07:15:59.035177 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:59.035090 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nkbn\" (UniqueName: \"kubernetes.io/projected/60baacc8-8d03-4d52-a463-24c10ba668f9-kube-api-access-2nkbn\") pod \"60baacc8-8d03-4d52-a463-24c10ba668f9\" (UID: \"60baacc8-8d03-4d52-a463-24c10ba668f9\") " Apr 20 07:15:59.037224 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:59.037198 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1626e385-c672-4d9f-938b-6c50498d2060-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "1626e385-c672-4d9f-938b-6c50498d2060" (UID: "1626e385-c672-4d9f-938b-6c50498d2060"). InnerVolumeSpecName "tls-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:15:59.037325 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:59.037238 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60baacc8-8d03-4d52-a463-24c10ba668f9-kube-api-access-2nkbn" (OuterVolumeSpecName: "kube-api-access-2nkbn") pod "60baacc8-8d03-4d52-a463-24c10ba668f9" (UID: "60baacc8-8d03-4d52-a463-24c10ba668f9"). InnerVolumeSpecName "kube-api-access-2nkbn". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:15:59.037325 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:59.037242 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1626e385-c672-4d9f-938b-6c50498d2060-kube-api-access-vmbcn" (OuterVolumeSpecName: "kube-api-access-vmbcn") pod "1626e385-c672-4d9f-938b-6c50498d2060" (UID: "1626e385-c672-4d9f-938b-6c50498d2060"). InnerVolumeSpecName "kube-api-access-vmbcn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:15:59.068633 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:59.068607 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-745b8f576-lkbm6" Apr 20 07:15:59.135800 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:59.135772 2543 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2nkbn\" (UniqueName: \"kubernetes.io/projected/60baacc8-8d03-4d52-a463-24c10ba668f9-kube-api-access-2nkbn\") on node \"ip-10-0-130-105.ec2.internal\" DevicePath \"\"" Apr 20 07:15:59.135800 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:59.135799 2543 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vmbcn\" (UniqueName: \"kubernetes.io/projected/1626e385-c672-4d9f-938b-6c50498d2060-kube-api-access-vmbcn\") on node \"ip-10-0-130-105.ec2.internal\" DevicePath \"\"" Apr 20 07:15:59.135959 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:59.135812 2543 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/1626e385-c672-4d9f-938b-6c50498d2060-tls-cert\") on node \"ip-10-0-130-105.ec2.internal\" DevicePath \"\"" Apr 20 07:15:59.190469 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:59.190325 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-745b8f576-lkbm6"] Apr 20 07:15:59.193121 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:15:59.193096 2543 manager.go:1169] Failed to 
process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8cdec68_d7cf_492f_ae68_ed8c1f894f8a.slice/crio-eadaf367546a1907252ec8a1fe773c160daa94c81cf78641a1c0c9f106b4f1e9 WatchSource:0}: Error finding container eadaf367546a1907252ec8a1fe773c160daa94c81cf78641a1c0c9f106b4f1e9: Status 404 returned error can't find the container with id eadaf367546a1907252ec8a1fe773c160daa94c81cf78641a1c0c9f106b4f1e9 Apr 20 07:15:59.908630 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:59.908538 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-745b8f576-lkbm6" event={"ID":"c8cdec68-d7cf-492f-ae68-ed8c1f894f8a","Type":"ContainerStarted","Data":"f55b3678e9e34424536f677facaaba2d7dc32e86c5de9eb96408e76cf19c59ee"} Apr 20 07:15:59.908630 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:59.908568 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-56fdd757f5-kwjnc" Apr 20 07:15:59.908630 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:59.908582 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-745b8f576-lkbm6" event={"ID":"c8cdec68-d7cf-492f-ae68-ed8c1f894f8a","Type":"ContainerStarted","Data":"eadaf367546a1907252ec8a1fe773c160daa94c81cf78641a1c0c9f106b4f1e9"} Apr 20 07:15:59.908630 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:59.908559 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-g58sd" Apr 20 07:15:59.925253 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:59.925204 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-745b8f576-lkbm6" podStartSLOduration=1.511909185 podStartE2EDuration="1.925190858s" podCreationTimestamp="2026-04-20 07:15:58 +0000 UTC" firstStartedPulling="2026-04-20 07:15:59.194322858 +0000 UTC m=+794.014391173" lastFinishedPulling="2026-04-20 07:15:59.607604527 +0000 UTC m=+794.427672846" observedRunningTime="2026-04-20 07:15:59.923696864 +0000 UTC m=+794.743765192" watchObservedRunningTime="2026-04-20 07:15:59.925190858 +0000 UTC m=+794.745259194" Apr 20 07:15:59.949717 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:59.949688 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-g58sd"] Apr 20 07:15:59.964601 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:59.964574 2543 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-g58sd"] Apr 20 07:15:59.994945 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:59.994912 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-kwjnc"] Apr 20 07:15:59.999205 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:15:59.999178 2543 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-56fdd757f5-kwjnc"] Apr 20 07:16:00.851969 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:00.851935 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-p6vfd"] Apr 20 07:16:00.855121 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:00.855104 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-p6vfd" Apr 20 07:16:00.858767 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:00.858745 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-hrfk2\"" Apr 20 07:16:00.868116 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:00.868091 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-p6vfd"] Apr 20 07:16:00.951514 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:00.951474 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldc4v\" (UniqueName: \"kubernetes.io/projected/d4e43346-3c6a-4e7a-9008-60344bb0b0c8-kube-api-access-ldc4v\") pod \"maas-controller-6d4c8f55f9-p6vfd\" (UID: \"d4e43346-3c6a-4e7a-9008-60344bb0b0c8\") " pod="opendatahub/maas-controller-6d4c8f55f9-p6vfd" Apr 20 07:16:01.002836 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:01.002806 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-5b8f949ccf-wxs9r"] Apr 20 07:16:01.005925 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:01.005909 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-5b8f949ccf-wxs9r"
Apr 20 07:16:01.015114 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:01.015087 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-5b8f949ccf-wxs9r"]
Apr 20 07:16:01.052183 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:01.052148 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldc4v\" (UniqueName: \"kubernetes.io/projected/d4e43346-3c6a-4e7a-9008-60344bb0b0c8-kube-api-access-ldc4v\") pod \"maas-controller-6d4c8f55f9-p6vfd\" (UID: \"d4e43346-3c6a-4e7a-9008-60344bb0b0c8\") " pod="opendatahub/maas-controller-6d4c8f55f9-p6vfd"
Apr 20 07:16:01.062411 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:01.062389 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldc4v\" (UniqueName: \"kubernetes.io/projected/d4e43346-3c6a-4e7a-9008-60344bb0b0c8-kube-api-access-ldc4v\") pod \"maas-controller-6d4c8f55f9-p6vfd\" (UID: \"d4e43346-3c6a-4e7a-9008-60344bb0b0c8\") " pod="opendatahub/maas-controller-6d4c8f55f9-p6vfd"
Apr 20 07:16:01.110525 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:01.110442 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5b8f949ccf-wxs9r"]
Apr 20 07:16:01.110709 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:16:01.110688 2543 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-wh78q], unattached volumes=[], failed to process volumes=[]: context canceled" pod="opendatahub/maas-controller-5b8f949ccf-wxs9r" podUID="df74b8e6-2a75-4a9a-aa18-addb2f056254"
Apr 20 07:16:01.147405 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:01.147365 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-66594bd994-lqkd8"]
Apr 20 07:16:01.150725 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:01.150703 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-66594bd994-lqkd8"
Apr 20 07:16:01.153104 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:01.153080 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wh78q\" (UniqueName: \"kubernetes.io/projected/df74b8e6-2a75-4a9a-aa18-addb2f056254-kube-api-access-wh78q\") pod \"maas-controller-5b8f949ccf-wxs9r\" (UID: \"df74b8e6-2a75-4a9a-aa18-addb2f056254\") " pod="opendatahub/maas-controller-5b8f949ccf-wxs9r"
Apr 20 07:16:01.161538 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:01.161503 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-66594bd994-lqkd8"]
Apr 20 07:16:01.164224 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:01.164203 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-p6vfd"
Apr 20 07:16:01.254315 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:01.254284 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8ncc\" (UniqueName: \"kubernetes.io/projected/a08577d2-082f-481c-ad89-729b2d38d302-kube-api-access-z8ncc\") pod \"maas-controller-66594bd994-lqkd8\" (UID: \"a08577d2-082f-481c-ad89-729b2d38d302\") " pod="opendatahub/maas-controller-66594bd994-lqkd8"
Apr 20 07:16:01.254477 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:01.254351 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wh78q\" (UniqueName: \"kubernetes.io/projected/df74b8e6-2a75-4a9a-aa18-addb2f056254-kube-api-access-wh78q\") pod \"maas-controller-5b8f949ccf-wxs9r\" (UID: \"df74b8e6-2a75-4a9a-aa18-addb2f056254\") " pod="opendatahub/maas-controller-5b8f949ccf-wxs9r"
Apr 20 07:16:01.264172 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:01.264144 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wh78q\" (UniqueName: \"kubernetes.io/projected/df74b8e6-2a75-4a9a-aa18-addb2f056254-kube-api-access-wh78q\") pod \"maas-controller-5b8f949ccf-wxs9r\" (UID: \"df74b8e6-2a75-4a9a-aa18-addb2f056254\") " pod="opendatahub/maas-controller-5b8f949ccf-wxs9r"
Apr 20 07:16:01.280002 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:01.279977 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-p6vfd"]
Apr 20 07:16:01.282072 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:16:01.282048 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4e43346_3c6a_4e7a_9008_60344bb0b0c8.slice/crio-ac0308dc303c5573db5df5617020a916820c5ae5c8afb2d39fb747854492f5a3 WatchSource:0}: Error finding container ac0308dc303c5573db5df5617020a916820c5ae5c8afb2d39fb747854492f5a3: Status 404 returned error can't find the container with id ac0308dc303c5573db5df5617020a916820c5ae5c8afb2d39fb747854492f5a3
Apr 20 07:16:01.355208 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:01.355174 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z8ncc\" (UniqueName: \"kubernetes.io/projected/a08577d2-082f-481c-ad89-729b2d38d302-kube-api-access-z8ncc\") pod \"maas-controller-66594bd994-lqkd8\" (UID: \"a08577d2-082f-481c-ad89-729b2d38d302\") " pod="opendatahub/maas-controller-66594bd994-lqkd8"
Apr 20 07:16:01.364123 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:01.364055 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8ncc\" (UniqueName: \"kubernetes.io/projected/a08577d2-082f-481c-ad89-729b2d38d302-kube-api-access-z8ncc\") pod \"maas-controller-66594bd994-lqkd8\" (UID: \"a08577d2-082f-481c-ad89-729b2d38d302\") " pod="opendatahub/maas-controller-66594bd994-lqkd8"
Apr 20 07:16:01.460464 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:01.460428 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-66594bd994-lqkd8"
Apr 20 07:16:01.574389 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:01.574364 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-66594bd994-lqkd8"]
Apr 20 07:16:01.576790 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:16:01.576760 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda08577d2_082f_481c_ad89_729b2d38d302.slice/crio-bfa3fa5d236e987f1c8f2d0ef79cd381324c296ebe00d007274972d70bd60341 WatchSource:0}: Error finding container bfa3fa5d236e987f1c8f2d0ef79cd381324c296ebe00d007274972d70bd60341: Status 404 returned error can't find the container with id bfa3fa5d236e987f1c8f2d0ef79cd381324c296ebe00d007274972d70bd60341
Apr 20 07:16:01.701329 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:01.701295 2543 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1626e385-c672-4d9f-938b-6c50498d2060" path="/var/lib/kubelet/pods/1626e385-c672-4d9f-938b-6c50498d2060/volumes"
Apr 20 07:16:01.702185 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:01.702023 2543 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60baacc8-8d03-4d52-a463-24c10ba668f9" path="/var/lib/kubelet/pods/60baacc8-8d03-4d52-a463-24c10ba668f9/volumes"
Apr 20 07:16:01.916574 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:01.916516 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-66594bd994-lqkd8" event={"ID":"a08577d2-082f-481c-ad89-729b2d38d302","Type":"ContainerStarted","Data":"bfa3fa5d236e987f1c8f2d0ef79cd381324c296ebe00d007274972d70bd60341"}
Apr 20 07:16:01.918048 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:01.917994 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-p6vfd" event={"ID":"d4e43346-3c6a-4e7a-9008-60344bb0b0c8","Type":"ContainerStarted","Data":"ac0308dc303c5573db5df5617020a916820c5ae5c8afb2d39fb747854492f5a3"}
Apr 20 07:16:01.918048 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:01.918036 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5b8f949ccf-wxs9r"
Apr 20 07:16:01.924611 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:01.924575 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5b8f949ccf-wxs9r"
Apr 20 07:16:02.061355 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:02.061273 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wh78q\" (UniqueName: \"kubernetes.io/projected/df74b8e6-2a75-4a9a-aa18-addb2f056254-kube-api-access-wh78q\") pod \"df74b8e6-2a75-4a9a-aa18-addb2f056254\" (UID: \"df74b8e6-2a75-4a9a-aa18-addb2f056254\") "
Apr 20 07:16:02.064177 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:02.064129 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df74b8e6-2a75-4a9a-aa18-addb2f056254-kube-api-access-wh78q" (OuterVolumeSpecName: "kube-api-access-wh78q") pod "df74b8e6-2a75-4a9a-aa18-addb2f056254" (UID: "df74b8e6-2a75-4a9a-aa18-addb2f056254"). InnerVolumeSpecName "kube-api-access-wh78q". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 07:16:02.162520 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:02.162470 2543 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wh78q\" (UniqueName: \"kubernetes.io/projected/df74b8e6-2a75-4a9a-aa18-addb2f056254-kube-api-access-wh78q\") on node \"ip-10-0-130-105.ec2.internal\" DevicePath \"\""
Apr 20 07:16:02.921264 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:02.921141 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-5b8f949ccf-wxs9r"
Apr 20 07:16:02.955618 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:02.955553 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-5b8f949ccf-wxs9r"]
Apr 20 07:16:02.961918 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:02.961889 2543 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-5b8f949ccf-wxs9r"]
Apr 20 07:16:03.702988 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:03.702954 2543 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df74b8e6-2a75-4a9a-aa18-addb2f056254" path="/var/lib/kubelet/pods/df74b8e6-2a75-4a9a-aa18-addb2f056254/volumes"
Apr 20 07:16:04.928723 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:04.928687 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-66594bd994-lqkd8" event={"ID":"a08577d2-082f-481c-ad89-729b2d38d302","Type":"ContainerStarted","Data":"38ee69d53f2a183a6c01d84dda6088f7fefe87c1246959ee5df46dd998672e76"}
Apr 20 07:16:04.929149 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:04.928854 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-66594bd994-lqkd8"
Apr 20 07:16:04.930002 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:04.929981 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-p6vfd" event={"ID":"d4e43346-3c6a-4e7a-9008-60344bb0b0c8","Type":"ContainerStarted","Data":"03a66182eaecf106d2cfab5519de5e14311ea99bca1b13a7342608a887f13b04"}
Apr 20 07:16:04.930090 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:04.930078 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-p6vfd"
Apr 20 07:16:04.953595 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:04.953555 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-66594bd994-lqkd8" podStartSLOduration=1.281700874 podStartE2EDuration="3.953545393s" podCreationTimestamp="2026-04-20 07:16:01 +0000 UTC" firstStartedPulling="2026-04-20 07:16:01.578030839 +0000 UTC m=+796.398099155" lastFinishedPulling="2026-04-20 07:16:04.249875356 +0000 UTC m=+799.069943674" observedRunningTime="2026-04-20 07:16:04.951374851 +0000 UTC m=+799.771443187" watchObservedRunningTime="2026-04-20 07:16:04.953545393 +0000 UTC m=+799.773613730"
Apr 20 07:16:04.973222 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:04.973180 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-p6vfd" podStartSLOduration=2.009926694 podStartE2EDuration="4.973170108s" podCreationTimestamp="2026-04-20 07:16:00 +0000 UTC" firstStartedPulling="2026-04-20 07:16:01.283388419 +0000 UTC m=+796.103456737" lastFinishedPulling="2026-04-20 07:16:04.246631831 +0000 UTC m=+799.066700151" observedRunningTime="2026-04-20 07:16:04.970995031 +0000 UTC m=+799.791063368" watchObservedRunningTime="2026-04-20 07:16:04.973170108 +0000 UTC m=+799.793238444"
Apr 20 07:16:06.361052 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:06.361020 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-855c947695-hfl2t"]
Apr 20 07:16:06.364557 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:06.364539 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-855c947695-hfl2t"
Apr 20 07:16:06.367045 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:06.367024 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\""
Apr 20 07:16:06.367150 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:06.367052 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 20 07:16:06.374398 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:06.374373 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-855c947695-hfl2t"]
Apr 20 07:16:06.503002 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:06.502968 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/d0205875-d5c7-4267-a18c-ba73e15ec183-maas-api-tls\") pod \"maas-api-855c947695-hfl2t\" (UID: \"d0205875-d5c7-4267-a18c-ba73e15ec183\") " pod="opendatahub/maas-api-855c947695-hfl2t"
Apr 20 07:16:06.503178 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:06.503027 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt2dk\" (UniqueName: \"kubernetes.io/projected/d0205875-d5c7-4267-a18c-ba73e15ec183-kube-api-access-xt2dk\") pod \"maas-api-855c947695-hfl2t\" (UID: \"d0205875-d5c7-4267-a18c-ba73e15ec183\") " pod="opendatahub/maas-api-855c947695-hfl2t"
Apr 20 07:16:06.604276 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:06.604237 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xt2dk\" (UniqueName: \"kubernetes.io/projected/d0205875-d5c7-4267-a18c-ba73e15ec183-kube-api-access-xt2dk\") pod \"maas-api-855c947695-hfl2t\" (UID: \"d0205875-d5c7-4267-a18c-ba73e15ec183\") " pod="opendatahub/maas-api-855c947695-hfl2t"
Apr 20 07:16:06.604437 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:06.604361 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/d0205875-d5c7-4267-a18c-ba73e15ec183-maas-api-tls\") pod \"maas-api-855c947695-hfl2t\" (UID: \"d0205875-d5c7-4267-a18c-ba73e15ec183\") " pod="opendatahub/maas-api-855c947695-hfl2t"
Apr 20 07:16:06.604491 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:16:06.604457 2543 secret.go:189] Couldn't get secret opendatahub/maas-api-serving-cert: secret "maas-api-serving-cert" not found
Apr 20 07:16:06.604525 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:16:06.604515 2543 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d0205875-d5c7-4267-a18c-ba73e15ec183-maas-api-tls podName:d0205875-d5c7-4267-a18c-ba73e15ec183 nodeName:}" failed. No retries permitted until 2026-04-20 07:16:07.104495544 +0000 UTC m=+801.924563863 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "maas-api-tls" (UniqueName: "kubernetes.io/secret/d0205875-d5c7-4267-a18c-ba73e15ec183-maas-api-tls") pod "maas-api-855c947695-hfl2t" (UID: "d0205875-d5c7-4267-a18c-ba73e15ec183") : secret "maas-api-serving-cert" not found
Apr 20 07:16:06.613273 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:06.613204 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt2dk\" (UniqueName: \"kubernetes.io/projected/d0205875-d5c7-4267-a18c-ba73e15ec183-kube-api-access-xt2dk\") pod \"maas-api-855c947695-hfl2t\" (UID: \"d0205875-d5c7-4267-a18c-ba73e15ec183\") " pod="opendatahub/maas-api-855c947695-hfl2t"
Apr 20 07:16:07.108051 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:07.107998 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/d0205875-d5c7-4267-a18c-ba73e15ec183-maas-api-tls\") pod \"maas-api-855c947695-hfl2t\" (UID: \"d0205875-d5c7-4267-a18c-ba73e15ec183\") " pod="opendatahub/maas-api-855c947695-hfl2t"
Apr 20 07:16:07.110371 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:07.110336 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/d0205875-d5c7-4267-a18c-ba73e15ec183-maas-api-tls\") pod \"maas-api-855c947695-hfl2t\" (UID: \"d0205875-d5c7-4267-a18c-ba73e15ec183\") " pod="opendatahub/maas-api-855c947695-hfl2t"
Apr 20 07:16:07.276779 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:07.276747 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-855c947695-hfl2t"
Apr 20 07:16:07.397659 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:07.397594 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-855c947695-hfl2t"]
Apr 20 07:16:07.400036 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:16:07.399998 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0205875_d5c7_4267_a18c_ba73e15ec183.slice/crio-d587b1af3b6c084c94bc9fc2c213ae7144eb89fa150100ca65ebc63417780314 WatchSource:0}: Error finding container d587b1af3b6c084c94bc9fc2c213ae7144eb89fa150100ca65ebc63417780314: Status 404 returned error can't find the container with id d587b1af3b6c084c94bc9fc2c213ae7144eb89fa150100ca65ebc63417780314
Apr 20 07:16:07.940167 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:07.940118 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-855c947695-hfl2t" event={"ID":"d0205875-d5c7-4267-a18c-ba73e15ec183","Type":"ContainerStarted","Data":"d587b1af3b6c084c94bc9fc2c213ae7144eb89fa150100ca65ebc63417780314"}
Apr 20 07:16:08.943909 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:08.943873 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-855c947695-hfl2t" event={"ID":"d0205875-d5c7-4267-a18c-ba73e15ec183","Type":"ContainerStarted","Data":"1dd9ab788fb54d3d895dcffc22d25b4f43c6755be442eec7b8c7fc5862d82053"}
Apr 20 07:16:08.944287 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:08.944015 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-855c947695-hfl2t"
Apr 20 07:16:14.952045 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:14.952016 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-855c947695-hfl2t"
Apr 20 07:16:14.971353 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:14.971300 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-855c947695-hfl2t" podStartSLOduration=7.578367927 podStartE2EDuration="8.971282973s" podCreationTimestamp="2026-04-20 07:16:06 +0000 UTC" firstStartedPulling="2026-04-20 07:16:07.407812732 +0000 UTC m=+802.227881057" lastFinishedPulling="2026-04-20 07:16:08.800727789 +0000 UTC m=+803.620796103" observedRunningTime="2026-04-20 07:16:08.963785969 +0000 UTC m=+803.783854308" watchObservedRunningTime="2026-04-20 07:16:14.971282973 +0000 UTC m=+809.791351311"
Apr 20 07:16:15.937943 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:15.937909 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-66594bd994-lqkd8"
Apr 20 07:16:15.938151 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:15.937994 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6d4c8f55f9-p6vfd"
Apr 20 07:16:16.000804 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:16.000773 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-p6vfd"]
Apr 20 07:16:16.001191 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:16.000990 2543 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-p6vfd" podUID="d4e43346-3c6a-4e7a-9008-60344bb0b0c8" containerName="manager" containerID="cri-o://03a66182eaecf106d2cfab5519de5e14311ea99bca1b13a7342608a887f13b04" gracePeriod=10
Apr 20 07:16:16.243181 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:16.243158 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-p6vfd"
Apr 20 07:16:16.278905 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:16.278875 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ldc4v\" (UniqueName: \"kubernetes.io/projected/d4e43346-3c6a-4e7a-9008-60344bb0b0c8-kube-api-access-ldc4v\") pod \"d4e43346-3c6a-4e7a-9008-60344bb0b0c8\" (UID: \"d4e43346-3c6a-4e7a-9008-60344bb0b0c8\") "
Apr 20 07:16:16.281205 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:16.281170 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d4e43346-3c6a-4e7a-9008-60344bb0b0c8-kube-api-access-ldc4v" (OuterVolumeSpecName: "kube-api-access-ldc4v") pod "d4e43346-3c6a-4e7a-9008-60344bb0b0c8" (UID: "d4e43346-3c6a-4e7a-9008-60344bb0b0c8"). InnerVolumeSpecName "kube-api-access-ldc4v". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 07:16:16.379438 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:16.379397 2543 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ldc4v\" (UniqueName: \"kubernetes.io/projected/d4e43346-3c6a-4e7a-9008-60344bb0b0c8-kube-api-access-ldc4v\") on node \"ip-10-0-130-105.ec2.internal\" DevicePath \"\""
Apr 20 07:16:16.970308 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:16.970269 2543 generic.go:358] "Generic (PLEG): container finished" podID="d4e43346-3c6a-4e7a-9008-60344bb0b0c8" containerID="03a66182eaecf106d2cfab5519de5e14311ea99bca1b13a7342608a887f13b04" exitCode=0
Apr 20 07:16:16.970457 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:16.970338 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-p6vfd"
Apr 20 07:16:16.970457 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:16.970355 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-p6vfd" event={"ID":"d4e43346-3c6a-4e7a-9008-60344bb0b0c8","Type":"ContainerDied","Data":"03a66182eaecf106d2cfab5519de5e14311ea99bca1b13a7342608a887f13b04"}
Apr 20 07:16:16.970457 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:16.970388 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-p6vfd" event={"ID":"d4e43346-3c6a-4e7a-9008-60344bb0b0c8","Type":"ContainerDied","Data":"ac0308dc303c5573db5df5617020a916820c5ae5c8afb2d39fb747854492f5a3"}
Apr 20 07:16:16.970457 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:16.970403 2543 scope.go:117] "RemoveContainer" containerID="03a66182eaecf106d2cfab5519de5e14311ea99bca1b13a7342608a887f13b04"
Apr 20 07:16:16.978130 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:16.978106 2543 scope.go:117] "RemoveContainer" containerID="03a66182eaecf106d2cfab5519de5e14311ea99bca1b13a7342608a887f13b04"
Apr 20 07:16:16.978401 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:16:16.978376 2543 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03a66182eaecf106d2cfab5519de5e14311ea99bca1b13a7342608a887f13b04\": container with ID starting with 03a66182eaecf106d2cfab5519de5e14311ea99bca1b13a7342608a887f13b04 not found: ID does not exist" containerID="03a66182eaecf106d2cfab5519de5e14311ea99bca1b13a7342608a887f13b04"
Apr 20 07:16:16.978464 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:16.978414 2543 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03a66182eaecf106d2cfab5519de5e14311ea99bca1b13a7342608a887f13b04"} err="failed to get container status \"03a66182eaecf106d2cfab5519de5e14311ea99bca1b13a7342608a887f13b04\": rpc error: code = NotFound desc = could not find container \"03a66182eaecf106d2cfab5519de5e14311ea99bca1b13a7342608a887f13b04\": container with ID starting with 03a66182eaecf106d2cfab5519de5e14311ea99bca1b13a7342608a887f13b04 not found: ID does not exist"
Apr 20 07:16:16.992282 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:16.992260 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-p6vfd"]
Apr 20 07:16:16.995051 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:16.995030 2543 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-p6vfd"]
Apr 20 07:16:17.701375 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:17.701341 2543 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d4e43346-3c6a-4e7a-9008-60344bb0b0c8" path="/var/lib/kubelet/pods/d4e43346-3c6a-4e7a-9008-60344bb0b0c8/volumes"
Apr 20 07:16:30.937971 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:30.937934 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-66594bd994-lqkd8"]
Apr 20 07:16:30.938503 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:30.938192 2543 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-66594bd994-lqkd8" podUID="a08577d2-082f-481c-ad89-729b2d38d302" containerName="manager" containerID="cri-o://38ee69d53f2a183a6c01d84dda6088f7fefe87c1246959ee5df46dd998672e76" gracePeriod=10
Apr 20 07:16:31.174878 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:31.174851 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-66594bd994-lqkd8"
Apr 20 07:16:31.195873 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:31.195802 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8ncc\" (UniqueName: \"kubernetes.io/projected/a08577d2-082f-481c-ad89-729b2d38d302-kube-api-access-z8ncc\") pod \"a08577d2-082f-481c-ad89-729b2d38d302\" (UID: \"a08577d2-082f-481c-ad89-729b2d38d302\") "
Apr 20 07:16:31.197959 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:31.197924 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a08577d2-082f-481c-ad89-729b2d38d302-kube-api-access-z8ncc" (OuterVolumeSpecName: "kube-api-access-z8ncc") pod "a08577d2-082f-481c-ad89-729b2d38d302" (UID: "a08577d2-082f-481c-ad89-729b2d38d302"). InnerVolumeSpecName "kube-api-access-z8ncc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 07:16:31.297277 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:31.297252 2543 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z8ncc\" (UniqueName: \"kubernetes.io/projected/a08577d2-082f-481c-ad89-729b2d38d302-kube-api-access-z8ncc\") on node \"ip-10-0-130-105.ec2.internal\" DevicePath \"\""
Apr 20 07:16:32.020964 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:32.020882 2543 generic.go:358] "Generic (PLEG): container finished" podID="a08577d2-082f-481c-ad89-729b2d38d302" containerID="38ee69d53f2a183a6c01d84dda6088f7fefe87c1246959ee5df46dd998672e76" exitCode=0
Apr 20 07:16:32.020964 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:32.020949 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-66594bd994-lqkd8"
Apr 20 07:16:32.021401 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:32.020950 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-66594bd994-lqkd8" event={"ID":"a08577d2-082f-481c-ad89-729b2d38d302","Type":"ContainerDied","Data":"38ee69d53f2a183a6c01d84dda6088f7fefe87c1246959ee5df46dd998672e76"}
Apr 20 07:16:32.021401 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:32.021049 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-66594bd994-lqkd8" event={"ID":"a08577d2-082f-481c-ad89-729b2d38d302","Type":"ContainerDied","Data":"bfa3fa5d236e987f1c8f2d0ef79cd381324c296ebe00d007274972d70bd60341"}
Apr 20 07:16:32.021401 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:32.021064 2543 scope.go:117] "RemoveContainer" containerID="38ee69d53f2a183a6c01d84dda6088f7fefe87c1246959ee5df46dd998672e76"
Apr 20 07:16:32.028679 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:32.028659 2543 scope.go:117] "RemoveContainer" containerID="38ee69d53f2a183a6c01d84dda6088f7fefe87c1246959ee5df46dd998672e76"
Apr 20 07:16:32.028946 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:16:32.028927 2543 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"38ee69d53f2a183a6c01d84dda6088f7fefe87c1246959ee5df46dd998672e76\": container with ID starting with 38ee69d53f2a183a6c01d84dda6088f7fefe87c1246959ee5df46dd998672e76 not found: ID does not exist" containerID="38ee69d53f2a183a6c01d84dda6088f7fefe87c1246959ee5df46dd998672e76"
Apr 20 07:16:32.029000 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:32.028955 2543 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"38ee69d53f2a183a6c01d84dda6088f7fefe87c1246959ee5df46dd998672e76"} err="failed to get container status \"38ee69d53f2a183a6c01d84dda6088f7fefe87c1246959ee5df46dd998672e76\": rpc error: code = NotFound desc = could not find container \"38ee69d53f2a183a6c01d84dda6088f7fefe87c1246959ee5df46dd998672e76\": container with ID starting with 38ee69d53f2a183a6c01d84dda6088f7fefe87c1246959ee5df46dd998672e76 not found: ID does not exist"
Apr 20 07:16:32.037308 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:32.037285 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-66594bd994-lqkd8"]
Apr 20 07:16:32.044061 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:32.044041 2543 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-66594bd994-lqkd8"]
Apr 20 07:16:33.701798 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:33.701767 2543 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a08577d2-082f-481c-ad89-729b2d38d302" path="/var/lib/kubelet/pods/a08577d2-082f-481c-ad89-729b2d38d302/volumes"
Apr 20 07:16:38.971621 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:38.971581 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-966769f95-qcm9x"]
Apr 20 07:16:38.972118 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:38.972028 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a08577d2-082f-481c-ad89-729b2d38d302" containerName="manager"
Apr 20 07:16:38.972118 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:38.972047 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="a08577d2-082f-481c-ad89-729b2d38d302" containerName="manager"
Apr 20 07:16:38.972118 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:38.972059 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d4e43346-3c6a-4e7a-9008-60344bb0b0c8" containerName="manager"
Apr 20 07:16:38.972118 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:38.972067 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="d4e43346-3c6a-4e7a-9008-60344bb0b0c8" containerName="manager"
Apr 20 07:16:38.972319 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:38.972137 2543 memory_manager.go:356] "RemoveStaleState removing state" podUID="d4e43346-3c6a-4e7a-9008-60344bb0b0c8" containerName="manager"
Apr 20 07:16:38.972319 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:38.972149 2543 memory_manager.go:356] "RemoveStaleState removing state" podUID="a08577d2-082f-481c-ad89-729b2d38d302" containerName="manager"
Apr 20 07:16:38.977569 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:38.977546 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-966769f95-qcm9x"
Apr 20 07:16:38.980267 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:38.980249 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-rdqfb\""
Apr 20 07:16:38.985358 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:38.985334 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-966769f95-qcm9x"]
Apr 20 07:16:39.057200 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:39.057168 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-787zr\" (UniqueName: \"kubernetes.io/projected/a87a2c0d-8732-425f-a4cb-36f5fb3d4df6-kube-api-access-787zr\") pod \"maas-api-966769f95-qcm9x\" (UID: \"a87a2c0d-8732-425f-a4cb-36f5fb3d4df6\") " pod="opendatahub/maas-api-966769f95-qcm9x"
Apr 20 07:16:39.057364 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:39.057217 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/a87a2c0d-8732-425f-a4cb-36f5fb3d4df6-maas-api-tls\") pod \"maas-api-966769f95-qcm9x\" (UID: \"a87a2c0d-8732-425f-a4cb-36f5fb3d4df6\") " pod="opendatahub/maas-api-966769f95-qcm9x"
Apr 20 07:16:39.158616 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:39.158581 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-787zr\" (UniqueName: \"kubernetes.io/projected/a87a2c0d-8732-425f-a4cb-36f5fb3d4df6-kube-api-access-787zr\") pod \"maas-api-966769f95-qcm9x\" (UID: \"a87a2c0d-8732-425f-a4cb-36f5fb3d4df6\") " pod="opendatahub/maas-api-966769f95-qcm9x"
Apr 20 07:16:39.158830 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:39.158665 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/a87a2c0d-8732-425f-a4cb-36f5fb3d4df6-maas-api-tls\") pod \"maas-api-966769f95-qcm9x\" (UID: \"a87a2c0d-8732-425f-a4cb-36f5fb3d4df6\") " pod="opendatahub/maas-api-966769f95-qcm9x"
Apr 20 07:16:39.161098 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:39.161073 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/a87a2c0d-8732-425f-a4cb-36f5fb3d4df6-maas-api-tls\") pod \"maas-api-966769f95-qcm9x\" (UID: \"a87a2c0d-8732-425f-a4cb-36f5fb3d4df6\") " pod="opendatahub/maas-api-966769f95-qcm9x"
Apr 20 07:16:39.167373 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:39.167353 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-787zr\" (UniqueName: \"kubernetes.io/projected/a87a2c0d-8732-425f-a4cb-36f5fb3d4df6-kube-api-access-787zr\") pod \"maas-api-966769f95-qcm9x\" (UID: \"a87a2c0d-8732-425f-a4cb-36f5fb3d4df6\") " pod="opendatahub/maas-api-966769f95-qcm9x"
Apr 20 07:16:39.288342 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:39.288254 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-966769f95-qcm9x"
Apr 20 07:16:39.408394 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:39.408370 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-966769f95-qcm9x"]
Apr 20 07:16:39.410750 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:16:39.410720 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda87a2c0d_8732_425f_a4cb_36f5fb3d4df6.slice/crio-5328e7f4c9e8a6a341c9d7670554179ff046f6df556ea9ffd7ef53a249ea4bb8 WatchSource:0}: Error finding container 5328e7f4c9e8a6a341c9d7670554179ff046f6df556ea9ffd7ef53a249ea4bb8: Status 404 returned error can't find the container with id 5328e7f4c9e8a6a341c9d7670554179ff046f6df556ea9ffd7ef53a249ea4bb8
Apr 20 07:16:40.048192 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:40.048134 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-966769f95-qcm9x" event={"ID":"a87a2c0d-8732-425f-a4cb-36f5fb3d4df6","Type":"ContainerStarted","Data":"5328e7f4c9e8a6a341c9d7670554179ff046f6df556ea9ffd7ef53a249ea4bb8"}
Apr 20 07:16:41.052161 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:41.052119 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-966769f95-qcm9x" event={"ID":"a87a2c0d-8732-425f-a4cb-36f5fb3d4df6","Type":"ContainerStarted","Data":"0533c6cb68a34285cb7f65e0a92472554a20afa9803bc3abe0fa1f4e46c0099b"}
Apr 20 07:16:41.052617 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:41.052240 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-966769f95-qcm9x"
Apr 20 07:16:41.070209 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:41.070164 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-966769f95-qcm9x" podStartSLOduration=1.64129049 podStartE2EDuration="3.070150422s" podCreationTimestamp="2026-04-20 07:16:38 +0000 UTC" firstStartedPulling="2026-04-20 07:16:39.411905692 +0000 UTC m=+834.231974006" lastFinishedPulling="2026-04-20 07:16:40.840765623 +0000 UTC m=+835.660833938" observedRunningTime="2026-04-20 07:16:41.068250547 +0000 UTC m=+835.888318884" watchObservedRunningTime="2026-04-20 07:16:41.070150422 +0000 UTC m=+835.890218781"
Apr 20 07:16:47.061233 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:47.061206 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-966769f95-qcm9x"
Apr 20 07:16:47.124367 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:47.124327 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-855c947695-hfl2t"]
Apr 20 07:16:47.124632 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:47.124583 2543 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-855c947695-hfl2t" podUID="d0205875-d5c7-4267-a18c-ba73e15ec183" containerName="maas-api" containerID="cri-o://1dd9ab788fb54d3d895dcffc22d25b4f43c6755be442eec7b8c7fc5862d82053" gracePeriod=30
Apr 20 07:16:47.387056 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:47.387029 2543 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-855c947695-hfl2t"
Apr 20 07:16:47.427484 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:47.427449 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt2dk\" (UniqueName: \"kubernetes.io/projected/d0205875-d5c7-4267-a18c-ba73e15ec183-kube-api-access-xt2dk\") pod \"d0205875-d5c7-4267-a18c-ba73e15ec183\" (UID: \"d0205875-d5c7-4267-a18c-ba73e15ec183\") "
Apr 20 07:16:47.427615 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:47.427519 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/d0205875-d5c7-4267-a18c-ba73e15ec183-maas-api-tls\") pod \"d0205875-d5c7-4267-a18c-ba73e15ec183\" (UID: \"d0205875-d5c7-4267-a18c-ba73e15ec183\") "
Apr 20 07:16:47.429535 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:47.429505 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0205875-d5c7-4267-a18c-ba73e15ec183-kube-api-access-xt2dk" (OuterVolumeSpecName: "kube-api-access-xt2dk") pod "d0205875-d5c7-4267-a18c-ba73e15ec183" (UID: "d0205875-d5c7-4267-a18c-ba73e15ec183"). InnerVolumeSpecName "kube-api-access-xt2dk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 20 07:16:47.429651 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:47.429555 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0205875-d5c7-4267-a18c-ba73e15ec183-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "d0205875-d5c7-4267-a18c-ba73e15ec183" (UID: "d0205875-d5c7-4267-a18c-ba73e15ec183"). InnerVolumeSpecName "maas-api-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:16:47.528840 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:47.528810 2543 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/d0205875-d5c7-4267-a18c-ba73e15ec183-maas-api-tls\") on node \"ip-10-0-130-105.ec2.internal\" DevicePath \"\"" Apr 20 07:16:47.528840 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:47.528835 2543 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xt2dk\" (UniqueName: \"kubernetes.io/projected/d0205875-d5c7-4267-a18c-ba73e15ec183-kube-api-access-xt2dk\") on node \"ip-10-0-130-105.ec2.internal\" DevicePath \"\"" Apr 20 07:16:48.074184 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:48.074103 2543 generic.go:358] "Generic (PLEG): container finished" podID="d0205875-d5c7-4267-a18c-ba73e15ec183" containerID="1dd9ab788fb54d3d895dcffc22d25b4f43c6755be442eec7b8c7fc5862d82053" exitCode=0 Apr 20 07:16:48.074184 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:48.074169 2543 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-855c947695-hfl2t" Apr 20 07:16:48.074184 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:48.074167 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-855c947695-hfl2t" event={"ID":"d0205875-d5c7-4267-a18c-ba73e15ec183","Type":"ContainerDied","Data":"1dd9ab788fb54d3d895dcffc22d25b4f43c6755be442eec7b8c7fc5862d82053"} Apr 20 07:16:48.074706 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:48.074212 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-855c947695-hfl2t" event={"ID":"d0205875-d5c7-4267-a18c-ba73e15ec183","Type":"ContainerDied","Data":"d587b1af3b6c084c94bc9fc2c213ae7144eb89fa150100ca65ebc63417780314"} Apr 20 07:16:48.074706 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:48.074232 2543 scope.go:117] "RemoveContainer" containerID="1dd9ab788fb54d3d895dcffc22d25b4f43c6755be442eec7b8c7fc5862d82053" Apr 20 07:16:48.081685 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:48.081668 2543 scope.go:117] "RemoveContainer" containerID="1dd9ab788fb54d3d895dcffc22d25b4f43c6755be442eec7b8c7fc5862d82053" Apr 20 07:16:48.081931 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:16:48.081914 2543 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1dd9ab788fb54d3d895dcffc22d25b4f43c6755be442eec7b8c7fc5862d82053\": container with ID starting with 1dd9ab788fb54d3d895dcffc22d25b4f43c6755be442eec7b8c7fc5862d82053 not found: ID does not exist" containerID="1dd9ab788fb54d3d895dcffc22d25b4f43c6755be442eec7b8c7fc5862d82053" Apr 20 07:16:48.081985 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:48.081939 2543 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1dd9ab788fb54d3d895dcffc22d25b4f43c6755be442eec7b8c7fc5862d82053"} err="failed to get container status \"1dd9ab788fb54d3d895dcffc22d25b4f43c6755be442eec7b8c7fc5862d82053\": rpc error: code = NotFound desc = 
could not find container \"1dd9ab788fb54d3d895dcffc22d25b4f43c6755be442eec7b8c7fc5862d82053\": container with ID starting with 1dd9ab788fb54d3d895dcffc22d25b4f43c6755be442eec7b8c7fc5862d82053 not found: ID does not exist" Apr 20 07:16:48.094321 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:48.094303 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-855c947695-hfl2t"] Apr 20 07:16:48.098056 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:48.098038 2543 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-855c947695-hfl2t"] Apr 20 07:16:49.060997 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.060965 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt"] Apr 20 07:16:49.061271 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.061259 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d0205875-d5c7-4267-a18c-ba73e15ec183" containerName="maas-api" Apr 20 07:16:49.061271 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.061272 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0205875-d5c7-4267-a18c-ba73e15ec183" containerName="maas-api" Apr 20 07:16:49.061356 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.061337 2543 memory_manager.go:356] "RemoveStaleState removing state" podUID="d0205875-d5c7-4267-a18c-ba73e15ec183" containerName="maas-api" Apr 20 07:16:49.065677 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.065661 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt" Apr 20 07:16:49.068164 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.068134 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\"" Apr 20 07:16:49.068953 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.068936 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\"" Apr 20 07:16:49.068953 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.068943 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 20 07:16:49.069095 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.069058 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-w5vjj\"" Apr 20 07:16:49.074742 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.074719 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt"] Apr 20 07:16:49.142127 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.142093 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpsxc\" (UniqueName: \"kubernetes.io/projected/9068c160-e349-48fd-be72-92fec44132c4-kube-api-access-jpsxc\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt\" (UID: \"9068c160-e349-48fd-be72-92fec44132c4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt" Apr 20 07:16:49.142271 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.142152 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9068c160-e349-48fd-be72-92fec44132c4-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt\" (UID: \"9068c160-e349-48fd-be72-92fec44132c4\") " 
pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt" Apr 20 07:16:49.142334 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.142275 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9068c160-e349-48fd-be72-92fec44132c4-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt\" (UID: \"9068c160-e349-48fd-be72-92fec44132c4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt" Apr 20 07:16:49.142381 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.142335 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9068c160-e349-48fd-be72-92fec44132c4-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt\" (UID: \"9068c160-e349-48fd-be72-92fec44132c4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt" Apr 20 07:16:49.142441 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.142418 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9068c160-e349-48fd-be72-92fec44132c4-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt\" (UID: \"9068c160-e349-48fd-be72-92fec44132c4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt" Apr 20 07:16:49.142490 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.142470 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9068c160-e349-48fd-be72-92fec44132c4-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt\" (UID: \"9068c160-e349-48fd-be72-92fec44132c4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt" Apr 20 07:16:49.243726 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.243688 2543 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9068c160-e349-48fd-be72-92fec44132c4-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt\" (UID: \"9068c160-e349-48fd-be72-92fec44132c4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt" Apr 20 07:16:49.243726 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.243726 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9068c160-e349-48fd-be72-92fec44132c4-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt\" (UID: \"9068c160-e349-48fd-be72-92fec44132c4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt" Apr 20 07:16:49.243936 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.243763 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9068c160-e349-48fd-be72-92fec44132c4-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt\" (UID: \"9068c160-e349-48fd-be72-92fec44132c4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt" Apr 20 07:16:49.243936 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.243815 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jpsxc\" (UniqueName: \"kubernetes.io/projected/9068c160-e349-48fd-be72-92fec44132c4-kube-api-access-jpsxc\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt\" (UID: \"9068c160-e349-48fd-be72-92fec44132c4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt" Apr 20 07:16:49.243936 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.243858 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9068c160-e349-48fd-be72-92fec44132c4-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt\" (UID: \"9068c160-e349-48fd-be72-92fec44132c4\") " 
pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt" Apr 20 07:16:49.243936 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.243905 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9068c160-e349-48fd-be72-92fec44132c4-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt\" (UID: \"9068c160-e349-48fd-be72-92fec44132c4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt" Apr 20 07:16:49.244120 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.244079 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9068c160-e349-48fd-be72-92fec44132c4-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt\" (UID: \"9068c160-e349-48fd-be72-92fec44132c4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt" Apr 20 07:16:49.244205 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.244184 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9068c160-e349-48fd-be72-92fec44132c4-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt\" (UID: \"9068c160-e349-48fd-be72-92fec44132c4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt" Apr 20 07:16:49.244262 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.244245 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9068c160-e349-48fd-be72-92fec44132c4-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt\" (UID: \"9068c160-e349-48fd-be72-92fec44132c4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt" Apr 20 07:16:49.246098 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.246069 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/9068c160-e349-48fd-be72-92fec44132c4-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt\" (UID: \"9068c160-e349-48fd-be72-92fec44132c4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt" Apr 20 07:16:49.246310 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.246293 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9068c160-e349-48fd-be72-92fec44132c4-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt\" (UID: \"9068c160-e349-48fd-be72-92fec44132c4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt" Apr 20 07:16:49.251305 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.251281 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpsxc\" (UniqueName: \"kubernetes.io/projected/9068c160-e349-48fd-be72-92fec44132c4-kube-api-access-jpsxc\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt\" (UID: \"9068c160-e349-48fd-be72-92fec44132c4\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt" Apr 20 07:16:49.375767 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.375672 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt" Apr 20 07:16:49.501140 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.501111 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt"] Apr 20 07:16:49.504017 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:16:49.503992 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9068c160_e349_48fd_be72_92fec44132c4.slice/crio-6b8e0091982a10396a72e2c492acd6743b01d2e1cfe996738d3e113461b97c76 WatchSource:0}: Error finding container 6b8e0091982a10396a72e2c492acd6743b01d2e1cfe996738d3e113461b97c76: Status 404 returned error can't find the container with id 6b8e0091982a10396a72e2c492acd6743b01d2e1cfe996738d3e113461b97c76 Apr 20 07:16:49.702627 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:49.702595 2543 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0205875-d5c7-4267-a18c-ba73e15ec183" path="/var/lib/kubelet/pods/d0205875-d5c7-4267-a18c-ba73e15ec183/volumes" Apr 20 07:16:50.083213 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:50.083122 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt" event={"ID":"9068c160-e349-48fd-be72-92fec44132c4","Type":"ContainerStarted","Data":"6b8e0091982a10396a72e2c492acd6743b01d2e1cfe996738d3e113461b97c76"} Apr 20 07:16:55.104965 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:16:55.104879 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt" event={"ID":"9068c160-e349-48fd-be72-92fec44132c4","Type":"ContainerStarted","Data":"725e689d6e2c5b62f26861a478328188609bfbf1c413b094835061818f635279"} Apr 20 07:17:01.125013 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:01.124981 2543 generic.go:358] "Generic (PLEG): container finished" podID="9068c160-e349-48fd-be72-92fec44132c4" 
containerID="725e689d6e2c5b62f26861a478328188609bfbf1c413b094835061818f635279" exitCode=0 Apr 20 07:17:01.125394 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:01.125068 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt" event={"ID":"9068c160-e349-48fd-be72-92fec44132c4","Type":"ContainerDied","Data":"725e689d6e2c5b62f26861a478328188609bfbf1c413b094835061818f635279"} Apr 20 07:17:03.134390 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:03.134358 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt" event={"ID":"9068c160-e349-48fd-be72-92fec44132c4","Type":"ContainerStarted","Data":"3a9e558c6f94f5e9929aebf09a06b59974654fc9de92380bcaab869dd917d61a"} Apr 20 07:17:03.134817 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:03.134583 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt" Apr 20 07:17:03.154606 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:03.154551 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt" podStartSLOduration=1.4130998 podStartE2EDuration="14.154534541s" podCreationTimestamp="2026-04-20 07:16:49 +0000 UTC" firstStartedPulling="2026-04-20 07:16:49.505802378 +0000 UTC m=+844.325870706" lastFinishedPulling="2026-04-20 07:17:02.24723713 +0000 UTC m=+857.067305447" observedRunningTime="2026-04-20 07:17:03.153525235 +0000 UTC m=+857.973593574" watchObservedRunningTime="2026-04-20 07:17:03.154534541 +0000 UTC m=+857.974602879" Apr 20 07:17:14.150141 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:14.150110 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt" Apr 20 07:17:22.667062 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:22.667030 2543 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8"] Apr 20 07:17:22.702743 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:22.702716 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8"] Apr 20 07:17:22.702883 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:22.702833 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8" Apr 20 07:17:22.706001 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:22.705974 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\"" Apr 20 07:17:22.730707 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:22.730681 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9f3d2b06-a8b6-44f5-a586-703555452b7b-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8\" (UID: \"9f3d2b06-a8b6-44f5-a586-703555452b7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8" Apr 20 07:17:22.730833 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:22.730712 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f3d2b06-a8b6-44f5-a586-703555452b7b-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8\" (UID: \"9f3d2b06-a8b6-44f5-a586-703555452b7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8" Apr 20 07:17:22.730833 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:22.730736 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xjdzc\" (UniqueName: \"kubernetes.io/projected/9f3d2b06-a8b6-44f5-a586-703555452b7b-kube-api-access-xjdzc\") pod 
\"facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8\" (UID: \"9f3d2b06-a8b6-44f5-a586-703555452b7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8" Apr 20 07:17:22.730833 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:22.730782 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9f3d2b06-a8b6-44f5-a586-703555452b7b-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8\" (UID: \"9f3d2b06-a8b6-44f5-a586-703555452b7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8" Apr 20 07:17:22.731001 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:22.730897 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f3d2b06-a8b6-44f5-a586-703555452b7b-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8\" (UID: \"9f3d2b06-a8b6-44f5-a586-703555452b7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8" Apr 20 07:17:22.731001 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:22.730946 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f3d2b06-a8b6-44f5-a586-703555452b7b-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8\" (UID: \"9f3d2b06-a8b6-44f5-a586-703555452b7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8" Apr 20 07:17:22.832018 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:22.831986 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9f3d2b06-a8b6-44f5-a586-703555452b7b-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8\" (UID: \"9f3d2b06-a8b6-44f5-a586-703555452b7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8" Apr 20 07:17:22.832143 ip-10-0-130-105 
kubenswrapper[2543]: I0420 07:17:22.832022 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f3d2b06-a8b6-44f5-a586-703555452b7b-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8\" (UID: \"9f3d2b06-a8b6-44f5-a586-703555452b7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8" Apr 20 07:17:22.832143 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:22.832045 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xjdzc\" (UniqueName: \"kubernetes.io/projected/9f3d2b06-a8b6-44f5-a586-703555452b7b-kube-api-access-xjdzc\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8\" (UID: \"9f3d2b06-a8b6-44f5-a586-703555452b7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8" Apr 20 07:17:22.832143 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:22.832078 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9f3d2b06-a8b6-44f5-a586-703555452b7b-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8\" (UID: \"9f3d2b06-a8b6-44f5-a586-703555452b7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8" Apr 20 07:17:22.832303 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:22.832155 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f3d2b06-a8b6-44f5-a586-703555452b7b-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8\" (UID: \"9f3d2b06-a8b6-44f5-a586-703555452b7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8" Apr 20 07:17:22.832303 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:22.832188 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/9f3d2b06-a8b6-44f5-a586-703555452b7b-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8\" (UID: \"9f3d2b06-a8b6-44f5-a586-703555452b7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8" Apr 20 07:17:22.832480 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:22.832453 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9f3d2b06-a8b6-44f5-a586-703555452b7b-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8\" (UID: \"9f3d2b06-a8b6-44f5-a586-703555452b7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8" Apr 20 07:17:22.832554 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:22.832496 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9f3d2b06-a8b6-44f5-a586-703555452b7b-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8\" (UID: \"9f3d2b06-a8b6-44f5-a586-703555452b7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8" Apr 20 07:17:22.832554 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:22.832544 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9f3d2b06-a8b6-44f5-a586-703555452b7b-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8\" (UID: \"9f3d2b06-a8b6-44f5-a586-703555452b7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8" Apr 20 07:17:22.834124 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:22.834104 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9f3d2b06-a8b6-44f5-a586-703555452b7b-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8\" (UID: \"9f3d2b06-a8b6-44f5-a586-703555452b7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8" Apr 20 07:17:22.834462 
ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:22.834446 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9f3d2b06-a8b6-44f5-a586-703555452b7b-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8\" (UID: \"9f3d2b06-a8b6-44f5-a586-703555452b7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8" Apr 20 07:17:22.839968 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:22.839946 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xjdzc\" (UniqueName: \"kubernetes.io/projected/9f3d2b06-a8b6-44f5-a586-703555452b7b-kube-api-access-xjdzc\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8\" (UID: \"9f3d2b06-a8b6-44f5-a586-703555452b7b\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8" Apr 20 07:17:23.034755 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:23.034720 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8" Apr 20 07:17:23.155257 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:23.155233 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8"] Apr 20 07:17:23.157831 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:17:23.157802 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f3d2b06_a8b6_44f5_a586_703555452b7b.slice/crio-786a81cf7d446417fe3b1d891506fbc2d9e5e958249db24536243482b975ec8e WatchSource:0}: Error finding container 786a81cf7d446417fe3b1d891506fbc2d9e5e958249db24536243482b975ec8e: Status 404 returned error can't find the container with id 786a81cf7d446417fe3b1d891506fbc2d9e5e958249db24536243482b975ec8e Apr 20 07:17:23.199891 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:23.199863 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8" event={"ID":"9f3d2b06-a8b6-44f5-a586-703555452b7b","Type":"ContainerStarted","Data":"786a81cf7d446417fe3b1d891506fbc2d9e5e958249db24536243482b975ec8e"} Apr 20 07:17:24.205111 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:24.205072 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8" event={"ID":"9f3d2b06-a8b6-44f5-a586-703555452b7b","Type":"ContainerStarted","Data":"cfc2c892ea87c0b12f48129db086badbbff8d8dde328714954d7c94c81464890"} Apr 20 07:17:25.015154 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:25.015117 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk"] Apr 20 07:17:25.018511 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:25.018495 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk" Apr 20 07:17:25.024229 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:25.024199 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\"" Apr 20 07:17:25.032512 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:25.032490 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk"] Apr 20 07:17:25.051709 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:25.051336 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/11decf5d-7437-44a1-9035-28888cd5e734-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk\" (UID: \"11decf5d-7437-44a1-9035-28888cd5e734\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk" Apr 20 07:17:25.051709 ip-10-0-130-105 kubenswrapper[2543]: I0420 
07:17:25.051400 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/11decf5d-7437-44a1-9035-28888cd5e734-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk\" (UID: \"11decf5d-7437-44a1-9035-28888cd5e734\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk" Apr 20 07:17:25.051709 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:25.051446 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2sfk\" (UniqueName: \"kubernetes.io/projected/11decf5d-7437-44a1-9035-28888cd5e734-kube-api-access-w2sfk\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk\" (UID: \"11decf5d-7437-44a1-9035-28888cd5e734\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk" Apr 20 07:17:25.051709 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:25.051542 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/11decf5d-7437-44a1-9035-28888cd5e734-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk\" (UID: \"11decf5d-7437-44a1-9035-28888cd5e734\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk" Apr 20 07:17:25.051709 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:25.051578 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11decf5d-7437-44a1-9035-28888cd5e734-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk\" (UID: \"11decf5d-7437-44a1-9035-28888cd5e734\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk" Apr 20 07:17:25.051709 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:25.051627 2543 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/11decf5d-7437-44a1-9035-28888cd5e734-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk\" (UID: \"11decf5d-7437-44a1-9035-28888cd5e734\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk" Apr 20 07:17:25.152592 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:25.152559 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/11decf5d-7437-44a1-9035-28888cd5e734-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk\" (UID: \"11decf5d-7437-44a1-9035-28888cd5e734\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk" Apr 20 07:17:25.152592 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:25.152599 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11decf5d-7437-44a1-9035-28888cd5e734-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk\" (UID: \"11decf5d-7437-44a1-9035-28888cd5e734\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk" Apr 20 07:17:25.152946 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:25.152625 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/11decf5d-7437-44a1-9035-28888cd5e734-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk\" (UID: \"11decf5d-7437-44a1-9035-28888cd5e734\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk" Apr 20 07:17:25.152946 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:25.152699 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" 
(UniqueName: \"kubernetes.io/empty-dir/11decf5d-7437-44a1-9035-28888cd5e734-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk\" (UID: \"11decf5d-7437-44a1-9035-28888cd5e734\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk" Apr 20 07:17:25.152946 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:25.152732 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/11decf5d-7437-44a1-9035-28888cd5e734-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk\" (UID: \"11decf5d-7437-44a1-9035-28888cd5e734\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk" Apr 20 07:17:25.152946 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:25.152769 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w2sfk\" (UniqueName: \"kubernetes.io/projected/11decf5d-7437-44a1-9035-28888cd5e734-kube-api-access-w2sfk\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk\" (UID: \"11decf5d-7437-44a1-9035-28888cd5e734\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk" Apr 20 07:17:25.153220 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:25.153168 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/11decf5d-7437-44a1-9035-28888cd5e734-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk\" (UID: \"11decf5d-7437-44a1-9035-28888cd5e734\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk" Apr 20 07:17:25.153220 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:25.153202 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/11decf5d-7437-44a1-9035-28888cd5e734-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk\" 
(UID: \"11decf5d-7437-44a1-9035-28888cd5e734\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk" Apr 20 07:17:25.153416 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:25.153393 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/11decf5d-7437-44a1-9035-28888cd5e734-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk\" (UID: \"11decf5d-7437-44a1-9035-28888cd5e734\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk" Apr 20 07:17:25.155068 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:25.155043 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/11decf5d-7437-44a1-9035-28888cd5e734-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk\" (UID: \"11decf5d-7437-44a1-9035-28888cd5e734\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk" Apr 20 07:17:25.155315 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:25.155297 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/11decf5d-7437-44a1-9035-28888cd5e734-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk\" (UID: \"11decf5d-7437-44a1-9035-28888cd5e734\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk" Apr 20 07:17:25.166570 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:25.166543 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w2sfk\" (UniqueName: \"kubernetes.io/projected/11decf5d-7437-44a1-9035-28888cd5e734-kube-api-access-w2sfk\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk\" (UID: \"11decf5d-7437-44a1-9035-28888cd5e734\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk" Apr 20 
07:17:25.328601 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:25.328513 2543 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk" Apr 20 07:17:25.455250 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:25.455222 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk"] Apr 20 07:17:25.457126 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:17:25.457088 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11decf5d_7437_44a1_9035_28888cd5e734.slice/crio-1bfbb439d1f1b5ccdba418f8c2c2e3b19ea2697fb6b0df465cbf1fdbb6edbf61 WatchSource:0}: Error finding container 1bfbb439d1f1b5ccdba418f8c2c2e3b19ea2697fb6b0df465cbf1fdbb6edbf61: Status 404 returned error can't find the container with id 1bfbb439d1f1b5ccdba418f8c2c2e3b19ea2697fb6b0df465cbf1fdbb6edbf61 Apr 20 07:17:26.214169 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:26.214128 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk" event={"ID":"11decf5d-7437-44a1-9035-28888cd5e734","Type":"ContainerStarted","Data":"596278cd6048c7392a5ae07a8c5bcc798ec9aab164554baee77e086324bea649"} Apr 20 07:17:26.214169 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:26.214174 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk" event={"ID":"11decf5d-7437-44a1-9035-28888cd5e734","Type":"ContainerStarted","Data":"1bfbb439d1f1b5ccdba418f8c2c2e3b19ea2697fb6b0df465cbf1fdbb6edbf61"} Apr 20 07:17:29.230855 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:29.230819 2543 generic.go:358] "Generic (PLEG): container finished" podID="9f3d2b06-a8b6-44f5-a586-703555452b7b" containerID="cfc2c892ea87c0b12f48129db086badbbff8d8dde328714954d7c94c81464890" exitCode=0 Apr 
20 07:17:29.231292 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:29.230868 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8" event={"ID":"9f3d2b06-a8b6-44f5-a586-703555452b7b","Type":"ContainerDied","Data":"cfc2c892ea87c0b12f48129db086badbbff8d8dde328714954d7c94c81464890"} Apr 20 07:17:29.231454 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:29.231437 2543 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 07:17:30.236240 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:30.236208 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8" event={"ID":"9f3d2b06-a8b6-44f5-a586-703555452b7b","Type":"ContainerStarted","Data":"4aed1320d415cfad07ec732694586e922a342ad1de18f886769f13b916b995ab"} Apr 20 07:17:30.236581 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:30.236425 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8" Apr 20 07:17:30.259002 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:30.258942 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8" podStartSLOduration=8.015335052 podStartE2EDuration="8.258926362s" podCreationTimestamp="2026-04-20 07:17:22 +0000 UTC" firstStartedPulling="2026-04-20 07:17:29.231551164 +0000 UTC m=+884.051619479" lastFinishedPulling="2026-04-20 07:17:29.475142461 +0000 UTC m=+884.295210789" observedRunningTime="2026-04-20 07:17:30.258022296 +0000 UTC m=+885.078090632" watchObservedRunningTime="2026-04-20 07:17:30.258926362 +0000 UTC m=+885.078994700" Apr 20 07:17:31.240525 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:31.240493 2543 generic.go:358] "Generic (PLEG): container finished" podID="11decf5d-7437-44a1-9035-28888cd5e734" 
containerID="596278cd6048c7392a5ae07a8c5bcc798ec9aab164554baee77e086324bea649" exitCode=0 Apr 20 07:17:31.240921 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:31.240565 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk" event={"ID":"11decf5d-7437-44a1-9035-28888cd5e734","Type":"ContainerDied","Data":"596278cd6048c7392a5ae07a8c5bcc798ec9aab164554baee77e086324bea649"} Apr 20 07:17:32.244769 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:32.244736 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk" event={"ID":"11decf5d-7437-44a1-9035-28888cd5e734","Type":"ContainerStarted","Data":"70a0d20cdb1184cc68c7693817ef35d8bb6a469b04ba1a9f6d79b0bf8e8824be"} Apr 20 07:17:32.245148 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:32.244944 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk" Apr 20 07:17:32.264584 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:32.264544 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk" podStartSLOduration=8.078587804 podStartE2EDuration="8.264530752s" podCreationTimestamp="2026-04-20 07:17:24 +0000 UTC" firstStartedPulling="2026-04-20 07:17:31.241275376 +0000 UTC m=+886.061343692" lastFinishedPulling="2026-04-20 07:17:31.427218323 +0000 UTC m=+886.247286640" observedRunningTime="2026-04-20 07:17:32.263340299 +0000 UTC m=+887.083408636" watchObservedRunningTime="2026-04-20 07:17:32.264530752 +0000 UTC m=+887.084599085" Apr 20 07:17:41.254343 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:41.254305 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8" Apr 20 07:17:43.260695 ip-10-0-130-105 kubenswrapper[2543]: I0420 
07:17:43.260662 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk" Apr 20 07:17:46.870875 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:46.870846 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/ovn-acl-logging/0.log" Apr 20 07:17:46.872034 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:17:46.872013 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/ovn-acl-logging/0.log" Apr 20 07:18:15.768466 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:18:15.768421 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-745b8f576-lkbm6"] Apr 20 07:18:15.768966 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:18:15.768695 2543 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-745b8f576-lkbm6" podUID="c8cdec68-d7cf-492f-ae68-ed8c1f894f8a" containerName="authorino" containerID="cri-o://f55b3678e9e34424536f677facaaba2d7dc32e86c5de9eb96408e76cf19c59ee" gracePeriod=30 Apr 20 07:18:16.013947 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:18:16.013917 2543 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-745b8f576-lkbm6" Apr 20 07:18:16.083097 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:18:16.083019 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/c8cdec68-d7cf-492f-ae68-ed8c1f894f8a-tls-cert\") pod \"c8cdec68-d7cf-492f-ae68-ed8c1f894f8a\" (UID: \"c8cdec68-d7cf-492f-ae68-ed8c1f894f8a\") " Apr 20 07:18:16.083097 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:18:16.083079 2543 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tc2k\" (UniqueName: \"kubernetes.io/projected/c8cdec68-d7cf-492f-ae68-ed8c1f894f8a-kube-api-access-7tc2k\") pod \"c8cdec68-d7cf-492f-ae68-ed8c1f894f8a\" (UID: \"c8cdec68-d7cf-492f-ae68-ed8c1f894f8a\") " Apr 20 07:18:16.085117 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:18:16.085085 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8cdec68-d7cf-492f-ae68-ed8c1f894f8a-kube-api-access-7tc2k" (OuterVolumeSpecName: "kube-api-access-7tc2k") pod "c8cdec68-d7cf-492f-ae68-ed8c1f894f8a" (UID: "c8cdec68-d7cf-492f-ae68-ed8c1f894f8a"). InnerVolumeSpecName "kube-api-access-7tc2k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 20 07:18:16.094058 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:18:16.094029 2543 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c8cdec68-d7cf-492f-ae68-ed8c1f894f8a-tls-cert" (OuterVolumeSpecName: "tls-cert") pod "c8cdec68-d7cf-492f-ae68-ed8c1f894f8a" (UID: "c8cdec68-d7cf-492f-ae68-ed8c1f894f8a"). InnerVolumeSpecName "tls-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 20 07:18:16.185929 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:18:16.184335 2543 reconciler_common.go:299] "Volume detached for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/c8cdec68-d7cf-492f-ae68-ed8c1f894f8a-tls-cert\") on node \"ip-10-0-130-105.ec2.internal\" DevicePath \"\"" Apr 20 07:18:16.185929 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:18:16.184375 2543 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7tc2k\" (UniqueName: \"kubernetes.io/projected/c8cdec68-d7cf-492f-ae68-ed8c1f894f8a-kube-api-access-7tc2k\") on node \"ip-10-0-130-105.ec2.internal\" DevicePath \"\"" Apr 20 07:18:16.384054 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:18:16.383968 2543 generic.go:358] "Generic (PLEG): container finished" podID="c8cdec68-d7cf-492f-ae68-ed8c1f894f8a" containerID="f55b3678e9e34424536f677facaaba2d7dc32e86c5de9eb96408e76cf19c59ee" exitCode=0 Apr 20 07:18:16.384054 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:18:16.384035 2543 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-745b8f576-lkbm6" Apr 20 07:18:16.384222 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:18:16.384061 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-745b8f576-lkbm6" event={"ID":"c8cdec68-d7cf-492f-ae68-ed8c1f894f8a","Type":"ContainerDied","Data":"f55b3678e9e34424536f677facaaba2d7dc32e86c5de9eb96408e76cf19c59ee"} Apr 20 07:18:16.384222 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:18:16.384095 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-745b8f576-lkbm6" event={"ID":"c8cdec68-d7cf-492f-ae68-ed8c1f894f8a","Type":"ContainerDied","Data":"eadaf367546a1907252ec8a1fe773c160daa94c81cf78641a1c0c9f106b4f1e9"} Apr 20 07:18:16.384222 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:18:16.384109 2543 scope.go:117] "RemoveContainer" containerID="f55b3678e9e34424536f677facaaba2d7dc32e86c5de9eb96408e76cf19c59ee" Apr 20 07:18:16.392635 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:18:16.392620 2543 scope.go:117] "RemoveContainer" containerID="f55b3678e9e34424536f677facaaba2d7dc32e86c5de9eb96408e76cf19c59ee" Apr 20 07:18:16.392923 ip-10-0-130-105 kubenswrapper[2543]: E0420 07:18:16.392903 2543 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f55b3678e9e34424536f677facaaba2d7dc32e86c5de9eb96408e76cf19c59ee\": container with ID starting with f55b3678e9e34424536f677facaaba2d7dc32e86c5de9eb96408e76cf19c59ee not found: ID does not exist" containerID="f55b3678e9e34424536f677facaaba2d7dc32e86c5de9eb96408e76cf19c59ee" Apr 20 07:18:16.392977 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:18:16.392932 2543 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f55b3678e9e34424536f677facaaba2d7dc32e86c5de9eb96408e76cf19c59ee"} err="failed to get container status \"f55b3678e9e34424536f677facaaba2d7dc32e86c5de9eb96408e76cf19c59ee\": rpc error: code = 
NotFound desc = could not find container \"f55b3678e9e34424536f677facaaba2d7dc32e86c5de9eb96408e76cf19c59ee\": container with ID starting with f55b3678e9e34424536f677facaaba2d7dc32e86c5de9eb96408e76cf19c59ee not found: ID does not exist" Apr 20 07:18:16.414981 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:18:16.414957 2543 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-745b8f576-lkbm6"] Apr 20 07:18:16.421745 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:18:16.421726 2543 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-745b8f576-lkbm6"] Apr 20 07:18:17.701329 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:18:17.701290 2543 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8cdec68-d7cf-492f-ae68-ed8c1f894f8a" path="/var/lib/kubelet/pods/c8cdec68-d7cf-492f-ae68-ed8c1f894f8a/volumes" Apr 20 07:22:46.891726 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:22:46.891695 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/ovn-acl-logging/0.log" Apr 20 07:22:46.895950 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:22:46.895930 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/ovn-acl-logging/0.log" Apr 20 07:27:46.910758 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:27:46.910731 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/ovn-acl-logging/0.log" Apr 20 07:27:46.916080 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:27:46.916057 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/ovn-acl-logging/0.log" Apr 20 07:32:46.931339 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:32:46.931311 2543 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/ovn-acl-logging/0.log" Apr 20 07:32:46.937678 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:32:46.937657 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/ovn-acl-logging/0.log" Apr 20 07:37:46.951527 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:37:46.951495 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/ovn-acl-logging/0.log" Apr 20 07:37:46.959101 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:37:46.959078 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/ovn-acl-logging/0.log" Apr 20 07:40:40.994213 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:40:40.994181 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-966769f95-qcm9x_a87a2c0d-8732-425f-a4cb-36f5fb3d4df6/maas-api/0.log" Apr 20 07:40:41.454087 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:40:41.454053 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6d65d76454-l6j6d_35209c86-3a05-4072-8e51-0acbba6419bd/manager/0.log" Apr 20 07:40:42.962678 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:40:42.962649 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-ktkdj_9d93355e-6deb-4c79-9e36-3804fd5e950a/manager/0.log" Apr 20 07:40:43.278006 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:40:43.277932 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-h6lrs_ab57ebb1-9bf6-42c6-b548-9656356e12f3/registry-server/0.log" Apr 20 07:40:43.928352 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:40:43.928321 2543 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557fj4rwf_f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed/istio-proxy/0.log" Apr 20 07:40:44.351028 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:40:44.350980 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-wvtr4_73906400-114f-406d-aa5d-b27617fa0457/istio-proxy/0.log" Apr 20 07:40:44.773319 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:40:44.773293 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt_9068c160-e349-48fd-be72-92fec44132c4/main/0.log" Apr 20 07:40:44.779413 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:40:44.779390 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-kw9vt_9068c160-e349-48fd-be72-92fec44132c4/storage-initializer/0.log" Apr 20 07:40:45.117732 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:40:45.117656 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk_11decf5d-7437-44a1-9035-28888cd5e734/storage-initializer/0.log" Apr 20 07:40:45.124001 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:40:45.123983 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdccztqkk_11decf5d-7437-44a1-9035-28888cd5e734/main/0.log" Apr 20 07:40:45.228467 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:40:45.228442 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8_9f3d2b06-a8b6-44f5-a586-703555452b7b/storage-initializer/0.log" Apr 20 07:40:45.234934 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:40:45.234917 2543 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-h5kw8_9f3d2b06-a8b6-44f5-a586-703555452b7b/main/0.log" Apr 20 07:40:51.911610 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:40:51.911579 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-9v756_58183227-95cc-42f1-95e7-dc40dc8bf51d/global-pull-secret-syncer/0.log" Apr 20 07:40:52.042181 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:40:52.042154 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-cqsc9_9b6dda66-f1f4-4fee-8eb5-1f6447d2d431/konnectivity-agent/0.log" Apr 20 07:40:52.103805 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:40:52.103779 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-130-105.ec2.internal_486c974e6666e028518e848e7f47b828/haproxy/0.log" Apr 20 07:40:56.145275 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:40:56.145245 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-ktkdj_9d93355e-6deb-4c79-9e36-3804fd5e950a/manager/0.log" Apr 20 07:40:56.252527 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:40:56.252487 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-catalog-h6lrs_ab57ebb1-9bf6-42c6-b548-9656356e12f3/registry-server/0.log" Apr 20 07:40:58.342985 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:40:58.342954 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2qnqf_b1c3d808-af41-473f-beb5-7fb97e343128/node-exporter/0.log" Apr 20 07:40:58.363100 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:40:58.363072 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-2qnqf_b1c3d808-af41-473f-beb5-7fb97e343128/kube-rbac-proxy/0.log" Apr 20 07:40:58.385003 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:40:58.384981 2543 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-monitoring_node-exporter-2qnqf_b1c3d808-af41-473f-beb5-7fb97e343128/init-textfile/0.log" Apr 20 07:41:00.743525 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:00.743491 2543 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xfwg2/perf-node-gather-daemonset-7rtl7"] Apr 20 07:41:00.743919 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:00.743835 2543 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c8cdec68-d7cf-492f-ae68-ed8c1f894f8a" containerName="authorino" Apr 20 07:41:00.743919 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:00.743846 2543 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8cdec68-d7cf-492f-ae68-ed8c1f894f8a" containerName="authorino" Apr 20 07:41:00.743919 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:00.743900 2543 memory_manager.go:356] "RemoveStaleState removing state" podUID="c8cdec68-d7cf-492f-ae68-ed8c1f894f8a" containerName="authorino" Apr 20 07:41:00.746911 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:00.746892 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-7rtl7" Apr 20 07:41:00.749123 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:00.749098 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xfwg2\"/\"openshift-service-ca.crt\"" Apr 20 07:41:00.749899 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:00.749881 2543 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xfwg2\"/\"default-dockercfg-df242\"" Apr 20 07:41:00.749996 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:00.749917 2543 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xfwg2\"/\"kube-root-ca.crt\"" Apr 20 07:41:00.753624 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:00.753532 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xfwg2/perf-node-gather-daemonset-7rtl7"] Apr 20 07:41:00.853125 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:00.853092 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d5ce935a-2a69-4d13-ab13-6a13fac51097-proc\") pod \"perf-node-gather-daemonset-7rtl7\" (UID: \"d5ce935a-2a69-4d13-ab13-6a13fac51097\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-7rtl7" Apr 20 07:41:00.853258 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:00.853133 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d5ce935a-2a69-4d13-ab13-6a13fac51097-sys\") pod \"perf-node-gather-daemonset-7rtl7\" (UID: \"d5ce935a-2a69-4d13-ab13-6a13fac51097\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-7rtl7" Apr 20 07:41:00.853258 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:00.853166 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d5ce935a-2a69-4d13-ab13-6a13fac51097-podres\") pod \"perf-node-gather-daemonset-7rtl7\" (UID: \"d5ce935a-2a69-4d13-ab13-6a13fac51097\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-7rtl7" Apr 20 07:41:00.853336 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:00.853258 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d5ce935a-2a69-4d13-ab13-6a13fac51097-lib-modules\") pod \"perf-node-gather-daemonset-7rtl7\" (UID: \"d5ce935a-2a69-4d13-ab13-6a13fac51097\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-7rtl7" Apr 20 07:41:00.853336 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:00.853287 2543 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g24dd\" (UniqueName: \"kubernetes.io/projected/d5ce935a-2a69-4d13-ab13-6a13fac51097-kube-api-access-g24dd\") pod \"perf-node-gather-daemonset-7rtl7\" (UID: \"d5ce935a-2a69-4d13-ab13-6a13fac51097\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-7rtl7" Apr 20 07:41:00.954589 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:00.954561 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d5ce935a-2a69-4d13-ab13-6a13fac51097-proc\") pod \"perf-node-gather-daemonset-7rtl7\" (UID: \"d5ce935a-2a69-4d13-ab13-6a13fac51097\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-7rtl7" Apr 20 07:41:00.954730 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:00.954598 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d5ce935a-2a69-4d13-ab13-6a13fac51097-sys\") pod \"perf-node-gather-daemonset-7rtl7\" (UID: \"d5ce935a-2a69-4d13-ab13-6a13fac51097\") " 
pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-7rtl7" Apr 20 07:41:00.954730 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:00.954659 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d5ce935a-2a69-4d13-ab13-6a13fac51097-podres\") pod \"perf-node-gather-daemonset-7rtl7\" (UID: \"d5ce935a-2a69-4d13-ab13-6a13fac51097\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-7rtl7" Apr 20 07:41:00.954730 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:00.954701 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/d5ce935a-2a69-4d13-ab13-6a13fac51097-proc\") pod \"perf-node-gather-daemonset-7rtl7\" (UID: \"d5ce935a-2a69-4d13-ab13-6a13fac51097\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-7rtl7" Apr 20 07:41:00.954730 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:00.954712 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d5ce935a-2a69-4d13-ab13-6a13fac51097-lib-modules\") pod \"perf-node-gather-daemonset-7rtl7\" (UID: \"d5ce935a-2a69-4d13-ab13-6a13fac51097\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-7rtl7" Apr 20 07:41:00.954864 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:00.954737 2543 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g24dd\" (UniqueName: \"kubernetes.io/projected/d5ce935a-2a69-4d13-ab13-6a13fac51097-kube-api-access-g24dd\") pod \"perf-node-gather-daemonset-7rtl7\" (UID: \"d5ce935a-2a69-4d13-ab13-6a13fac51097\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-7rtl7" Apr 20 07:41:00.954864 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:00.954753 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/d5ce935a-2a69-4d13-ab13-6a13fac51097-sys\") pod \"perf-node-gather-daemonset-7rtl7\" (UID: \"d5ce935a-2a69-4d13-ab13-6a13fac51097\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-7rtl7" Apr 20 07:41:00.954864 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:00.954809 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/d5ce935a-2a69-4d13-ab13-6a13fac51097-podres\") pod \"perf-node-gather-daemonset-7rtl7\" (UID: \"d5ce935a-2a69-4d13-ab13-6a13fac51097\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-7rtl7" Apr 20 07:41:00.954956 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:00.954870 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/d5ce935a-2a69-4d13-ab13-6a13fac51097-lib-modules\") pod \"perf-node-gather-daemonset-7rtl7\" (UID: \"d5ce935a-2a69-4d13-ab13-6a13fac51097\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-7rtl7" Apr 20 07:41:00.962886 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:00.962857 2543 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g24dd\" (UniqueName: \"kubernetes.io/projected/d5ce935a-2a69-4d13-ab13-6a13fac51097-kube-api-access-g24dd\") pod \"perf-node-gather-daemonset-7rtl7\" (UID: \"d5ce935a-2a69-4d13-ab13-6a13fac51097\") " pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-7rtl7" Apr 20 07:41:01.056696 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:01.056597 2543 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-7rtl7" Apr 20 07:41:01.174129 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:01.174102 2543 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xfwg2/perf-node-gather-daemonset-7rtl7"] Apr 20 07:41:01.176993 ip-10-0-130-105 kubenswrapper[2543]: W0420 07:41:01.176965 2543 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podd5ce935a_2a69_4d13_ab13_6a13fac51097.slice/crio-affc1f3266fdd3a22a30fd2b195029bd5929bc90f84bd2d459eba873863a6ba7 WatchSource:0}: Error finding container affc1f3266fdd3a22a30fd2b195029bd5929bc90f84bd2d459eba873863a6ba7: Status 404 returned error can't find the container with id affc1f3266fdd3a22a30fd2b195029bd5929bc90f84bd2d459eba873863a6ba7 Apr 20 07:41:01.178849 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:01.178832 2543 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 20 07:41:01.685430 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:01.685387 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-7rtl7" event={"ID":"d5ce935a-2a69-4d13-ab13-6a13fac51097","Type":"ContainerStarted","Data":"148fa224f85f944103315543e2465979ecd6d21a3435a76d4ea7fc06b6300a9e"} Apr 20 07:41:01.685430 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:01.685429 2543 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-7rtl7" event={"ID":"d5ce935a-2a69-4d13-ab13-6a13fac51097","Type":"ContainerStarted","Data":"affc1f3266fdd3a22a30fd2b195029bd5929bc90f84bd2d459eba873863a6ba7"} Apr 20 07:41:01.685670 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:01.685514 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-7rtl7" Apr 20 07:41:01.701702 ip-10-0-130-105 
kubenswrapper[2543]: I0420 07:41:01.701634 2543 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-7rtl7" podStartSLOduration=1.701618479 podStartE2EDuration="1.701618479s" podCreationTimestamp="2026-04-20 07:41:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-20 07:41:01.700232753 +0000 UTC m=+2296.520301088" watchObservedRunningTime="2026-04-20 07:41:01.701618479 +0000 UTC m=+2296.521686815" Apr 20 07:41:02.761555 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:02.761526 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5kxb2_ff2f3e47-aed8-4328-8fce-7d39dbc0d939/dns/0.log" Apr 20 07:41:02.780829 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:02.780805 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-5kxb2_ff2f3e47-aed8-4328-8fce-7d39dbc0d939/kube-rbac-proxy/0.log" Apr 20 07:41:02.909307 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:02.909268 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-m98bf_2cb0ae1e-b4a8-4cf9-ac1d-3d8fa3a4cc47/dns-node-resolver/0.log" Apr 20 07:41:03.376145 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:03.376100 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-86468fb89f-ptbl4_4346ef67-5625-4eac-bc03-69ca61915d3a/registry/0.log" Apr 20 07:41:03.416586 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:03.416558 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-d7hcj_dcda05ad-f9cc-4648-90c6-e224049a2518/node-ca/0.log" Apr 20 07:41:04.244585 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:04.244558 2543 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_data-science-gateway-data-science-gateway-class-55cc67557fj4rwf_f98f78e6-c615-4e4b-8ec6-0c9361e9f6ed/istio-proxy/0.log" Apr 20 07:41:04.515025 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:04.514944 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_maas-default-gateway-openshift-default-845c6b4b48-wvtr4_73906400-114f-406d-aa5d-b27617fa0457/istio-proxy/0.log" Apr 20 07:41:05.084295 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:05.084255 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-sw57v_7a3a9194-e692-46ef-949c-604a76b49aad/serve-healthcheck-canary/0.log" Apr 20 07:41:05.572509 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:05.572479 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fd69r_768e3c1e-465b-460a-a6b0-4f956e56e270/kube-rbac-proxy/0.log" Apr 20 07:41:05.592076 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:05.592051 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fd69r_768e3c1e-465b-460a-a6b0-4f956e56e270/exporter/0.log" Apr 20 07:41:05.611567 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:05.611546 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-fd69r_768e3c1e-465b-460a-a6b0-4f956e56e270/extractor/0.log" Apr 20 07:41:07.654268 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:07.654243 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-966769f95-qcm9x_a87a2c0d-8732-425f-a4cb-36f5fb3d4df6/maas-api/0.log" Apr 20 07:41:07.701190 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:07.701166 2543 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xfwg2/perf-node-gather-daemonset-7rtl7" Apr 20 07:41:07.872546 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:07.872516 
2543 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-6d65d76454-l6j6d_35209c86-3a05-4072-8e51-0acbba6419bd/manager/0.log" Apr 20 07:41:09.082834 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:09.082808 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-7fd89bcbc4-vqr2p_ba26dfdd-dc3c-4027-9922-6ef89186d95d/manager/0.log" Apr 20 07:41:13.481147 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:13.481120 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-ftkkp_72dd3d8d-7973-437e-b710-621b728bcecb/migrator/0.log" Apr 20 07:41:13.500689 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:13.500664 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-ftkkp_72dd3d8d-7973-437e-b710-621b728bcecb/graceful-termination/0.log" Apr 20 07:41:15.019149 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:15.019119 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nfcgq_3b7c7e9a-ffad-4d5e-a28c-965555ef617c/kube-multus-additional-cni-plugins/0.log" Apr 20 07:41:15.043409 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:15.043385 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nfcgq_3b7c7e9a-ffad-4d5e-a28c-965555ef617c/egress-router-binary-copy/0.log" Apr 20 07:41:15.064589 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:15.064565 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nfcgq_3b7c7e9a-ffad-4d5e-a28c-965555ef617c/cni-plugins/0.log" Apr 20 07:41:15.083940 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:15.083923 2543 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nfcgq_3b7c7e9a-ffad-4d5e-a28c-965555ef617c/bond-cni-plugin/0.log" Apr 20 07:41:15.103481 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:15.103460 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nfcgq_3b7c7e9a-ffad-4d5e-a28c-965555ef617c/routeoverride-cni/0.log" Apr 20 07:41:15.122905 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:15.122879 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nfcgq_3b7c7e9a-ffad-4d5e-a28c-965555ef617c/whereabouts-cni-bincopy/0.log" Apr 20 07:41:15.141568 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:15.141547 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-nfcgq_3b7c7e9a-ffad-4d5e-a28c-965555ef617c/whereabouts-cni/0.log" Apr 20 07:41:15.378718 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:15.375662 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-j9xjk_a03fabd4-3e41-402b-bcc3-7e6cdff4e5cf/kube-multus/0.log" Apr 20 07:41:15.434030 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:15.434004 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wgd5x_4ffbcc28-a10c-467a-a9e7-e31e20e4975e/network-metrics-daemon/0.log" Apr 20 07:41:15.451863 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:15.451841 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-wgd5x_4ffbcc28-a10c-467a-a9e7-e31e20e4975e/kube-rbac-proxy/0.log" Apr 20 07:41:16.266346 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:16.266320 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/ovn-controller/0.log" Apr 20 07:41:16.284051 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:16.284020 2543 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/ovn-acl-logging/0.log" Apr 20 07:41:16.293353 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:16.293334 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/ovn-acl-logging/1.log" Apr 20 07:41:16.309563 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:16.309541 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/kube-rbac-proxy-node/0.log" Apr 20 07:41:16.329573 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:16.329550 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/kube-rbac-proxy-ovn-metrics/0.log" Apr 20 07:41:16.349047 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:16.349027 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/northd/0.log" Apr 20 07:41:16.368111 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:16.368091 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/nbdb/0.log" Apr 20 07:41:16.387066 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:16.387051 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/sbdb/0.log" Apr 20 07:41:16.479531 ip-10-0-130-105 kubenswrapper[2543]: I0420 07:41:16.479504 2543 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-5bfnd_13fb38e8-b7f0-4b6a-94e1-f8426903d19f/ovnkube-controller/0.log"