Apr 22 16:18:39.248201 ip-10-0-137-144 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 16:18:39.248213 ip-10-0-137-144 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 16:18:39.248222 ip-10-0-137-144 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 16:18:39.248554 ip-10-0-137-144 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 16:18:49.473386 ip-10-0-137-144 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 16:18:49.473403 ip-10-0-137-144 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 83fbcdff0b9a478a82f14646624d1d51 --
Apr 22 16:21:14.345141 ip-10-0-137-144 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 16:21:14.777057 ip-10-0-137-144 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 16:21:14.777057 ip-10-0-137-144 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 16:21:14.777057 ip-10-0-137-144 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 16:21:14.777057 ip-10-0-137-144 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 16:21:14.777057 ip-10-0-137-144 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 16:21:14.780564 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.780466 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 16:21:14.783788 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783770 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 16:21:14.783788 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783788 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 16:21:14.783871 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783792 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 16:21:14.783871 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783795 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 16:21:14.783871 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783798 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 16:21:14.783871 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783801 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 16:21:14.783871 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783805 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 16:21:14.783871 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783808 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 16:21:14.783871 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783810 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 16:21:14.783871 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783818 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 16:21:14.783871 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783821 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 16:21:14.783871 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783824 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 16:21:14.783871 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783827 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 16:21:14.783871 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783830 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 16:21:14.783871 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783832 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 16:21:14.783871 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783835 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 16:21:14.783871 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783838 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 16:21:14.783871 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783853 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 16:21:14.783871 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783856 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 16:21:14.783871 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783860 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 16:21:14.783871 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783863 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 16:21:14.783871 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783866 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 16:21:14.784351 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783868 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 16:21:14.784351 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783871 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 16:21:14.784351 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783873 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 16:21:14.784351 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783876 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 16:21:14.784351 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783879 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 16:21:14.784351 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783883 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 16:21:14.784351 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783886 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 16:21:14.784351 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783888 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 16:21:14.784351 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783891 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 16:21:14.784351 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783893 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 16:21:14.784351 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783896 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 16:21:14.784351 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783898 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 16:21:14.784351 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783901 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 16:21:14.784351 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783903 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 16:21:14.784351 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783906 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 16:21:14.784351 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783909 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 16:21:14.784351 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783911 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 16:21:14.784351 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783914 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 16:21:14.784351 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783917 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 16:21:14.784351 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783919 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 16:21:14.784829 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783924 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 16:21:14.784829 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783928 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 16:21:14.784829 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783931 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 16:21:14.784829 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783934 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 16:21:14.784829 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783937 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 16:21:14.784829 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783940 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 16:21:14.784829 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783942 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 16:21:14.784829 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783945 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 16:21:14.784829 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783949 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 16:21:14.784829 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783951 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 16:21:14.784829 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783954 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 16:21:14.784829 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783957 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 16:21:14.784829 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783959 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 16:21:14.784829 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783963 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 16:21:14.784829 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783966 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 16:21:14.784829 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783969 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 16:21:14.784829 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783971 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 16:21:14.784829 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783974 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 16:21:14.784829 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783977 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 16:21:14.785309 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783979 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 16:21:14.785309 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783982 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 16:21:14.785309 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783984 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 16:21:14.785309 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783987 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 16:21:14.785309 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783989 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 16:21:14.785309 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783993 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 16:21:14.785309 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783995 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 16:21:14.785309 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.783997 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 16:21:14.785309 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784000 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 16:21:14.785309 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784003 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 16:21:14.785309 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784005 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 16:21:14.785309 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784008 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 16:21:14.785309 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784010 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 16:21:14.785309 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784016 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 16:21:14.785309 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784020 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 16:21:14.785309 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784023 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 16:21:14.785309 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784026 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 16:21:14.785309 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784029 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 16:21:14.785309 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784032 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 16:21:14.785766 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784035 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 16:21:14.785766 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784037 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 16:21:14.785766 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784040 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 16:21:14.785766 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784043 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 16:21:14.785766 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784045 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 16:21:14.785766 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784048 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 16:21:14.785766 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784442 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 16:21:14.785766 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784448 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 16:21:14.785766 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784450 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 16:21:14.785766 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784454 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 16:21:14.785766 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784456 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 16:21:14.785766 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784459 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 16:21:14.785766 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784462 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 16:21:14.785766 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784464 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 16:21:14.785766 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784467 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 16:21:14.785766 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784469 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 16:21:14.785766 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784472 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 16:21:14.785766 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784475 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 16:21:14.785766 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784477 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 16:21:14.785766 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784481 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 16:21:14.786260 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784485 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 16:21:14.786260 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784487 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 16:21:14.786260 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784490 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 16:21:14.786260 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784493 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 16:21:14.786260 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784496 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 16:21:14.786260 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784498 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 16:21:14.786260 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784501 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 16:21:14.786260 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784504 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 16:21:14.786260 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784507 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 16:21:14.786260 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784509 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 16:21:14.786260 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784512 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 16:21:14.786260 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784515 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 16:21:14.786260 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784519 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 16:21:14.786260 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784522 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 16:21:14.786260 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784524 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 16:21:14.786260 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784527 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 16:21:14.786260 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784529 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 16:21:14.786260 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784532 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 16:21:14.786260 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784534 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 16:21:14.786767 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784537 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 16:21:14.786767 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784539 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 16:21:14.786767 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784541 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 16:21:14.786767 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784544 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 16:21:14.786767 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784547 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 16:21:14.786767 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784549 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 16:21:14.786767 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784552 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 16:21:14.786767 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784555 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 16:21:14.786767 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784557 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 16:21:14.786767 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784559 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 16:21:14.786767 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784562 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 16:21:14.786767 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784564 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 16:21:14.786767 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784567 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 16:21:14.786767 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784569 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 16:21:14.786767 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784573 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 16:21:14.786767 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784575 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 16:21:14.786767 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784578 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 16:21:14.786767 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784582 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 16:21:14.786767 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784585 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 16:21:14.787265 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784589 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 16:21:14.787265 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784592 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 16:21:14.787265 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784595 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 16:21:14.787265 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784598 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 16:21:14.787265 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784600 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 16:21:14.787265 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784603 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 16:21:14.787265 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784606 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 16:21:14.787265 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784609 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 16:21:14.787265 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784611 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 16:21:14.787265 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784614 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 16:21:14.787265 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784617 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 16:21:14.787265 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784620 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 16:21:14.787265 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784622 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 16:21:14.787265 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784625 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 16:21:14.787265 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784628 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 16:21:14.787265 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784631 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 16:21:14.787265 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784633 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 16:21:14.787265 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784636 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 16:21:14.787265 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784638 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 16:21:14.787265 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784640 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 16:21:14.787752 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784643 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 16:21:14.787752 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784646 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 16:21:14.787752 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784648 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 16:21:14.787752 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784651 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 16:21:14.787752 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784653 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 16:21:14.787752 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784656 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 16:21:14.787752 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784658 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 16:21:14.787752 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784661 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 16:21:14.787752 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784667 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 16:21:14.787752 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784670 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 16:21:14.787752 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784672 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 16:21:14.787752 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784675 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 16:21:14.787752 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784677 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 16:21:14.787752 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.784680 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 16:21:14.787752 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786022 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 16:21:14.787752 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786035 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 16:21:14.787752 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786042 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 16:21:14.787752 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786048 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 16:21:14.787752 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786053 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 16:21:14.787752 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786056 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 16:21:14.787752 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786062 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 16:21:14.788303 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786066 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 16:21:14.788303 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786070 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 16:21:14.788303 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786073 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 16:21:14.788303 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786077 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 16:21:14.788303 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786080 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 16:21:14.788303 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786083 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 16:21:14.788303 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786086 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 22 16:21:14.788303 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786089 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 16:21:14.788303 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786093 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 22 16:21:14.788303 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786095 2578 flags.go:64] FLAG: --cloud-config=""
Apr 22 16:21:14.788303 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786098 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 16:21:14.788303 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786101 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 16:21:14.788303 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786105 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 22 16:21:14.788303 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786108 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 16:21:14.788303 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786111 2578 flags.go:64] FLAG: --config-dir=""
Apr 22 16:21:14.788303 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786114 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 16:21:14.788303 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786117 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 16:21:14.788303 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786122 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 16:21:14.788303 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786125 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 16:21:14.788303 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786131 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 16:21:14.788303 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786134 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 16:21:14.788303 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786137 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 16:21:14.788303 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786140 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 16:21:14.788303 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786144 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 16:21:14.788894 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786147 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 16:21:14.788894 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786150 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 16:21:14.788894 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786155 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 16:21:14.788894 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786158 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 16:21:14.788894 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786162 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 16:21:14.788894 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786165 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 16:21:14.788894 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786169 2578 flags.go:64] FLAG: --enable-server="true"
Apr 22 16:21:14.788894 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786172 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 16:21:14.788894 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786176 2578 flags.go:64] FLAG: --event-burst="100"
Apr 22 16:21:14.788894 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786179 2578 flags.go:64] FLAG: --event-qps="50"
Apr 22 16:21:14.788894 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786182 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 16:21:14.788894 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786186 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 16:21:14.788894 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786189 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 22 16:21:14.788894 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786193 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 16:21:14.788894 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786196 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 16:21:14.788894
ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786200 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 22 16:21:14.788894 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786203 2578 flags.go:64] FLAG: --eviction-soft="" Apr 22 16:21:14.788894 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786206 2578 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 22 16:21:14.788894 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786209 2578 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 22 16:21:14.788894 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786212 2578 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 16:21:14.788894 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786215 2578 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 16:21:14.788894 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786223 2578 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 16:21:14.788894 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786226 2578 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 16:21:14.788894 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786229 2578 flags.go:64] FLAG: --feature-gates="" Apr 22 16:21:14.788894 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786233 2578 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 16:21:14.789493 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786237 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 16:21:14.789493 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786241 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 16:21:14.789493 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786245 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 16:21:14.789493 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786249 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 22 16:21:14.789493 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786252 2578 flags.go:64] 
FLAG: --help="false" Apr 22 16:21:14.789493 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786255 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-137-144.ec2.internal" Apr 22 16:21:14.789493 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786258 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 16:21:14.789493 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786262 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 16:21:14.789493 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786265 2578 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 16:21:14.789493 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786269 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 16:21:14.789493 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786272 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 16:21:14.789493 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786276 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 16:21:14.789493 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786279 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 16:21:14.789493 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786282 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 16:21:14.789493 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786285 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 16:21:14.789493 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786288 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 16:21:14.789493 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786292 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 16:21:14.789493 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786294 2578 flags.go:64] FLAG: --kube-reserved="" Apr 22 16:21:14.789493 ip-10-0-137-144 
kubenswrapper[2578]: I0422 16:21:14.786297 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 16:21:14.789493 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786300 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 16:21:14.789493 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786304 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 16:21:14.789493 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786307 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 16:21:14.789493 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786310 2578 flags.go:64] FLAG: --lock-file="" Apr 22 16:21:14.789493 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786313 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 16:21:14.790132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786316 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 16:21:14.790132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786319 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 16:21:14.790132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786329 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 16:21:14.790132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786332 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 16:21:14.790132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786335 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 16:21:14.790132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786338 2578 flags.go:64] FLAG: --logging-format="text" Apr 22 16:21:14.790132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786341 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 16:21:14.790132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786345 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 16:21:14.790132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786348 2578 flags.go:64] 
FLAG: --manifest-url="" Apr 22 16:21:14.790132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786351 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 22 16:21:14.790132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786357 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 16:21:14.790132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786360 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 16:21:14.790132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786365 2578 flags.go:64] FLAG: --max-pods="110" Apr 22 16:21:14.790132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786368 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 16:21:14.790132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786371 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 16:21:14.790132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786374 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 16:21:14.790132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786377 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 16:21:14.790132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786380 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 16:21:14.790132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786383 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 16:21:14.790132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786386 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 16:21:14.790132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786394 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 16:21:14.790132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786398 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 16:21:14.790132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786401 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 
16:21:14.790132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786404 2578 flags.go:64] FLAG: --pod-cidr="" Apr 22 16:21:14.790709 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786407 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 16:21:14.790709 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786412 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 16:21:14.790709 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786415 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 16:21:14.790709 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786418 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 22 16:21:14.790709 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786421 2578 flags.go:64] FLAG: --port="10250" Apr 22 16:21:14.790709 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786425 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 16:21:14.790709 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786428 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0a07b9b52cde15d65" Apr 22 16:21:14.790709 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786431 2578 flags.go:64] FLAG: --qos-reserved="" Apr 22 16:21:14.790709 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786435 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 22 16:21:14.790709 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786438 2578 flags.go:64] FLAG: --register-node="true" Apr 22 16:21:14.790709 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786440 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 22 16:21:14.790709 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786443 2578 flags.go:64] FLAG: --register-with-taints="" Apr 22 16:21:14.790709 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786447 2578 flags.go:64] FLAG: --registry-burst="10" Apr 22 16:21:14.790709 ip-10-0-137-144 
kubenswrapper[2578]: I0422 16:21:14.786450 2578 flags.go:64] FLAG: --registry-qps="5" Apr 22 16:21:14.790709 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786453 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 22 16:21:14.790709 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786456 2578 flags.go:64] FLAG: --reserved-memory="" Apr 22 16:21:14.790709 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786459 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 16:21:14.790709 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786462 2578 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 16:21:14.790709 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786466 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 16:21:14.790709 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786470 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 16:21:14.790709 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786473 2578 flags.go:64] FLAG: --runonce="false" Apr 22 16:21:14.790709 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786476 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 16:21:14.790709 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786480 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 16:21:14.790709 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786483 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 22 16:21:14.790709 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786486 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 16:21:14.791367 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786489 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 16:21:14.791367 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786492 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 16:21:14.791367 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786495 2578 flags.go:64] FLAG: 
--storage-driver-host="localhost:8086" Apr 22 16:21:14.791367 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786499 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 16:21:14.791367 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786502 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 16:21:14.791367 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786506 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 16:21:14.791367 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786509 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 16:21:14.791367 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786511 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 16:21:14.791367 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786514 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 16:21:14.791367 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786518 2578 flags.go:64] FLAG: --system-cgroups="" Apr 22 16:21:14.791367 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786521 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 16:21:14.791367 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786526 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 16:21:14.791367 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786529 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 22 16:21:14.791367 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786532 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 16:21:14.791367 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786535 2578 flags.go:64] FLAG: --tls-min-version="" Apr 22 16:21:14.791367 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786538 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 16:21:14.791367 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786541 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 16:21:14.791367 ip-10-0-137-144 
kubenswrapper[2578]: I0422 16:21:14.786544 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 16:21:14.791367 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786547 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 16:21:14.791367 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786550 2578 flags.go:64] FLAG: --v="2" Apr 22 16:21:14.791367 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786554 2578 flags.go:64] FLAG: --version="false" Apr 22 16:21:14.791367 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786559 2578 flags.go:64] FLAG: --vmodule="" Apr 22 16:21:14.791367 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786563 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 16:21:14.791367 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.786566 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 16:21:14.791367 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786669 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 16:21:14.791989 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786673 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 16:21:14.791989 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786677 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 16:21:14.791989 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786680 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 16:21:14.791989 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786683 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 16:21:14.791989 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786685 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 16:21:14.791989 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786688 2578 feature_gate.go:328] unrecognized 
feature gate: InsightsConfig Apr 22 16:21:14.791989 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786692 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 16:21:14.791989 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786696 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 16:21:14.791989 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786698 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 16:21:14.791989 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786701 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 16:21:14.791989 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786704 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 16:21:14.791989 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786708 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 16:21:14.791989 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786712 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 16:21:14.791989 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786715 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 16:21:14.791989 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786718 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 16:21:14.791989 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786720 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 16:21:14.791989 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786724 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 16:21:14.791989 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786728 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 16:21:14.791989 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786731 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 16:21:14.792449 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786734 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 16:21:14.792449 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786737 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 16:21:14.792449 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786740 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 16:21:14.792449 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786743 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 16:21:14.792449 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786746 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 16:21:14.792449 ip-10-0-137-144 
kubenswrapper[2578]: W0422 16:21:14.786749 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 16:21:14.792449 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786751 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 16:21:14.792449 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786754 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 16:21:14.792449 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786757 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 16:21:14.792449 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786760 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 16:21:14.792449 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786763 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 16:21:14.792449 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786765 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 16:21:14.792449 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786768 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 16:21:14.792449 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786770 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 16:21:14.792449 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786774 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 16:21:14.792449 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786777 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 16:21:14.792449 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786780 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 16:21:14.792449 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786783 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages 
Apr 22 16:21:14.792449 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786786 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 16:21:14.792449 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786788 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 16:21:14.793060 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786809 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 16:21:14.793060 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786813 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 16:21:14.793060 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786816 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 16:21:14.793060 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786819 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 16:21:14.793060 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786822 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 16:21:14.793060 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786824 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 16:21:14.793060 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786827 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 16:21:14.793060 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786830 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 16:21:14.793060 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786833 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 16:21:14.793060 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786836 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 16:21:14.793060 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786852 2578 feature_gate.go:328] 
unrecognized feature gate: ExternalSnapshotMetadata Apr 22 16:21:14.793060 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786855 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 16:21:14.793060 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786858 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 16:21:14.793060 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786861 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 16:21:14.793060 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786863 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 16:21:14.793060 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786866 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 16:21:14.793060 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786869 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 16:21:14.793060 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786871 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 16:21:14.793060 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786874 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 16:21:14.793060 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786877 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 16:21:14.793826 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786879 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 16:21:14.793826 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786882 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 16:21:14.793826 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786885 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 22 16:21:14.793826 ip-10-0-137-144 kubenswrapper[2578]: W0422 
16:21:14.786887 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 16:21:14.793826 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786890 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 16:21:14.793826 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786892 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 16:21:14.793826 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786896 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 16:21:14.793826 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786899 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 16:21:14.793826 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786901 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 16:21:14.793826 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786904 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 16:21:14.793826 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786907 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 16:21:14.793826 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786909 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 16:21:14.793826 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786912 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 16:21:14.793826 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786914 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 16:21:14.793826 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786917 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 16:21:14.793826 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786920 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 16:21:14.793826 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786922 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 16:21:14.793826 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786925 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 16:21:14.793826 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786927 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 16:21:14.793826 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786931 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 16:21:14.794442 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786934 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 16:21:14.794442 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786936 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 16:21:14.794442 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786941 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 16:21:14.794442 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786943 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 16:21:14.794442 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786946 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 16:21:14.794442 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.786949 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 16:21:14.794442 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.787723 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 16:21:14.795574 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.795553 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 22 16:21:14.795616 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.795576 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 22 16:21:14.795650 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795626 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 16:21:14.795650 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795632 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 16:21:14.795650 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795635 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 16:21:14.795650 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795638 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 16:21:14.795650 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795641 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 16:21:14.795650 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795645 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 16:21:14.795650 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795648 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 16:21:14.795650 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795651 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 16:21:14.795860 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795655 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 16:21:14.795860 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795658 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 16:21:14.795860 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795662 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 16:21:14.795860 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795666 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 16:21:14.795860 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795669 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 16:21:14.795860 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795672 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 16:21:14.795860 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795675 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 16:21:14.795860 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795678 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 16:21:14.795860 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795680 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 16:21:14.795860 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795683 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 16:21:14.795860 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795686 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 16:21:14.795860 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795688 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 16:21:14.795860 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795696 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 16:21:14.795860 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795700 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 16:21:14.795860 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795702 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 16:21:14.795860 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795705 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 16:21:14.795860 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795708 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 16:21:14.795860 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795710 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 16:21:14.795860 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795713 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 16:21:14.796365 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795716 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 16:21:14.796365 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795718 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 16:21:14.796365 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795721 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 16:21:14.796365 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795723 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 16:21:14.796365 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795726 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 16:21:14.796365 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795729 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 16:21:14.796365 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795731 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 16:21:14.796365 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795734 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 16:21:14.796365 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795736 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 16:21:14.796365 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795739 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 16:21:14.796365 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795742 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 16:21:14.796365 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795744 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 16:21:14.796365 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795747 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 16:21:14.796365 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795749 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 16:21:14.796365 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795753 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 16:21:14.796365 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795756 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 16:21:14.796365 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795760 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 16:21:14.796365 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795762 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 16:21:14.796365 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795765 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 16:21:14.796365 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795767 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 16:21:14.796883 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795770 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 16:21:14.796883 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795773 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 16:21:14.796883 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795775 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 16:21:14.796883 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795778 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 16:21:14.796883 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795780 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 16:21:14.796883 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795782 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 16:21:14.796883 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795786 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 16:21:14.796883 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795791 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 16:21:14.796883 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795794 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 16:21:14.796883 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795797 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 16:21:14.796883 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795800 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 16:21:14.796883 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795803 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 16:21:14.796883 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795806 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 16:21:14.796883 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795809 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 16:21:14.796883 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795811 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 16:21:14.796883 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795814 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 16:21:14.796883 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795817 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 16:21:14.796883 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795819 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 16:21:14.796883 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795822 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 16:21:14.797353 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795824 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 16:21:14.797353 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795827 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 16:21:14.797353 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795830 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 16:21:14.797353 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795833 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 16:21:14.797353 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795835 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 16:21:14.797353 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795838 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 16:21:14.797353 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795856 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 16:21:14.797353 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795862 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 16:21:14.797353 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795866 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 16:21:14.797353 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795869 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 16:21:14.797353 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795873 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 16:21:14.797353 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795876 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 16:21:14.797353 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795878 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 16:21:14.797353 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795881 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 16:21:14.797353 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795884 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 16:21:14.797353 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795886 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 16:21:14.797353 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795889 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 16:21:14.797353 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795892 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 16:21:14.797353 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795894 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 16:21:14.797832 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.795897 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 16:21:14.797832 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.795902 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 16:21:14.797832 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796000 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 16:21:14.797832 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796005 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 16:21:14.797832 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796009 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 16:21:14.797832 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796011 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 16:21:14.797832 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796014 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 16:21:14.797832 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796017 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 16:21:14.797832 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796019 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 16:21:14.797832 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796022 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 16:21:14.797832 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796025 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 16:21:14.797832 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796027 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 16:21:14.797832 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796030 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 16:21:14.797832 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796032 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 16:21:14.797832 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796035 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 16:21:14.797832 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796038 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 16:21:14.798242 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796040 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 16:21:14.798242 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796043 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 16:21:14.798242 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796046 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 16:21:14.798242 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796049 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 16:21:14.798242 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796052 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 16:21:14.798242 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796054 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 16:21:14.798242 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796057 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 16:21:14.798242 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796060 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 16:21:14.798242 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796064 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 16:21:14.798242 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796067 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 16:21:14.798242 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796069 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 16:21:14.798242 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796072 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 16:21:14.798242 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796075 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 16:21:14.798242 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796077 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 16:21:14.798242 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796080 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 16:21:14.798242 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796082 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 16:21:14.798242 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796085 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 16:21:14.798242 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796087 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 16:21:14.798242 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796090 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 16:21:14.798242 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796093 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 16:21:14.798724 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796095 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 16:21:14.798724 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796098 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 16:21:14.798724 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796101 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 16:21:14.798724 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796103 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 16:21:14.798724 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796106 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 16:21:14.798724 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796109 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 16:21:14.798724 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796111 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 16:21:14.798724 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796114 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 16:21:14.798724 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796116 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 16:21:14.798724 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796120 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 16:21:14.798724 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796124 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 16:21:14.798724 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796127 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 16:21:14.798724 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796129 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 16:21:14.798724 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796132 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 16:21:14.798724 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796135 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 16:21:14.798724 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796138 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 16:21:14.798724 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796141 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 16:21:14.798724 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796144 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 16:21:14.798724 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796147 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 16:21:14.799317 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796150 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 16:21:14.799317 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796153 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 16:21:14.799317 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796157 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 16:21:14.799317 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796159 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 16:21:14.799317 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796162 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 16:21:14.799317 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796164 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 16:21:14.799317 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796167 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 16:21:14.799317 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796169 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 16:21:14.799317 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796172 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 16:21:14.799317 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796175 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 16:21:14.799317 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796177 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 16:21:14.799317 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796180 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 16:21:14.799317 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796182 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 16:21:14.799317 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796185 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 16:21:14.799317 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796187 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 16:21:14.799317 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796190 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 16:21:14.799317 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796193 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 16:21:14.799317 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796195 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 16:21:14.799317 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796198 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 16:21:14.799777 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796201 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 16:21:14.799777 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796205 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 16:21:14.799777 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796208 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 16:21:14.799777 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796210 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 16:21:14.799777 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796213 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 16:21:14.799777 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796216 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 16:21:14.799777 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796218 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 16:21:14.799777 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796220 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 16:21:14.799777 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796223 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 16:21:14.799777 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796226 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 16:21:14.799777 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796229 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 16:21:14.799777 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796232 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 16:21:14.799777 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796234 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 16:21:14.799777 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:14.796237 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 16:21:14.799777 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.796242 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 22 16:21:14.800167 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.796952 2578 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 22 16:21:14.800167 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.799006 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 22 16:21:14.800167 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.799948 2578 server.go:1019] "Starting client certificate rotation"
Apr 22 16:21:14.800167 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.800046 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 16:21:14.800167 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.800081 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 22 16:21:14.822803 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.822782 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 16:21:14.825717 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.825698 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 22 16:21:14.843973 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.843946 2578 log.go:25] "Validated CRI v1 runtime API"
Apr 22 16:21:14.849492 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.849474 2578 log.go:25] "Validated CRI v1 image API"
Apr 22 16:21:14.851781 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.851761 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 22 16:21:14.854397 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.854375 2578 fs.go:135] Filesystem UUIDs: map[08898d9a-8bec-45a0-baec-339915bb943b:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 f8c67fe5-1f3a-4806-abf2-cbc62996b293:/dev/nvme0n1p3]
Apr 22 16:21:14.854472 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.854396 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 16:21:14.858942 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.858917 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 16:21:14.862192 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.862080 2578 manager.go:217] Machine: {Timestamp:2026-04-22 16:21:14.86018304 +0000 UTC m=+0.396913743 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3092517 MemoryCapacity:33164480512 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2c3ac8f7098079461065db774b5d00 SystemUUID:ec2c3ac8-f709-8079-4610-65db774b5d00 BootID:83fbcdff-0b9a-478a-82f1-4646624d1d51 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582238208 Type:vfs Inodes:4048398 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:93:19:d6:f4:41 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:93:19:d6:f4:41 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:2a:5d:f5:bb:49:6c Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164480512 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 16:21:14.862192 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.862181 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 16:21:14.862362 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.862295 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 16:21:14.864026 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.863996 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 16:21:14.864197 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.864030 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-144.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"L
essThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 22 16:21:14.864276 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.864213 2578 topology_manager.go:138] "Creating topology manager with none policy" Apr 22 16:21:14.864276 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.864226 2578 container_manager_linux.go:306] "Creating device plugin manager" Apr 22 16:21:14.864276 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.864244 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 16:21:14.865896 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.865882 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 22 16:21:14.867266 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.867254 2578 state_mem.go:36] "Initialized new in-memory state store" Apr 22 16:21:14.867408 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.867397 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 22 16:21:14.870107 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.870096 2578 kubelet.go:491] "Attempting to sync node with API server" Apr 22 16:21:14.870163 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.870114 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 22 16:21:14.870163 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.870134 2578 
file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 22 16:21:14.870163 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.870149 2578 kubelet.go:397] "Adding apiserver pod source" Apr 22 16:21:14.870163 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.870160 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 22 16:21:14.871227 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.871214 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 16:21:14.871291 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.871237 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 22 16:21:14.874224 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.874202 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 22 16:21:14.875651 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.875638 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 22 16:21:14.877382 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.877367 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 22 16:21:14.877382 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.877385 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 22 16:21:14.877482 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.877392 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 22 16:21:14.877482 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.877398 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 22 16:21:14.877482 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.877403 2578 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/nfs" Apr 22 16:21:14.877482 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.877409 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 22 16:21:14.877482 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.877415 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 22 16:21:14.877482 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.877421 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 22 16:21:14.877482 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.877428 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 22 16:21:14.877482 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.877435 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 22 16:21:14.877482 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.877452 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 22 16:21:14.877482 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.877461 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 22 16:21:14.878502 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.878493 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 22 16:21:14.878502 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.878502 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 22 16:21:14.881971 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.881955 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 22 16:21:14.882065 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.881993 2578 server.go:1295] "Started kubelet" Apr 22 16:21:14.882153 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.882098 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 22 16:21:14.882191 ip-10-0-137-144 kubenswrapper[2578]: I0422 
16:21:14.882150 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 22 16:21:14.882191 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.882181 2578 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 22 16:21:14.882972 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:14.882951 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 22 16:21:14.883046 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.883006 2578 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-144.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 22 16:21:14.883062 ip-10-0-137-144 systemd[1]: Started Kubernetes Kubelet. 
Apr 22 16:21:14.883174 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:14.883104 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 22 16:21:14.883364 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.883349 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 22 16:21:14.884830 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.884815 2578 server.go:317] "Adding debug handlers to kubelet server"
Apr 22 16:21:14.887962 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:14.886907 2578 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-144.ec2.internal.18a8ba48cfa806b2 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-144.ec2.internal,UID:ip-10-0-137-144.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-144.ec2.internal,},FirstTimestamp:2026-04-22 16:21:14.881967794 +0000 UTC m=+0.418698497,LastTimestamp:2026-04-22 16:21:14.881967794 +0000 UTC m=+0.418698497,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-144.ec2.internal,}"
Apr 22 16:21:14.889305 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.889289 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 16:21:14.889865 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.889828 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 16:21:14.890450 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:14.890431 2578 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 16:21:14.890628 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.890615 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 16:21:14.890687 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.890621 2578 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 16:21:14.890687 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.890668 2578 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 16:21:14.890785 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.890732 2578 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 16:21:14.890785 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.890742 2578 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 16:21:14.890885 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:14.890817 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-144.ec2.internal\" not found"
Apr 22 16:21:14.892275 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.892090 2578 factory.go:55] Registering systemd factory
Apr 22 16:21:14.892378 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.892334 2578 factory.go:223] Registration of the systemd container factory successfully
Apr 22 16:21:14.892378 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:14.892160 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 22 16:21:14.892885 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.892871 2578 factory.go:153] Registering CRI-O factory
Apr 22 16:21:14.892956 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.892889 2578 factory.go:223] Registration of the crio container factory successfully
Apr 22 16:21:14.892993 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.892980 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 16:21:14.893022 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.893009 2578 factory.go:103] Registering Raw factory
Apr 22 16:21:14.893060 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.893026 2578 manager.go:1196] Started watching for new ooms in manager
Apr 22 16:21:14.893717 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.893697 2578 manager.go:319] Starting recovery of all containers
Apr 22 16:21:14.897357 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.897333 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9qhw2"
Apr 22 16:21:14.903369 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:14.903340 2578 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-137-144.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 22 16:21:14.904363 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.904340 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9qhw2"
Apr 22 16:21:14.905498 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.905481 2578 manager.go:324] Recovery completed
Apr 22 16:21:14.907098 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:14.907074 2578 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 22 16:21:14.910005 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.909945 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 16:21:14.914790 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.914773 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 16:21:14.914880 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.914801 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 16:21:14.914880 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.914819 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-144.ec2.internal" event="NodeHasSufficientPID"
Apr 22 16:21:14.915439 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.915423 2578 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 22 16:21:14.915439 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.915435 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 22 16:21:14.915529 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.915470 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 16:21:14.917664 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.917646 2578 policy_none.go:49] "None policy: Start"
Apr 22 16:21:14.917664 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.917663 2578 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 22 16:21:14.917806 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.917677 2578 state_mem.go:35] "Initializing new in-memory state store"
Apr 22 16:21:14.961520 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.961497 2578 manager.go:341] "Starting Device Plugin manager"
Apr 22 16:21:14.963912 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:14.961541 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 22 16:21:14.963912 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.961552 2578 server.go:85] "Starting device plugin registration server"
Apr 22 16:21:14.963912 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.961854 2578 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 22 16:21:14.963912 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.961867 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 22 16:21:14.963912 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.961968 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 22 16:21:14.963912 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.962037 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 22 16:21:14.963912 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:14.962043 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 16:21:14.963912 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:14.962678 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 16:21:14.963912 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:14.962716 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-144.ec2.internal\" not found"
Apr 22 16:21:15.014198 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.014153 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 22 16:21:15.015578 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.015561 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 22 16:21:15.015679 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.015592 2578 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 22 16:21:15.015679 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.015615 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 22 16:21:15.015679 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.015625 2578 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 16:21:15.015679 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:15.015672 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 16:21:15.017814 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.017793 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 16:21:15.062497 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.062404 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 16:21:15.063423 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.063405 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 16:21:15.063519 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.063435 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 16:21:15.063519 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.063446 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-144.ec2.internal" event="NodeHasSufficientPID"
Apr 22 16:21:15.063519 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.063471 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-144.ec2.internal"
Apr 22 16:21:15.072524 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.072506 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-144.ec2.internal"
Apr 22 16:21:15.072575 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:15.072530 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-144.ec2.internal\": node \"ip-10-0-137-144.ec2.internal\" not found"
Apr 22 16:21:15.090942 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:15.090916 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-144.ec2.internal\" not found"
Apr 22 16:21:15.115744 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.115714 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-144.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-144.ec2.internal"]
Apr 22 16:21:15.115823 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.115792 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 16:21:15.116729 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.116713 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 16:21:15.116815 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.116741 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 16:21:15.116815 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.116756 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-144.ec2.internal" event="NodeHasSufficientPID"
Apr 22 16:21:15.118035 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.118020 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 16:21:15.118196 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.118182 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-144.ec2.internal"
Apr 22 16:21:15.118247 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.118217 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 16:21:15.118749 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.118732 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 16:21:15.118805 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.118754 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 16:21:15.118805 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.118766 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 16:21:15.118805 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.118774 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 16:21:15.118805 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.118778 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-144.ec2.internal" event="NodeHasSufficientPID"
Apr 22 16:21:15.118805 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.118784 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-144.ec2.internal" event="NodeHasSufficientPID"
Apr 22 16:21:15.119909 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.119895 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-144.ec2.internal"
Apr 22 16:21:15.119961 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.119922 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 16:21:15.120540 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.120527 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-144.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 16:21:15.120606 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.120550 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-144.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 16:21:15.120606 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.120565 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-144.ec2.internal" event="NodeHasSufficientPID"
Apr 22 16:21:15.153465 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:15.153439 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-144.ec2.internal\" not found" node="ip-10-0-137-144.ec2.internal"
Apr 22 16:21:15.157833 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:15.157814 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-144.ec2.internal\" not found" node="ip-10-0-137-144.ec2.internal"
Apr 22 16:21:15.191167 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:15.191139 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-144.ec2.internal\" not found"
Apr 22 16:21:15.192335 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.192317 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/829392402d3f1636d4c2da09261dbdd6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-144.ec2.internal\" (UID: \"829392402d3f1636d4c2da09261dbdd6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-144.ec2.internal"
Apr 22 16:21:15.192396 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.192348 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/829392402d3f1636d4c2da09261dbdd6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-144.ec2.internal\" (UID: \"829392402d3f1636d4c2da09261dbdd6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-144.ec2.internal"
Apr 22 16:21:15.192396 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.192370 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ec42b4252b8124bf29765450697dabd4-config\") pod \"kube-apiserver-proxy-ip-10-0-137-144.ec2.internal\" (UID: \"ec42b4252b8124bf29765450697dabd4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-144.ec2.internal"
Apr 22 16:21:15.291386 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:15.291352 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-144.ec2.internal\" not found"
Apr 22 16:21:15.292480 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.292459 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/829392402d3f1636d4c2da09261dbdd6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-144.ec2.internal\" (UID: \"829392402d3f1636d4c2da09261dbdd6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-144.ec2.internal"
Apr 22 16:21:15.292573 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.292493 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/829392402d3f1636d4c2da09261dbdd6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-144.ec2.internal\" (UID: \"829392402d3f1636d4c2da09261dbdd6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-144.ec2.internal"
Apr 22 16:21:15.292573 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.292518 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ec42b4252b8124bf29765450697dabd4-config\") pod \"kube-apiserver-proxy-ip-10-0-137-144.ec2.internal\" (UID: \"ec42b4252b8124bf29765450697dabd4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-144.ec2.internal"
Apr 22 16:21:15.292573 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.292552 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/829392402d3f1636d4c2da09261dbdd6-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-144.ec2.internal\" (UID: \"829392402d3f1636d4c2da09261dbdd6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-144.ec2.internal"
Apr 22 16:21:15.292573 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.292553 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ec42b4252b8124bf29765450697dabd4-config\") pod \"kube-apiserver-proxy-ip-10-0-137-144.ec2.internal\" (UID: \"ec42b4252b8124bf29765450697dabd4\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-144.ec2.internal"
Apr 22 16:21:15.292720 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.292554 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/829392402d3f1636d4c2da09261dbdd6-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-144.ec2.internal\" (UID: \"829392402d3f1636d4c2da09261dbdd6\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-144.ec2.internal"
Apr 22 16:21:15.392039 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:15.391966 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-144.ec2.internal\" not found"
Apr 22 16:21:15.455275 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.455235 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-144.ec2.internal"
Apr 22 16:21:15.460881 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.460854 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-144.ec2.internal"
Apr 22 16:21:15.492883 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:15.492854 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-144.ec2.internal\" not found"
Apr 22 16:21:15.593306 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:15.593269 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-144.ec2.internal\" not found"
Apr 22 16:21:15.693875 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:15.693786 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-144.ec2.internal\" not found"
Apr 22 16:21:15.707860 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.707815 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 16:21:15.794049 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:15.794012 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-144.ec2.internal\" not found"
Apr 22 16:21:15.800226 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.800208 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 16:21:15.800349 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.800328 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 16:21:15.800396 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.800374 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 16:21:15.889696 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.889666 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 16:21:15.894503 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:15.894478 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-144.ec2.internal\" not found"
Apr 22 16:21:15.902605 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.902583 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 16:21:15.906826 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.906785 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 16:16:14 +0000 UTC" deadline="2027-09-28 06:25:36.060326973 +0000 UTC"
Apr 22 16:21:15.906942 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.906826 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12566h4m20.153505626s"
Apr 22 16:21:15.922532
ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.922501 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-2slj2" Apr 22 16:21:15.930017 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.929994 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-2slj2" Apr 22 16:21:15.994610 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:15.994580 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-144.ec2.internal\" not found" Apr 22 16:21:15.994859 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:15.994819 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod829392402d3f1636d4c2da09261dbdd6.slice/crio-55942e7978a012ff5a586d6c64a2abc6be2e72326805ecec5ab5b76c8ce9ca3c WatchSource:0}: Error finding container 55942e7978a012ff5a586d6c64a2abc6be2e72326805ecec5ab5b76c8ce9ca3c: Status 404 returned error can't find the container with id 55942e7978a012ff5a586d6c64a2abc6be2e72326805ecec5ab5b76c8ce9ca3c Apr 22 16:21:15.995049 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:15.995031 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec42b4252b8124bf29765450697dabd4.slice/crio-e06ec247cbce759c643b85a95330360516a3252e392a0a0b5ce383859f83ce4f WatchSource:0}: Error finding container e06ec247cbce759c643b85a95330360516a3252e392a0a0b5ce383859f83ce4f: Status 404 returned error can't find the container with id e06ec247cbce759c643b85a95330360516a3252e392a0a0b5ce383859f83ce4f Apr 22 16:21:15.998602 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:15.998585 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 16:21:16.018257 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.018210 2578 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-144.ec2.internal" event={"ID":"829392402d3f1636d4c2da09261dbdd6","Type":"ContainerStarted","Data":"55942e7978a012ff5a586d6c64a2abc6be2e72326805ecec5ab5b76c8ce9ca3c"} Apr 22 16:21:16.019109 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.019089 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-144.ec2.internal" event={"ID":"ec42b4252b8124bf29765450697dabd4","Type":"ContainerStarted","Data":"e06ec247cbce759c643b85a95330360516a3252e392a0a0b5ce383859f83ce4f"} Apr 22 16:21:16.095475 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:16.095445 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-144.ec2.internal\" not found" Apr 22 16:21:16.110816 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.110793 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 16:21:16.190559 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.190514 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-144.ec2.internal" Apr 22 16:21:16.201935 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.201908 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 16:21:16.202969 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.202956 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-144.ec2.internal" Apr 22 16:21:16.210691 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.210676 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 
22 16:21:16.282814 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.282749 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 16:21:16.836167 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.836136 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 16:21:16.871039 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.871012 2578 apiserver.go:52] "Watching apiserver" Apr 22 16:21:16.876709 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.876686 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 16:21:16.877143 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.877123 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-s5rnp","openshift-multus/multus-jjbms","openshift-multus/network-metrics-daemon-zzttm","openshift-network-diagnostics/network-check-target-l44cq","openshift-network-operator/iptables-alerter-k7mlb","openshift-ovn-kubernetes/ovnkube-node-glprs","kube-system/konnectivity-agent-vphs5","kube-system/kube-apiserver-proxy-ip-10-0-137-144.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg","openshift-cluster-node-tuning-operator/tuned-kjcg9","openshift-image-registry/node-ca-xzk9d","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-144.ec2.internal"] Apr 22 16:21:16.879215 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.879187 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-s5rnp" Apr 22 16:21:16.880304 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.880275 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-jjbms" Apr 22 16:21:16.881513 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.881319 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 16:21:16.881513 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.881415 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzttm" Apr 22 16:21:16.881513 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.881485 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 16:21:16.881513 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:16.881500 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zzttm" podUID="01d73bcf-a30e-4dfb-ab2d-863123f999c7" Apr 22 16:21:16.881812 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.881632 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-tx5sg\"" Apr 22 16:21:16.881812 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.881742 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 16:21:16.881812 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.881778 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 16:21:16.882096 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.881817 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 16:21:16.882196 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.882174 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 16:21:16.882196 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.882184 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-8j2dt\"" Apr 22 16:21:16.882619 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.882602 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l44cq" Apr 22 16:21:16.882702 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:16.882663 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l44cq" podUID="cc090485-3533-4014-9e56-2a28800c3d78" Apr 22 16:21:16.888692 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.888668 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-k7mlb" Apr 22 16:21:16.889921 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.889903 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:16.890719 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.890698 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 16:21:16.890827 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.890724 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-r28mx\"" Apr 22 16:21:16.890827 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.890740 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 16:21:16.891208 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.891186 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 16:21:16.894210 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.894190 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-vphs5" Apr 22 16:21:16.894805 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.894770 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 16:21:16.894928 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.894870 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 16:21:16.895002 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.894781 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 16:21:16.895184 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.895164 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-h4nm5\"" Apr 22 16:21:16.895184 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.895181 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 16:21:16.895331 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.895310 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 16:21:16.895383 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.895328 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 16:21:16.896312 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.896290 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 16:21:16.896407 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.896367 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"kube-system\"/\"default-dockercfg-x4zlt\"" Apr 22 16:21:16.896695 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.896663 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 16:21:16.896968 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.896949 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg" Apr 22 16:21:16.898522 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.898504 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:16.899018 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.898998 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 16:21:16.899018 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899009 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 16:21:16.899298 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899279 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-7g8r8\"" Apr 22 16:21:16.899395 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899287 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8a355c35-cd73-4888-9d7b-1841477e589c-ovnkube-script-lib\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:16.899395 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899331 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4098aef1-2fba-4928-ac90-0a2b8fdc8510-konnectivity-ca\") pod \"konnectivity-agent-vphs5\" (UID: \"4098aef1-2fba-4928-ac90-0a2b8fdc8510\") " pod="kube-system/konnectivity-agent-vphs5" Apr 22 16:21:16.899395 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899358 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/308b982f-cb30-4a62-92f1-e88ca12b210e-cnibin\") pod \"multus-additional-cni-plugins-s5rnp\" (UID: \"308b982f-cb30-4a62-92f1-e88ca12b210e\") " pod="openshift-multus/multus-additional-cni-plugins-s5rnp" Apr 22 16:21:16.899395 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899375 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-cnibin\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:16.899395 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899389 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-host-run-k8s-cni-cncf-io\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:16.899631 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899405 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-host-var-lib-kubelet\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:16.899631 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899418 
2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-host-run-multus-certs\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:16.899631 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899446 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c216f795-9679-4c21-86f2-d69c0900b7f2-host-slash\") pod \"iptables-alerter-k7mlb\" (UID: \"c216f795-9679-4c21-86f2-d69c0900b7f2\") " pod="openshift-network-operator/iptables-alerter-k7mlb" Apr 22 16:21:16.899631 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899459 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-host-cni-netd\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:16.899631 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899474 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/308b982f-cb30-4a62-92f1-e88ca12b210e-os-release\") pod \"multus-additional-cni-plugins-s5rnp\" (UID: \"308b982f-cb30-4a62-92f1-e88ca12b210e\") " pod="openshift-multus/multus-additional-cni-plugins-s5rnp" Apr 22 16:21:16.899631 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899488 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/308b982f-cb30-4a62-92f1-e88ca12b210e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-s5rnp\" (UID: 
\"308b982f-cb30-4a62-92f1-e88ca12b210e\") " pod="openshift-multus/multus-additional-cni-plugins-s5rnp" Apr 22 16:21:16.899631 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899502 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94rtw\" (UniqueName: \"kubernetes.io/projected/308b982f-cb30-4a62-92f1-e88ca12b210e-kube-api-access-94rtw\") pod \"multus-additional-cni-plugins-s5rnp\" (UID: \"308b982f-cb30-4a62-92f1-e88ca12b210e\") " pod="openshift-multus/multus-additional-cni-plugins-s5rnp" Apr 22 16:21:16.899631 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899542 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 16:21:16.899631 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899562 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-host-slash\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:16.899631 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899609 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-node-log\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:16.900091 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899645 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-glprs\" (UID: 
\"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:16.900091 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899671 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8a355c35-cd73-4888-9d7b-1841477e589c-ovnkube-config\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:16.900091 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899714 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-system-cni-dir\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:16.900091 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899742 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-multus-socket-dir-parent\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:16.900091 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899779 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-host-var-lib-cni-multus\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:16.900091 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899831 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-hostroot\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:16.900091 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899877 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-multus-conf-dir\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:16.900091 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899882 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-xzk9d" Apr 22 16:21:16.900091 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899903 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qtw2m\" (UniqueName: \"kubernetes.io/projected/c216f795-9679-4c21-86f2-d69c0900b7f2-kube-api-access-qtw2m\") pod \"iptables-alerter-k7mlb\" (UID: \"c216f795-9679-4c21-86f2-d69c0900b7f2\") " pod="openshift-network-operator/iptables-alerter-k7mlb" Apr 22 16:21:16.900091 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899929 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs\") pod \"network-metrics-daemon-zzttm\" (UID: \"01d73bcf-a30e-4dfb-ab2d-863123f999c7\") " pod="openshift-multus/network-metrics-daemon-zzttm" Apr 22 16:21:16.900091 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899971 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-host-run-netns\") pod \"ovnkube-node-glprs\" (UID: 
\"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:16.900091 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.899993 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-cni-binary-copy\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:16.900091 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.900017 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-etc-kubernetes\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:16.900091 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.900040 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c216f795-9679-4c21-86f2-d69c0900b7f2-iptables-alerter-script\") pod \"iptables-alerter-k7mlb\" (UID: \"c216f795-9679-4c21-86f2-d69c0900b7f2\") " pod="openshift-network-operator/iptables-alerter-k7mlb" Apr 22 16:21:16.900091 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.900081 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-var-lib-openvswitch\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:16.900766 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.900104 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-etc-openvswitch\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:16.900766 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.900128 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-log-socket\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:16.900766 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.900152 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-host-run-ovn-kubernetes\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:16.900766 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.900295 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-host-cni-bin\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:16.900766 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.900547 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-bqrwn\"" Apr 22 16:21:16.900766 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.900607 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 16:21:16.900766 ip-10-0-137-144 
kubenswrapper[2578]: I0422 16:21:16.900678 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 16:21:16.901113 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.900899 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/308b982f-cb30-4a62-92f1-e88ca12b210e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-s5rnp\" (UID: \"308b982f-cb30-4a62-92f1-e88ca12b210e\") " pod="openshift-multus/multus-additional-cni-plugins-s5rnp" Apr 22 16:21:16.901113 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.900961 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-multus-daemon-config\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:16.901113 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.900986 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99cfl\" (UniqueName: \"kubernetes.io/projected/cc090485-3533-4014-9e56-2a28800c3d78-kube-api-access-99cfl\") pod \"network-check-target-l44cq\" (UID: \"cc090485-3533-4014-9e56-2a28800c3d78\") " pod="openshift-network-diagnostics/network-check-target-l44cq" Apr 22 16:21:16.901113 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.901010 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-systemd-units\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:16.901113 ip-10-0-137-144 kubenswrapper[2578]: I0422 
16:21:16.901033 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8a355c35-cd73-4888-9d7b-1841477e589c-env-overrides\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:16.901113 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.901092 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttdz4\" (UniqueName: \"kubernetes.io/projected/8a355c35-cd73-4888-9d7b-1841477e589c-kube-api-access-ttdz4\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:16.901369 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.901118 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4098aef1-2fba-4928-ac90-0a2b8fdc8510-agent-certs\") pod \"konnectivity-agent-vphs5\" (UID: \"4098aef1-2fba-4928-ac90-0a2b8fdc8510\") " pod="kube-system/konnectivity-agent-vphs5" Apr 22 16:21:16.901369 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.901154 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/308b982f-cb30-4a62-92f1-e88ca12b210e-cni-binary-copy\") pod \"multus-additional-cni-plugins-s5rnp\" (UID: \"308b982f-cb30-4a62-92f1-e88ca12b210e\") " pod="openshift-multus/multus-additional-cni-plugins-s5rnp" Apr 22 16:21:16.901369 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.901179 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/308b982f-cb30-4a62-92f1-e88ca12b210e-whereabouts-flatfile-configmap\") pod 
\"multus-additional-cni-plugins-s5rnp\" (UID: \"308b982f-cb30-4a62-92f1-e88ca12b210e\") " pod="openshift-multus/multus-additional-cni-plugins-s5rnp" Apr 22 16:21:16.901369 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.901204 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwt9b\" (UniqueName: \"kubernetes.io/projected/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-kube-api-access-rwt9b\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:16.901369 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.901247 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-host-var-lib-cni-bin\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:16.901369 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.901280 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-host-kubelet\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:16.901369 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.901305 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-run-systemd\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:16.901369 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.901330 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-run-openvswitch\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:16.901369 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.901353 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-run-ovn\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:16.901818 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.901378 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/308b982f-cb30-4a62-92f1-e88ca12b210e-system-cni-dir\") pod \"multus-additional-cni-plugins-s5rnp\" (UID: \"308b982f-cb30-4a62-92f1-e88ca12b210e\") " pod="openshift-multus/multus-additional-cni-plugins-s5rnp" Apr 22 16:21:16.901818 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.901403 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-multus-cni-dir\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:16.901818 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.901428 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-os-release\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:16.901818 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.901450 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-host-run-netns\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:16.901818 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.901617 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdfg9\" (UniqueName: \"kubernetes.io/projected/01d73bcf-a30e-4dfb-ab2d-863123f999c7-kube-api-access-cdfg9\") pod \"network-metrics-daemon-zzttm\" (UID: \"01d73bcf-a30e-4dfb-ab2d-863123f999c7\") " pod="openshift-multus/network-metrics-daemon-zzttm" Apr 22 16:21:16.901818 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.901640 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8a355c35-cd73-4888-9d7b-1841477e589c-ovn-node-metrics-cert\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:16.902410 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.902036 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-d5dct\"" Apr 22 16:21:16.902410 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.902267 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 16:21:16.902410 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.902370 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 16:21:16.902521 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.902438 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 16:21:16.930838 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.930809 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 16:16:15 +0000 UTC" deadline="2028-02-03 19:48:20.136572886 +0000 UTC" Apr 22 16:21:16.930952 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.930857 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15651h27m3.205736388s" Apr 22 16:21:16.991517 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:16.991492 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 22 16:21:17.002656 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.002623 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-host-var-lib-cni-multus\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:17.002808 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.002675 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-hostroot\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:17.002808 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.002705 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-multus-conf-dir\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:17.002808 ip-10-0-137-144 kubenswrapper[2578]: I0422 
16:21:17.002724 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs\") pod \"network-metrics-daemon-zzttm\" (UID: \"01d73bcf-a30e-4dfb-ab2d-863123f999c7\") " pod="openshift-multus/network-metrics-daemon-zzttm" Apr 22 16:21:17.002808 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.002727 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-host-var-lib-cni-multus\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:17.002808 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.002733 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-hostroot\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:17.002808 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.002775 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-host-run-netns\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:17.002808 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.002790 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-multus-conf-dir\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:17.002808 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.002802 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-var-lib-openvswitch\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:17.003224 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.002835 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-host-run-netns\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:17.003224 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.002834 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-etc-openvswitch\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:17.003224 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.002874 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-var-lib-openvswitch\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:17.003224 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.002895 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-host-cni-bin\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:17.003224 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.002914 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-etc-openvswitch\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:17.003224 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.002932 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4098aef1-2fba-4928-ac90-0a2b8fdc8510-agent-certs\") pod \"konnectivity-agent-vphs5\" (UID: \"4098aef1-2fba-4928-ac90-0a2b8fdc8510\") " pod="kube-system/konnectivity-agent-vphs5" Apr 22 16:21:17.003224 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:17.002922 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 16:21:17.003224 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.002974 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-host-cni-bin\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:17.003224 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.002968 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-cni-binary-copy\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:17.003224 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:17.003050 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs podName:01d73bcf-a30e-4dfb-ab2d-863123f999c7 nodeName:}" failed. 
No retries permitted until 2026-04-22 16:21:17.502997264 +0000 UTC m=+3.039727969 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs") pod "network-metrics-daemon-zzttm" (UID: "01d73bcf-a30e-4dfb-ab2d-863123f999c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 16:21:17.003224 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003098 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-etc-kubernetes\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:17.003224 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003126 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-log-socket\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:17.003224 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003150 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-host-run-ovn-kubernetes\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:17.003224 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003181 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-etc-kubernetes\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 
16:21:17.003224 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003202 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-host-run-ovn-kubernetes\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:17.003224 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003202 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-log-socket\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:17.004006 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003245 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-etc-sysctl-d\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.004006 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003277 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-etc-systemd\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.004006 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003311 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-lib-modules\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " 
pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.004006 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003341 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/308b982f-cb30-4a62-92f1-e88ca12b210e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-s5rnp\" (UID: \"308b982f-cb30-4a62-92f1-e88ca12b210e\") " pod="openshift-multus/multus-additional-cni-plugins-s5rnp" Apr 22 16:21:17.004006 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003338 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 22 16:21:17.004006 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003386 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-multus-daemon-config\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:17.004006 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003418 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99cfl\" (UniqueName: \"kubernetes.io/projected/cc090485-3533-4014-9e56-2a28800c3d78-kube-api-access-99cfl\") pod \"network-check-target-l44cq\" (UID: \"cc090485-3533-4014-9e56-2a28800c3d78\") " pod="openshift-network-diagnostics/network-check-target-l44cq" Apr 22 16:21:17.004006 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003443 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-systemd-units\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:17.004006 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003470 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3e6d4c51-d843-4eca-9406-7639d52380a0-host\") pod \"node-ca-xzk9d\" (UID: \"3e6d4c51-d843-4eca-9406-7639d52380a0\") " pod="openshift-image-registry/node-ca-xzk9d" Apr 22 16:21:17.004006 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003516 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/31675dd5-4b14-424a-bc7e-27f7fe4405e5-tmp\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.004006 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003562 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/308b982f-cb30-4a62-92f1-e88ca12b210e-cni-binary-copy\") pod \"multus-additional-cni-plugins-s5rnp\" (UID: \"308b982f-cb30-4a62-92f1-e88ca12b210e\") " pod="openshift-multus/multus-additional-cni-plugins-s5rnp" Apr 22 16:21:17.004006 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003586 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/264ba0e2-a7b9-4b2d-9864-8ba2f4727492-device-dir\") pod \"aws-ebs-csi-driver-node-krctg\" (UID: \"264ba0e2-a7b9-4b2d-9864-8ba2f4727492\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg" Apr 22 16:21:17.004006 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003608 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-run\") pod 
\"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.004006 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003640 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-cni-binary-copy\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:17.004006 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003643 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-var-lib-kubelet\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.004006 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003735 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-host-kubelet\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:17.004006 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003785 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-run-openvswitch\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:17.004671 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003808 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-run-ovn\") pod 
\"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:17.004671 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003833 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/308b982f-cb30-4a62-92f1-e88ca12b210e-system-cni-dir\") pod \"multus-additional-cni-plugins-s5rnp\" (UID: \"308b982f-cb30-4a62-92f1-e88ca12b210e\") " pod="openshift-multus/multus-additional-cni-plugins-s5rnp" Apr 22 16:21:17.004671 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003907 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-os-release\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:17.004671 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003930 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-host-run-netns\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms" Apr 22 16:21:17.004671 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003961 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/308b982f-cb30-4a62-92f1-e88ca12b210e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-s5rnp\" (UID: \"308b982f-cb30-4a62-92f1-e88ca12b210e\") " pod="openshift-multus/multus-additional-cni-plugins-s5rnp" Apr 22 16:21:17.004671 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.004002 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/308b982f-cb30-4a62-92f1-e88ca12b210e-system-cni-dir\") pod \"multus-additional-cni-plugins-s5rnp\" (UID: \"308b982f-cb30-4a62-92f1-e88ca12b210e\") " pod="openshift-multus/multus-additional-cni-plugins-s5rnp"
Apr 22 16:21:17.004671 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.003964 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cdfg9\" (UniqueName: \"kubernetes.io/projected/01d73bcf-a30e-4dfb-ab2d-863123f999c7-kube-api-access-cdfg9\") pod \"network-metrics-daemon-zzttm\" (UID: \"01d73bcf-a30e-4dfb-ab2d-863123f999c7\") " pod="openshift-multus/network-metrics-daemon-zzttm"
Apr 22 16:21:17.004671 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.004230 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8a355c35-cd73-4888-9d7b-1841477e589c-ovnkube-script-lib\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs"
Apr 22 16:21:17.004671 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.004287 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4098aef1-2fba-4928-ac90-0a2b8fdc8510-konnectivity-ca\") pod \"konnectivity-agent-vphs5\" (UID: \"4098aef1-2fba-4928-ac90-0a2b8fdc8510\") " pod="kube-system/konnectivity-agent-vphs5"
Apr 22 16:21:17.004671 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.004317 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/31675dd5-4b14-424a-bc7e-27f7fe4405e5-etc-tuned\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9"
Apr 22 16:21:17.004671 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.004346 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/308b982f-cb30-4a62-92f1-e88ca12b210e-cnibin\") pod \"multus-additional-cni-plugins-s5rnp\" (UID: \"308b982f-cb30-4a62-92f1-e88ca12b210e\") " pod="openshift-multus/multus-additional-cni-plugins-s5rnp"
Apr 22 16:21:17.004671 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.004399 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/308b982f-cb30-4a62-92f1-e88ca12b210e-cnibin\") pod \"multus-additional-cni-plugins-s5rnp\" (UID: \"308b982f-cb30-4a62-92f1-e88ca12b210e\") " pod="openshift-multus/multus-additional-cni-plugins-s5rnp"
Apr 22 16:21:17.005122 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.004898 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-multus-daemon-config\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms"
Apr 22 16:21:17.005122 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.005035 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/308b982f-cb30-4a62-92f1-e88ca12b210e-cni-binary-copy\") pod \"multus-additional-cni-plugins-s5rnp\" (UID: \"308b982f-cb30-4a62-92f1-e88ca12b210e\") " pod="openshift-multus/multus-additional-cni-plugins-s5rnp"
Apr 22 16:21:17.005216 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.005173 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/4098aef1-2fba-4928-ac90-0a2b8fdc8510-konnectivity-ca\") pod \"konnectivity-agent-vphs5\" (UID: \"4098aef1-2fba-4928-ac90-0a2b8fdc8510\") " pod="kube-system/konnectivity-agent-vphs5"
Apr 22 16:21:17.005266 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.005233 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c216f795-9679-4c21-86f2-d69c0900b7f2-host-slash\") pod \"iptables-alerter-k7mlb\" (UID: \"c216f795-9679-4c21-86f2-d69c0900b7f2\") " pod="openshift-network-operator/iptables-alerter-k7mlb"
Apr 22 16:21:17.005328 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.005312 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-run-openvswitch\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs"
Apr 22 16:21:17.005375 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.005364 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-run-ovn\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs"
Apr 22 16:21:17.005441 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.005427 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-os-release\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms"
Apr 22 16:21:17.005485 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.005471 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-host-run-netns\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms"
Apr 22 16:21:17.005524 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.005515 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-host-kubelet\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs"
Apr 22 16:21:17.006091 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.006072 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8a355c35-cd73-4888-9d7b-1841477e589c-ovnkube-script-lib\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs"
Apr 22 16:21:17.006163 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.006149 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-systemd-units\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs"
Apr 22 16:21:17.006210 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.006196 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/264ba0e2-a7b9-4b2d-9864-8ba2f4727492-socket-dir\") pod \"aws-ebs-csi-driver-node-krctg\" (UID: \"264ba0e2-a7b9-4b2d-9864-8ba2f4727492\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg"
Apr 22 16:21:17.006253 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.006240 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-etc-kubernetes\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9"
Apr 22 16:21:17.006287 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.006278 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8a355c35-cd73-4888-9d7b-1841477e589c-env-overrides\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs"
Apr 22 16:21:17.006325 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.006311 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/308b982f-cb30-4a62-92f1-e88ca12b210e-os-release\") pod \"multus-additional-cni-plugins-s5rnp\" (UID: \"308b982f-cb30-4a62-92f1-e88ca12b210e\") " pod="openshift-multus/multus-additional-cni-plugins-s5rnp"
Apr 22 16:21:17.006405 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.006390 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/308b982f-cb30-4a62-92f1-e88ca12b210e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-s5rnp\" (UID: \"308b982f-cb30-4a62-92f1-e88ca12b210e\") " pod="openshift-multus/multus-additional-cni-plugins-s5rnp"
Apr 22 16:21:17.006437 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.006428 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-host-slash\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs"
Apr 22 16:21:17.006478 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.006460 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-node-log\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs"
Apr 22 16:21:17.006511 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.006489 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs"
Apr 22 16:21:17.006543 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.006524 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/264ba0e2-a7b9-4b2d-9864-8ba2f4727492-etc-selinux\") pod \"aws-ebs-csi-driver-node-krctg\" (UID: \"264ba0e2-a7b9-4b2d-9864-8ba2f4727492\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg"
Apr 22 16:21:17.006573 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.006561 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/264ba0e2-a7b9-4b2d-9864-8ba2f4727492-sys-fs\") pod \"aws-ebs-csi-driver-node-krctg\" (UID: \"264ba0e2-a7b9-4b2d-9864-8ba2f4727492\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg"
Apr 22 16:21:17.006665 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.006651 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-system-cni-dir\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms"
Apr 22 16:21:17.006700 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.006686 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-multus-socket-dir-parent\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms"
Apr 22 16:21:17.006732 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.006718 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qtw2m\" (UniqueName: \"kubernetes.io/projected/c216f795-9679-4c21-86f2-d69c0900b7f2-kube-api-access-qtw2m\") pod \"iptables-alerter-k7mlb\" (UID: \"c216f795-9679-4c21-86f2-d69c0900b7f2\") " pod="openshift-network-operator/iptables-alerter-k7mlb"
Apr 22 16:21:17.006763 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.006751 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-host\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9"
Apr 22 16:21:17.006799 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.006778 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c216f795-9679-4c21-86f2-d69c0900b7f2-iptables-alerter-script\") pod \"iptables-alerter-k7mlb\" (UID: \"c216f795-9679-4c21-86f2-d69c0900b7f2\") " pod="openshift-network-operator/iptables-alerter-k7mlb"
Apr 22 16:21:17.006838 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.006814 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3e6d4c51-d843-4eca-9406-7639d52380a0-serviceca\") pod \"node-ca-xzk9d\" (UID: \"3e6d4c51-d843-4eca-9406-7639d52380a0\") " pod="openshift-image-registry/node-ca-xzk9d"
Apr 22 16:21:17.006893 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.006863 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-etc-modprobe-d\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9"
Apr 22 16:21:17.006934 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.006920 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttdz4\" (UniqueName: \"kubernetes.io/projected/8a355c35-cd73-4888-9d7b-1841477e589c-kube-api-access-ttdz4\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs"
Apr 22 16:21:17.006972 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.006956 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/264ba0e2-a7b9-4b2d-9864-8ba2f4727492-registration-dir\") pod \"aws-ebs-csi-driver-node-krctg\" (UID: \"264ba0e2-a7b9-4b2d-9864-8ba2f4727492\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg"
Apr 22 16:21:17.007020 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.006999 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/308b982f-cb30-4a62-92f1-e88ca12b210e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-s5rnp\" (UID: \"308b982f-cb30-4a62-92f1-e88ca12b210e\") " pod="openshift-multus/multus-additional-cni-plugins-s5rnp"
Apr 22 16:21:17.007055 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.007032 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rwt9b\" (UniqueName: \"kubernetes.io/projected/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-kube-api-access-rwt9b\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms"
Apr 22 16:21:17.007083 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.007057 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-host-var-lib-cni-bin\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms"
Apr 22 16:21:17.007228 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.007212 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-run-systemd\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs"
Apr 22 16:21:17.007287 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.007247 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpznk\" (UniqueName: \"kubernetes.io/projected/3e6d4c51-d843-4eca-9406-7639d52380a0-kube-api-access-xpznk\") pod \"node-ca-xzk9d\" (UID: \"3e6d4c51-d843-4eca-9406-7639d52380a0\") " pod="openshift-image-registry/node-ca-xzk9d"
Apr 22 16:21:17.007287 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.007281 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-multus-cni-dir\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms"
Apr 22 16:21:17.007387 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.007301 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-system-cni-dir\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms"
Apr 22 16:21:17.007387 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.007313 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8a355c35-cd73-4888-9d7b-1841477e589c-ovn-node-metrics-cert\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs"
Apr 22 16:21:17.007387 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.007368 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c216f795-9679-4c21-86f2-d69c0900b7f2-host-slash\") pod \"iptables-alerter-k7mlb\" (UID: \"c216f795-9679-4c21-86f2-d69c0900b7f2\") " pod="openshift-network-operator/iptables-alerter-k7mlb"
Apr 22 16:21:17.007750 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.007726 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-node-log\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs"
Apr 22 16:21:17.007830 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.007779 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs"
Apr 22 16:21:17.007830 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.007802 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-multus-socket-dir-parent\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms"
Apr 22 16:21:17.007958 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.007864 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/264ba0e2-a7b9-4b2d-9864-8ba2f4727492-kubelet-dir\") pod \"aws-ebs-csi-driver-node-krctg\" (UID: \"264ba0e2-a7b9-4b2d-9864-8ba2f4727492\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg"
Apr 22 16:21:17.007958 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.007901 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-etc-sysctl-conf\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9"
Apr 22 16:21:17.007958 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.007931 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-sys\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9"
Apr 22 16:21:17.008400 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.008366 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/c216f795-9679-4c21-86f2-d69c0900b7f2-iptables-alerter-script\") pod \"iptables-alerter-k7mlb\" (UID: \"c216f795-9679-4c21-86f2-d69c0900b7f2\") " pod="openshift-network-operator/iptables-alerter-k7mlb"
Apr 22 16:21:17.008488 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.008420 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-cnibin\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms"
Apr 22 16:21:17.008488 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.008477 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-host-run-k8s-cni-cncf-io\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms"
Apr 22 16:21:17.008600 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.008516 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-host-var-lib-kubelet\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms"
Apr 22 16:21:17.008655 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.008604 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/308b982f-cb30-4a62-92f1-e88ca12b210e-os-release\") pod \"multus-additional-cni-plugins-s5rnp\" (UID: \"308b982f-cb30-4a62-92f1-e88ca12b210e\") " pod="openshift-multus/multus-additional-cni-plugins-s5rnp"
Apr 22 16:21:17.008711 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.008689 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-host-var-lib-cni-bin\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms"
Apr 22 16:21:17.008875 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.008833 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-host-run-multus-certs\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms"
Apr 22 16:21:17.008989 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.008897 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-host-cni-netd\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs"
Apr 22 16:21:17.008989 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.008933 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-host-var-lib-kubelet\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms"
Apr 22 16:21:17.008989 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.008939 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhcjq\" (UniqueName: \"kubernetes.io/projected/264ba0e2-a7b9-4b2d-9864-8ba2f4727492-kube-api-access-nhcjq\") pod \"aws-ebs-csi-driver-node-krctg\" (UID: \"264ba0e2-a7b9-4b2d-9864-8ba2f4727492\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg"
Apr 22 16:21:17.008989 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.008975 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-php52\" (UniqueName: \"kubernetes.io/projected/31675dd5-4b14-424a-bc7e-27f7fe4405e5-kube-api-access-php52\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9"
Apr 22 16:21:17.009173 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.009053 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-94rtw\" (UniqueName: \"kubernetes.io/projected/308b982f-cb30-4a62-92f1-e88ca12b210e-kube-api-access-94rtw\") pod \"multus-additional-cni-plugins-s5rnp\" (UID: \"308b982f-cb30-4a62-92f1-e88ca12b210e\") " pod="openshift-multus/multus-additional-cni-plugins-s5rnp"
Apr 22 16:21:17.009173 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.009127 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-run-systemd\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs"
Apr 22 16:21:17.009260 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.009171 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-host-run-k8s-cni-cncf-io\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms"
Apr 22 16:21:17.009260 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.009176 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-multus-cni-dir\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms"
Apr 22 16:21:17.009260 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.009126 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-cnibin\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms"
Apr 22 16:21:17.009260 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.009229 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-host-run-multus-certs\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms"
Apr 22 16:21:17.009435 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.009269 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-host-cni-netd\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs"
Apr 22 16:21:17.009435 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.009362 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8a355c35-cd73-4888-9d7b-1841477e589c-host-slash\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs"
Apr 22 16:21:17.009435 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.007734 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8a355c35-cd73-4888-9d7b-1841477e589c-env-overrides\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs"
Apr 22 16:21:17.009749 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.009652 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8a355c35-cd73-4888-9d7b-1841477e589c-ovnkube-config\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs"
Apr 22 16:21:17.009749 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.009666 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/4098aef1-2fba-4928-ac90-0a2b8fdc8510-agent-certs\") pod \"konnectivity-agent-vphs5\" (UID: \"4098aef1-2fba-4928-ac90-0a2b8fdc8510\") " pod="kube-system/konnectivity-agent-vphs5"
Apr 22 16:21:17.009749 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.009727 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-etc-sysconfig\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9"
Apr 22 16:21:17.009949 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.009893 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/308b982f-cb30-4a62-92f1-e88ca12b210e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-s5rnp\" (UID: \"308b982f-cb30-4a62-92f1-e88ca12b210e\") " pod="openshift-multus/multus-additional-cni-plugins-s5rnp"
Apr 22 16:21:17.010812 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.010606 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8a355c35-cd73-4888-9d7b-1841477e589c-ovn-node-metrics-cert\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs"
Apr 22 16:21:17.010969 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:17.010917 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 16:21:17.010969 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:17.010948 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 16:21:17.010969 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:17.010961 2578 projected.go:194] Error preparing data for projected volume kube-api-access-99cfl for pod openshift-network-diagnostics/network-check-target-l44cq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 16:21:17.011161 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:17.011076 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cc090485-3533-4014-9e56-2a28800c3d78-kube-api-access-99cfl podName:cc090485-3533-4014-9e56-2a28800c3d78 nodeName:}" failed. No retries permitted until 2026-04-22 16:21:17.511025716 +0000 UTC m=+3.047756418 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-99cfl" (UniqueName: "kubernetes.io/projected/cc090485-3533-4014-9e56-2a28800c3d78-kube-api-access-99cfl") pod "network-check-target-l44cq" (UID: "cc090485-3533-4014-9e56-2a28800c3d78") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 16:21:17.011161 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.011028 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8a355c35-cd73-4888-9d7b-1841477e589c-ovnkube-config\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs"
Apr 22 16:21:17.011368 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.011344 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/308b982f-cb30-4a62-92f1-e88ca12b210e-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-s5rnp\" (UID: \"308b982f-cb30-4a62-92f1-e88ca12b210e\") " pod="openshift-multus/multus-additional-cni-plugins-s5rnp"
Apr 22 16:21:17.013285 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.013265 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdfg9\" (UniqueName: \"kubernetes.io/projected/01d73bcf-a30e-4dfb-ab2d-863123f999c7-kube-api-access-cdfg9\") pod \"network-metrics-daemon-zzttm\" (UID: \"01d73bcf-a30e-4dfb-ab2d-863123f999c7\") " pod="openshift-multus/network-metrics-daemon-zzttm"
Apr 22 16:21:17.017162 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.017136 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-94rtw\" (UniqueName: \"kubernetes.io/projected/308b982f-cb30-4a62-92f1-e88ca12b210e-kube-api-access-94rtw\") pod \"multus-additional-cni-plugins-s5rnp\" (UID: \"308b982f-cb30-4a62-92f1-e88ca12b210e\") " pod="openshift-multus/multus-additional-cni-plugins-s5rnp"
Apr 22 16:21:17.017363 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.017342 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qtw2m\" (UniqueName: \"kubernetes.io/projected/c216f795-9679-4c21-86f2-d69c0900b7f2-kube-api-access-qtw2m\") pod \"iptables-alerter-k7mlb\" (UID: \"c216f795-9679-4c21-86f2-d69c0900b7f2\") " pod="openshift-network-operator/iptables-alerter-k7mlb"
Apr 22 16:21:17.018565 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.018542 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwt9b\" (UniqueName: \"kubernetes.io/projected/7d4f23e4-772f-4bf2-86a3-25eb1a3cc274-kube-api-access-rwt9b\") pod \"multus-jjbms\" (UID: \"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274\") " pod="openshift-multus/multus-jjbms"
Apr 22 16:21:17.018665 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.018584 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttdz4\" (UniqueName: \"kubernetes.io/projected/8a355c35-cd73-4888-9d7b-1841477e589c-kube-api-access-ttdz4\") pod \"ovnkube-node-glprs\" (UID: \"8a355c35-cd73-4888-9d7b-1841477e589c\") " pod="openshift-ovn-kubernetes/ovnkube-node-glprs"
Apr 22 16:21:17.110896 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.110780 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/31675dd5-4b14-424a-bc7e-27f7fe4405e5-etc-tuned\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9"
Apr 22 16:21:17.110896 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.110827 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/264ba0e2-a7b9-4b2d-9864-8ba2f4727492-socket-dir\") pod \"aws-ebs-csi-driver-node-krctg\" (UID: \"264ba0e2-a7b9-4b2d-9864-8ba2f4727492\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg"
Apr 22 16:21:17.110896 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.110868 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-etc-kubernetes\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9"
Apr 22 16:21:17.110896 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.110895 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/264ba0e2-a7b9-4b2d-9864-8ba2f4727492-etc-selinux\") pod \"aws-ebs-csi-driver-node-krctg\" (UID: \"264ba0e2-a7b9-4b2d-9864-8ba2f4727492\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg"
Apr 22 16:21:17.111195 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.110958 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/264ba0e2-a7b9-4b2d-9864-8ba2f4727492-etc-selinux\") pod \"aws-ebs-csi-driver-node-krctg\" (UID: \"264ba0e2-a7b9-4b2d-9864-8ba2f4727492\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg"
Apr 22 16:21:17.111195 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.110995 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-etc-kubernetes\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9"
Apr 22 16:21:17.111195 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111044 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/264ba0e2-a7b9-4b2d-9864-8ba2f4727492-sys-fs\") pod \"aws-ebs-csi-driver-node-krctg\" (UID: \"264ba0e2-a7b9-4b2d-9864-8ba2f4727492\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg"
Apr 22 16:21:17.111195 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111076 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-host\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9"
Apr 22 16:21:17.111195 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111101 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3e6d4c51-d843-4eca-9406-7639d52380a0-serviceca\") pod \"node-ca-xzk9d\" (UID: \"3e6d4c51-d843-4eca-9406-7639d52380a0\") " pod="openshift-image-registry/node-ca-xzk9d"
Apr 22 16:21:17.111195 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111123 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/264ba0e2-a7b9-4b2d-9864-8ba2f4727492-sys-fs\") pod \"aws-ebs-csi-driver-node-krctg\" (UID: \"264ba0e2-a7b9-4b2d-9864-8ba2f4727492\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg"
Apr 22 16:21:17.111195 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111120 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/264ba0e2-a7b9-4b2d-9864-8ba2f4727492-socket-dir\") pod \"aws-ebs-csi-driver-node-krctg\" (UID: \"264ba0e2-a7b9-4b2d-9864-8ba2f4727492\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg"
Apr 22 16:21:17.111195 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111126 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-etc-modprobe-d\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9"
Apr 22 16:21:17.111195 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111171 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-host\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9"
Apr 22 16:21:17.111195 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111183 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/264ba0e2-a7b9-4b2d-9864-8ba2f4727492-registration-dir\") pod \"aws-ebs-csi-driver-node-krctg\" (UID: \"264ba0e2-a7b9-4b2d-9864-8ba2f4727492\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg"
Apr 22 16:21:17.111628 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111226 2578
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/264ba0e2-a7b9-4b2d-9864-8ba2f4727492-registration-dir\") pod \"aws-ebs-csi-driver-node-krctg\" (UID: \"264ba0e2-a7b9-4b2d-9864-8ba2f4727492\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg" Apr 22 16:21:17.111628 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111229 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xpznk\" (UniqueName: \"kubernetes.io/projected/3e6d4c51-d843-4eca-9406-7639d52380a0-kube-api-access-xpznk\") pod \"node-ca-xzk9d\" (UID: \"3e6d4c51-d843-4eca-9406-7639d52380a0\") " pod="openshift-image-registry/node-ca-xzk9d" Apr 22 16:21:17.111628 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111260 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/264ba0e2-a7b9-4b2d-9864-8ba2f4727492-kubelet-dir\") pod \"aws-ebs-csi-driver-node-krctg\" (UID: \"264ba0e2-a7b9-4b2d-9864-8ba2f4727492\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg" Apr 22 16:21:17.111628 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111285 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-etc-sysctl-conf\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.111628 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111312 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-sys\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.111628 ip-10-0-137-144 
kubenswrapper[2578]: I0422 16:21:17.111346 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhcjq\" (UniqueName: \"kubernetes.io/projected/264ba0e2-a7b9-4b2d-9864-8ba2f4727492-kube-api-access-nhcjq\") pod \"aws-ebs-csi-driver-node-krctg\" (UID: \"264ba0e2-a7b9-4b2d-9864-8ba2f4727492\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg" Apr 22 16:21:17.111628 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111355 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-etc-modprobe-d\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.111628 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111373 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-php52\" (UniqueName: \"kubernetes.io/projected/31675dd5-4b14-424a-bc7e-27f7fe4405e5-kube-api-access-php52\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.111628 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111499 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-etc-sysctl-conf\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.111628 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111529 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-etc-sysconfig\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " 
pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.111628 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111536 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-sys\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.111628 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111546 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3e6d4c51-d843-4eca-9406-7639d52380a0-serviceca\") pod \"node-ca-xzk9d\" (UID: \"3e6d4c51-d843-4eca-9406-7639d52380a0\") " pod="openshift-image-registry/node-ca-xzk9d" Apr 22 16:21:17.111628 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111585 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-etc-sysconfig\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.111628 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111596 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-etc-sysctl-d\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.111628 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111632 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/264ba0e2-a7b9-4b2d-9864-8ba2f4727492-kubelet-dir\") pod \"aws-ebs-csi-driver-node-krctg\" (UID: \"264ba0e2-a7b9-4b2d-9864-8ba2f4727492\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg" Apr 22 16:21:17.112361 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111695 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-etc-systemd\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.112361 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111721 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-lib-modules\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.112361 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111819 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-lib-modules\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.112361 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111818 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-etc-systemd\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.112361 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111855 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-etc-sysctl-d\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " 
pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.112361 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111876 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3e6d4c51-d843-4eca-9406-7639d52380a0-host\") pod \"node-ca-xzk9d\" (UID: \"3e6d4c51-d843-4eca-9406-7639d52380a0\") " pod="openshift-image-registry/node-ca-xzk9d" Apr 22 16:21:17.112361 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111905 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/31675dd5-4b14-424a-bc7e-27f7fe4405e5-tmp\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.112361 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111932 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/264ba0e2-a7b9-4b2d-9864-8ba2f4727492-device-dir\") pod \"aws-ebs-csi-driver-node-krctg\" (UID: \"264ba0e2-a7b9-4b2d-9864-8ba2f4727492\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg" Apr 22 16:21:17.112361 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111938 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3e6d4c51-d843-4eca-9406-7639d52380a0-host\") pod \"node-ca-xzk9d\" (UID: \"3e6d4c51-d843-4eca-9406-7639d52380a0\") " pod="openshift-image-registry/node-ca-xzk9d" Apr 22 16:21:17.112361 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111956 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-run\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 
16:21:17.112361 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111988 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/264ba0e2-a7b9-4b2d-9864-8ba2f4727492-device-dir\") pod \"aws-ebs-csi-driver-node-krctg\" (UID: \"264ba0e2-a7b9-4b2d-9864-8ba2f4727492\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg" Apr 22 16:21:17.112361 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.111981 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-var-lib-kubelet\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.112361 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.112019 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-run\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.112361 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.112070 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/31675dd5-4b14-424a-bc7e-27f7fe4405e5-var-lib-kubelet\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.113435 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.113412 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/31675dd5-4b14-424a-bc7e-27f7fe4405e5-etc-tuned\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.114133 
ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.114111 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/31675dd5-4b14-424a-bc7e-27f7fe4405e5-tmp\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.119333 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.119307 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-php52\" (UniqueName: \"kubernetes.io/projected/31675dd5-4b14-424a-bc7e-27f7fe4405e5-kube-api-access-php52\") pod \"tuned-kjcg9\" (UID: \"31675dd5-4b14-424a-bc7e-27f7fe4405e5\") " pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.119697 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.119674 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpznk\" (UniqueName: \"kubernetes.io/projected/3e6d4c51-d843-4eca-9406-7639d52380a0-kube-api-access-xpznk\") pod \"node-ca-xzk9d\" (UID: \"3e6d4c51-d843-4eca-9406-7639d52380a0\") " pod="openshift-image-registry/node-ca-xzk9d" Apr 22 16:21:17.120304 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.120283 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhcjq\" (UniqueName: \"kubernetes.io/projected/264ba0e2-a7b9-4b2d-9864-8ba2f4727492-kube-api-access-nhcjq\") pod \"aws-ebs-csi-driver-node-krctg\" (UID: \"264ba0e2-a7b9-4b2d-9864-8ba2f4727492\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg" Apr 22 16:21:17.197371 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.197337 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-s5rnp" Apr 22 16:21:17.205224 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.205196 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-jjbms" Apr 22 16:21:17.215011 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.214908 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-k7mlb" Apr 22 16:21:17.220694 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.220669 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:17.229343 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.229318 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-vphs5" Apr 22 16:21:17.234933 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.234911 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg" Apr 22 16:21:17.241608 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.241589 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" Apr 22 16:21:17.247165 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.247143 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-xzk9d" Apr 22 16:21:17.516117 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.516027 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs\") pod \"network-metrics-daemon-zzttm\" (UID: \"01d73bcf-a30e-4dfb-ab2d-863123f999c7\") " pod="openshift-multus/network-metrics-daemon-zzttm" Apr 22 16:21:17.516117 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.516083 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99cfl\" (UniqueName: \"kubernetes.io/projected/cc090485-3533-4014-9e56-2a28800c3d78-kube-api-access-99cfl\") pod \"network-check-target-l44cq\" (UID: \"cc090485-3533-4014-9e56-2a28800c3d78\") " pod="openshift-network-diagnostics/network-check-target-l44cq" Apr 22 16:21:17.516332 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:17.516182 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 16:21:17.516332 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:17.516204 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 16:21:17.516332 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:17.516221 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 16:21:17.516332 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:17.516235 2578 projected.go:194] Error preparing data for projected volume kube-api-access-99cfl for pod openshift-network-diagnostics/network-check-target-l44cq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 16:21:17.516332 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:17.516255 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs podName:01d73bcf-a30e-4dfb-ab2d-863123f999c7 nodeName:}" failed. No retries permitted until 2026-04-22 16:21:18.516239659 +0000 UTC m=+4.052970352 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs") pod "network-metrics-daemon-zzttm" (UID: "01d73bcf-a30e-4dfb-ab2d-863123f999c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 16:21:17.516332 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:17.516281 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cc090485-3533-4014-9e56-2a28800c3d78-kube-api-access-99cfl podName:cc090485-3533-4014-9e56-2a28800c3d78 nodeName:}" failed. No retries permitted until 2026-04-22 16:21:18.516267639 +0000 UTC m=+4.052998342 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-99cfl" (UniqueName: "kubernetes.io/projected/cc090485-3533-4014-9e56-2a28800c3d78-kube-api-access-99cfl") pod "network-check-target-l44cq" (UID: "cc090485-3533-4014-9e56-2a28800c3d78") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 16:21:17.878537 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:17.878387 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a355c35_cd73_4888_9d7b_1841477e589c.slice/crio-0a51979087ba9c99db1107c38da03bdf1e4b9b7fb8fa37c72c3f51c32febb169 WatchSource:0}: Error finding container 0a51979087ba9c99db1107c38da03bdf1e4b9b7fb8fa37c72c3f51c32febb169: Status 404 returned error can't find the container with id 0a51979087ba9c99db1107c38da03bdf1e4b9b7fb8fa37c72c3f51c32febb169 Apr 22 16:21:17.879422 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:17.879397 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31675dd5_4b14_424a_bc7e_27f7fe4405e5.slice/crio-cd837fe6c358a1d2639f925703232ff4fdab03c74478ca563e483f487505b38a WatchSource:0}: Error finding container cd837fe6c358a1d2639f925703232ff4fdab03c74478ca563e483f487505b38a: Status 404 returned error can't find the container with id cd837fe6c358a1d2639f925703232ff4fdab03c74478ca563e483f487505b38a Apr 22 16:21:17.880311 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:17.880097 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod264ba0e2_a7b9_4b2d_9864_8ba2f4727492.slice/crio-e3efc69983ec3eca37edb5ab30797bc782b8c8e25a8988da137d33127d6b9668 WatchSource:0}: Error finding container e3efc69983ec3eca37edb5ab30797bc782b8c8e25a8988da137d33127d6b9668: Status 404 returned error can't find the 
container with id e3efc69983ec3eca37edb5ab30797bc782b8c8e25a8988da137d33127d6b9668 Apr 22 16:21:17.883679 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:17.883653 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d4f23e4_772f_4bf2_86a3_25eb1a3cc274.slice/crio-cd4f060af6fe0e4b354c32f5bfe9964a694f1b694347ba605a06eb65cece9fdf WatchSource:0}: Error finding container cd4f060af6fe0e4b354c32f5bfe9964a694f1b694347ba605a06eb65cece9fdf: Status 404 returned error can't find the container with id cd4f060af6fe0e4b354c32f5bfe9964a694f1b694347ba605a06eb65cece9fdf Apr 22 16:21:17.884402 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:17.884380 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc216f795_9679_4c21_86f2_d69c0900b7f2.slice/crio-2d0bb8b4630523eeaaa9bfe88926d221f79576e5df072dec4b5bed0af7ece4ae WatchSource:0}: Error finding container 2d0bb8b4630523eeaaa9bfe88926d221f79576e5df072dec4b5bed0af7ece4ae: Status 404 returned error can't find the container with id 2d0bb8b4630523eeaaa9bfe88926d221f79576e5df072dec4b5bed0af7ece4ae Apr 22 16:21:17.885648 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:17.885617 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4098aef1_2fba_4928_ac90_0a2b8fdc8510.slice/crio-78c671d710c359c21ce9fa018a3edade86d5e230d139c0c547557539bbcb5457 WatchSource:0}: Error finding container 78c671d710c359c21ce9fa018a3edade86d5e230d139c0c547557539bbcb5457: Status 404 returned error can't find the container with id 78c671d710c359c21ce9fa018a3edade86d5e230d139c0c547557539bbcb5457 Apr 22 16:21:17.887958 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:17.887630 2578 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod308b982f_cb30_4a62_92f1_e88ca12b210e.slice/crio-cb1d927facf5ae887ac8e2a175a0da816a66be0ed556abb8c9892b2efe6b6f0a WatchSource:0}: Error finding container cb1d927facf5ae887ac8e2a175a0da816a66be0ed556abb8c9892b2efe6b6f0a: Status 404 returned error can't find the container with id cb1d927facf5ae887ac8e2a175a0da816a66be0ed556abb8c9892b2efe6b6f0a Apr 22 16:21:17.888651 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:17.888632 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e6d4c51_d843_4eca_9406_7639d52380a0.slice/crio-f1d53bb73914baefc0f711d30ab460097ae5274a88d9ea7f2e8c9cac3f21bc7d WatchSource:0}: Error finding container f1d53bb73914baefc0f711d30ab460097ae5274a88d9ea7f2e8c9cac3f21bc7d: Status 404 returned error can't find the container with id f1d53bb73914baefc0f711d30ab460097ae5274a88d9ea7f2e8c9cac3f21bc7d Apr 22 16:21:17.931738 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.931703 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 16:16:15 +0000 UTC" deadline="2028-01-11 02:59:02.378250296 +0000 UTC" Apr 22 16:21:17.931738 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:17.931731 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15082h37m44.446522325s" Apr 22 16:21:18.022336 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:18.022287 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-144.ec2.internal" event={"ID":"ec42b4252b8124bf29765450697dabd4","Type":"ContainerStarted","Data":"73194c9b447131cc3813ae380848741ae4e98d7618eaf1808289cadc114175d0"} Apr 22 16:21:18.023337 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:18.023306 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-image-registry/node-ca-xzk9d" event={"ID":"3e6d4c51-d843-4eca-9406-7639d52380a0","Type":"ContainerStarted","Data":"f1d53bb73914baefc0f711d30ab460097ae5274a88d9ea7f2e8c9cac3f21bc7d"} Apr 22 16:21:18.024963 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:18.024936 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s5rnp" event={"ID":"308b982f-cb30-4a62-92f1-e88ca12b210e","Type":"ContainerStarted","Data":"cb1d927facf5ae887ac8e2a175a0da816a66be0ed556abb8c9892b2efe6b6f0a"} Apr 22 16:21:18.025970 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:18.025945 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vphs5" event={"ID":"4098aef1-2fba-4928-ac90-0a2b8fdc8510","Type":"ContainerStarted","Data":"78c671d710c359c21ce9fa018a3edade86d5e230d139c0c547557539bbcb5457"} Apr 22 16:21:18.026883 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:18.026861 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-k7mlb" event={"ID":"c216f795-9679-4c21-86f2-d69c0900b7f2","Type":"ContainerStarted","Data":"2d0bb8b4630523eeaaa9bfe88926d221f79576e5df072dec4b5bed0af7ece4ae"} Apr 22 16:21:18.027878 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:18.027857 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-glprs" event={"ID":"8a355c35-cd73-4888-9d7b-1841477e589c","Type":"ContainerStarted","Data":"0a51979087ba9c99db1107c38da03bdf1e4b9b7fb8fa37c72c3f51c32febb169"} Apr 22 16:21:18.028855 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:18.028821 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jjbms" event={"ID":"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274","Type":"ContainerStarted","Data":"cd4f060af6fe0e4b354c32f5bfe9964a694f1b694347ba605a06eb65cece9fdf"} Apr 22 16:21:18.029715 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:18.029698 2578 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg" event={"ID":"264ba0e2-a7b9-4b2d-9864-8ba2f4727492","Type":"ContainerStarted","Data":"e3efc69983ec3eca37edb5ab30797bc782b8c8e25a8988da137d33127d6b9668"} Apr 22 16:21:18.030576 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:18.030557 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" event={"ID":"31675dd5-4b14-424a-bc7e-27f7fe4405e5","Type":"ContainerStarted","Data":"cd837fe6c358a1d2639f925703232ff4fdab03c74478ca563e483f487505b38a"} Apr 22 16:21:18.035127 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:18.035080 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-144.ec2.internal" podStartSLOduration=2.035065864 podStartE2EDuration="2.035065864s" podCreationTimestamp="2026-04-22 16:21:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:21:18.034518805 +0000 UTC m=+3.571249517" watchObservedRunningTime="2026-04-22 16:21:18.035065864 +0000 UTC m=+3.571796577" Apr 22 16:21:18.191041 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:18.190932 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 22 16:21:18.524497 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:18.524408 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs\") pod \"network-metrics-daemon-zzttm\" (UID: \"01d73bcf-a30e-4dfb-ab2d-863123f999c7\") " pod="openshift-multus/network-metrics-daemon-zzttm" Apr 22 16:21:18.524497 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:18.524473 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99cfl\" 
(UniqueName: \"kubernetes.io/projected/cc090485-3533-4014-9e56-2a28800c3d78-kube-api-access-99cfl\") pod \"network-check-target-l44cq\" (UID: \"cc090485-3533-4014-9e56-2a28800c3d78\") " pod="openshift-network-diagnostics/network-check-target-l44cq" Apr 22 16:21:18.524740 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:18.524656 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 16:21:18.524740 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:18.524677 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 16:21:18.524740 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:18.524689 2578 projected.go:194] Error preparing data for projected volume kube-api-access-99cfl for pod openshift-network-diagnostics/network-check-target-l44cq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 16:21:18.524919 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:18.524749 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cc090485-3533-4014-9e56-2a28800c3d78-kube-api-access-99cfl podName:cc090485-3533-4014-9e56-2a28800c3d78 nodeName:}" failed. No retries permitted until 2026-04-22 16:21:20.524730883 +0000 UTC m=+6.061461587 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-99cfl" (UniqueName: "kubernetes.io/projected/cc090485-3533-4014-9e56-2a28800c3d78-kube-api-access-99cfl") pod "network-check-target-l44cq" (UID: "cc090485-3533-4014-9e56-2a28800c3d78") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 16:21:18.525237 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:18.525216 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 16:21:18.525326 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:18.525296 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs podName:01d73bcf-a30e-4dfb-ab2d-863123f999c7 nodeName:}" failed. No retries permitted until 2026-04-22 16:21:20.525256809 +0000 UTC m=+6.061987500 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs") pod "network-metrics-daemon-zzttm" (UID: "01d73bcf-a30e-4dfb-ab2d-863123f999c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 16:21:19.017539 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:19.017007 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzttm" Apr 22 16:21:19.017539 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:19.017167 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zzttm" podUID="01d73bcf-a30e-4dfb-ab2d-863123f999c7" Apr 22 16:21:19.018334 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:19.018182 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l44cq" Apr 22 16:21:19.018334 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:19.018279 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l44cq" podUID="cc090485-3533-4014-9e56-2a28800c3d78" Apr 22 16:21:19.041745 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:19.041117 2578 generic.go:358] "Generic (PLEG): container finished" podID="829392402d3f1636d4c2da09261dbdd6" containerID="7c8269002a3d90efe28e6214aef18940c5a9da509f2ce7f58fbd47b00b1eef6e" exitCode=0 Apr 22 16:21:19.041745 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:19.041674 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-144.ec2.internal" event={"ID":"829392402d3f1636d4c2da09261dbdd6","Type":"ContainerDied","Data":"7c8269002a3d90efe28e6214aef18940c5a9da509f2ce7f58fbd47b00b1eef6e"} Apr 22 16:21:20.055352 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:20.055256 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-144.ec2.internal" event={"ID":"829392402d3f1636d4c2da09261dbdd6","Type":"ContainerStarted","Data":"4105fa0ca94d364c858526f49aa4bc65f25a3c6c74d6d5d3c34f9c5a787bf10f"} Apr 22 16:21:20.541328 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:20.541292 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs\") pod \"network-metrics-daemon-zzttm\" (UID: \"01d73bcf-a30e-4dfb-ab2d-863123f999c7\") " pod="openshift-multus/network-metrics-daemon-zzttm" Apr 22 16:21:20.541508 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:20.541351 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99cfl\" (UniqueName: \"kubernetes.io/projected/cc090485-3533-4014-9e56-2a28800c3d78-kube-api-access-99cfl\") pod \"network-check-target-l44cq\" (UID: \"cc090485-3533-4014-9e56-2a28800c3d78\") " pod="openshift-network-diagnostics/network-check-target-l44cq" Apr 22 16:21:20.541508 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:20.541441 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 16:21:20.541508 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:20.541461 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 16:21:20.541508 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:20.541474 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 16:21:20.541508 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:20.541487 2578 projected.go:194] Error preparing data for projected volume kube-api-access-99cfl for pod openshift-network-diagnostics/network-check-target-l44cq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 16:21:20.541508 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:20.541509 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs podName:01d73bcf-a30e-4dfb-ab2d-863123f999c7 nodeName:}" failed. No retries permitted until 2026-04-22 16:21:24.541490686 +0000 UTC m=+10.078221692 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs") pod "network-metrics-daemon-zzttm" (UID: "01d73bcf-a30e-4dfb-ab2d-863123f999c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 16:21:20.541803 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:20.541527 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cc090485-3533-4014-9e56-2a28800c3d78-kube-api-access-99cfl podName:cc090485-3533-4014-9e56-2a28800c3d78 nodeName:}" failed. No retries permitted until 2026-04-22 16:21:24.541517415 +0000 UTC m=+10.078248109 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-99cfl" (UniqueName: "kubernetes.io/projected/cc090485-3533-4014-9e56-2a28800c3d78-kube-api-access-99cfl") pod "network-check-target-l44cq" (UID: "cc090485-3533-4014-9e56-2a28800c3d78") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 16:21:21.016292 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:21.016255 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzttm" Apr 22 16:21:21.016456 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:21.016403 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zzttm" podUID="01d73bcf-a30e-4dfb-ab2d-863123f999c7" Apr 22 16:21:21.016805 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:21.016653 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l44cq" Apr 22 16:21:21.016805 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:21.016762 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l44cq" podUID="cc090485-3533-4014-9e56-2a28800c3d78" Apr 22 16:21:23.016664 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:23.016630 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzttm" Apr 22 16:21:23.017175 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:23.016630 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l44cq" Apr 22 16:21:23.017175 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:23.016787 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zzttm" podUID="01d73bcf-a30e-4dfb-ab2d-863123f999c7" Apr 22 16:21:23.017175 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:23.016867 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l44cq" podUID="cc090485-3533-4014-9e56-2a28800c3d78" Apr 22 16:21:24.573979 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:24.573927 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99cfl\" (UniqueName: \"kubernetes.io/projected/cc090485-3533-4014-9e56-2a28800c3d78-kube-api-access-99cfl\") pod \"network-check-target-l44cq\" (UID: \"cc090485-3533-4014-9e56-2a28800c3d78\") " pod="openshift-network-diagnostics/network-check-target-l44cq" Apr 22 16:21:24.574432 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:24.574018 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs\") pod \"network-metrics-daemon-zzttm\" (UID: \"01d73bcf-a30e-4dfb-ab2d-863123f999c7\") " pod="openshift-multus/network-metrics-daemon-zzttm" Apr 22 16:21:24.574432 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:24.574152 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 16:21:24.574432 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:24.574218 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs podName:01d73bcf-a30e-4dfb-ab2d-863123f999c7 nodeName:}" failed. 
No retries permitted until 2026-04-22 16:21:32.574198853 +0000 UTC m=+18.110929545 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs") pod "network-metrics-daemon-zzttm" (UID: "01d73bcf-a30e-4dfb-ab2d-863123f999c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 16:21:24.574647 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:24.574623 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 16:21:24.574647 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:24.574643 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 16:21:24.574750 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:24.574656 2578 projected.go:194] Error preparing data for projected volume kube-api-access-99cfl for pod openshift-network-diagnostics/network-check-target-l44cq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 16:21:24.574750 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:24.574718 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cc090485-3533-4014-9e56-2a28800c3d78-kube-api-access-99cfl podName:cc090485-3533-4014-9e56-2a28800c3d78 nodeName:}" failed. No retries permitted until 2026-04-22 16:21:32.57470253 +0000 UTC m=+18.111433225 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-99cfl" (UniqueName: "kubernetes.io/projected/cc090485-3533-4014-9e56-2a28800c3d78-kube-api-access-99cfl") pod "network-check-target-l44cq" (UID: "cc090485-3533-4014-9e56-2a28800c3d78") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 16:21:25.016790 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:25.016665 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzttm" Apr 22 16:21:25.016790 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:25.016785 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zzttm" podUID="01d73bcf-a30e-4dfb-ab2d-863123f999c7" Apr 22 16:21:25.017035 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:25.016926 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l44cq" Apr 22 16:21:25.017090 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:25.017026 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l44cq" podUID="cc090485-3533-4014-9e56-2a28800c3d78" Apr 22 16:21:27.016128 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:27.016035 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzttm" Apr 22 16:21:27.016532 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:27.016154 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zzttm" podUID="01d73bcf-a30e-4dfb-ab2d-863123f999c7" Apr 22 16:21:27.016532 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:27.016212 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l44cq" Apr 22 16:21:27.016532 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:27.016330 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l44cq" podUID="cc090485-3533-4014-9e56-2a28800c3d78" Apr 22 16:21:27.537537 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:27.537487 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-144.ec2.internal" podStartSLOduration=11.537472084000001 podStartE2EDuration="11.537472084s" podCreationTimestamp="2026-04-22 16:21:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:21:20.069042096 +0000 UTC m=+5.605772809" watchObservedRunningTime="2026-04-22 16:21:27.537472084 +0000 UTC m=+13.074202795" Apr 22 16:21:27.538182 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:27.538161 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-dl86z"] Apr 22 16:21:27.541150 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:27.541128 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-dl86z" Apr 22 16:21:27.543507 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:27.543484 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wzh6z\"" Apr 22 16:21:27.543615 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:27.543546 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 16:21:27.543615 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:27.543484 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 16:21:27.599829 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:27.599790 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b918f41f-7884-40cb-ac36-9c716d27d92f-hosts-file\") pod \"node-resolver-dl86z\" (UID: \"b918f41f-7884-40cb-ac36-9c716d27d92f\") " pod="openshift-dns/node-resolver-dl86z" Apr 22 16:21:27.600002 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:27.599838 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b918f41f-7884-40cb-ac36-9c716d27d92f-tmp-dir\") pod \"node-resolver-dl86z\" (UID: \"b918f41f-7884-40cb-ac36-9c716d27d92f\") " pod="openshift-dns/node-resolver-dl86z" Apr 22 16:21:27.600002 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:27.599879 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5qjl\" (UniqueName: \"kubernetes.io/projected/b918f41f-7884-40cb-ac36-9c716d27d92f-kube-api-access-j5qjl\") pod \"node-resolver-dl86z\" (UID: \"b918f41f-7884-40cb-ac36-9c716d27d92f\") " pod="openshift-dns/node-resolver-dl86z" Apr 22 16:21:27.701132 ip-10-0-137-144 kubenswrapper[2578]: I0422 
16:21:27.701094 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b918f41f-7884-40cb-ac36-9c716d27d92f-hosts-file\") pod \"node-resolver-dl86z\" (UID: \"b918f41f-7884-40cb-ac36-9c716d27d92f\") " pod="openshift-dns/node-resolver-dl86z" Apr 22 16:21:27.701308 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:27.701140 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b918f41f-7884-40cb-ac36-9c716d27d92f-tmp-dir\") pod \"node-resolver-dl86z\" (UID: \"b918f41f-7884-40cb-ac36-9c716d27d92f\") " pod="openshift-dns/node-resolver-dl86z" Apr 22 16:21:27.701308 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:27.701239 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/b918f41f-7884-40cb-ac36-9c716d27d92f-hosts-file\") pod \"node-resolver-dl86z\" (UID: \"b918f41f-7884-40cb-ac36-9c716d27d92f\") " pod="openshift-dns/node-resolver-dl86z" Apr 22 16:21:27.701308 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:27.701259 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j5qjl\" (UniqueName: \"kubernetes.io/projected/b918f41f-7884-40cb-ac36-9c716d27d92f-kube-api-access-j5qjl\") pod \"node-resolver-dl86z\" (UID: \"b918f41f-7884-40cb-ac36-9c716d27d92f\") " pod="openshift-dns/node-resolver-dl86z" Apr 22 16:21:27.701482 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:27.701458 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b918f41f-7884-40cb-ac36-9c716d27d92f-tmp-dir\") pod \"node-resolver-dl86z\" (UID: \"b918f41f-7884-40cb-ac36-9c716d27d92f\") " pod="openshift-dns/node-resolver-dl86z" Apr 22 16:21:27.711733 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:27.711706 2578 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-j5qjl\" (UniqueName: \"kubernetes.io/projected/b918f41f-7884-40cb-ac36-9c716d27d92f-kube-api-access-j5qjl\") pod \"node-resolver-dl86z\" (UID: \"b918f41f-7884-40cb-ac36-9c716d27d92f\") " pod="openshift-dns/node-resolver-dl86z" Apr 22 16:21:27.850873 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:27.850823 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-dl86z" Apr 22 16:21:29.016304 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:29.016268 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzttm" Apr 22 16:21:29.016712 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:29.016400 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zzttm" podUID="01d73bcf-a30e-4dfb-ab2d-863123f999c7" Apr 22 16:21:29.016712 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:29.016437 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l44cq" Apr 22 16:21:29.016712 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:29.016513 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l44cq" podUID="cc090485-3533-4014-9e56-2a28800c3d78" Apr 22 16:21:31.016742 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:31.016710 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l44cq" Apr 22 16:21:31.017224 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:31.016712 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzttm" Apr 22 16:21:31.017224 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:31.016827 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l44cq" podUID="cc090485-3533-4014-9e56-2a28800c3d78" Apr 22 16:21:31.017224 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:31.016937 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-zzttm" podUID="01d73bcf-a30e-4dfb-ab2d-863123f999c7" Apr 22 16:21:32.631544 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:32.631494 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs\") pod \"network-metrics-daemon-zzttm\" (UID: \"01d73bcf-a30e-4dfb-ab2d-863123f999c7\") " pod="openshift-multus/network-metrics-daemon-zzttm" Apr 22 16:21:32.632036 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:32.631551 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99cfl\" (UniqueName: \"kubernetes.io/projected/cc090485-3533-4014-9e56-2a28800c3d78-kube-api-access-99cfl\") pod \"network-check-target-l44cq\" (UID: \"cc090485-3533-4014-9e56-2a28800c3d78\") " pod="openshift-network-diagnostics/network-check-target-l44cq" Apr 22 16:21:32.632036 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:32.631673 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 16:21:32.632036 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:32.631687 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 16:21:32.632036 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:32.631708 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 16:21:32.632036 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:32.631722 2578 projected.go:194] Error preparing data for projected volume kube-api-access-99cfl for pod openshift-network-diagnostics/network-check-target-l44cq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 16:21:32.632036 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:32.631747 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs podName:01d73bcf-a30e-4dfb-ab2d-863123f999c7 nodeName:}" failed. No retries permitted until 2026-04-22 16:21:48.63172707 +0000 UTC m=+34.168457760 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs") pod "network-metrics-daemon-zzttm" (UID: "01d73bcf-a30e-4dfb-ab2d-863123f999c7") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 16:21:32.632036 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:32.631771 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cc090485-3533-4014-9e56-2a28800c3d78-kube-api-access-99cfl podName:cc090485-3533-4014-9e56-2a28800c3d78 nodeName:}" failed. No retries permitted until 2026-04-22 16:21:48.6317561 +0000 UTC m=+34.168486790 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-99cfl" (UniqueName: "kubernetes.io/projected/cc090485-3533-4014-9e56-2a28800c3d78-kube-api-access-99cfl") pod "network-check-target-l44cq" (UID: "cc090485-3533-4014-9e56-2a28800c3d78") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 16:21:33.016815 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:33.016723 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzttm" Apr 22 16:21:33.016815 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:33.016769 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l44cq" Apr 22 16:21:33.017055 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:33.016884 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zzttm" podUID="01d73bcf-a30e-4dfb-ab2d-863123f999c7" Apr 22 16:21:33.017055 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:33.017012 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l44cq" podUID="cc090485-3533-4014-9e56-2a28800c3d78" Apr 22 16:21:34.656195 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:34.656170 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb918f41f_7884_40cb_ac36_9c716d27d92f.slice/crio-4d85a1b14eb502b139fc258091c2a913840178421f07fa0ed35af3508fb339ce WatchSource:0}: Error finding container 4d85a1b14eb502b139fc258091c2a913840178421f07fa0ed35af3508fb339ce: Status 404 returned error can't find the container with id 4d85a1b14eb502b139fc258091c2a913840178421f07fa0ed35af3508fb339ce Apr 22 16:21:35.017411 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:35.017172 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l44cq" Apr 22 16:21:35.017541 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:35.017520 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzttm" Apr 22 16:21:35.017746 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:35.017717 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zzttm" podUID="01d73bcf-a30e-4dfb-ab2d-863123f999c7" Apr 22 16:21:35.018279 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:35.018252 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l44cq" podUID="cc090485-3533-4014-9e56-2a28800c3d78" Apr 22 16:21:35.080812 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:35.080774 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-xzk9d" event={"ID":"3e6d4c51-d843-4eca-9406-7639d52380a0","Type":"ContainerStarted","Data":"486bdd8bdf29425d04e44a14c3d9be86a22acf635c074d27e303ad2bbd123d24"} Apr 22 16:21:35.082017 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:35.081984 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s5rnp" event={"ID":"308b982f-cb30-4a62-92f1-e88ca12b210e","Type":"ContainerStarted","Data":"56b2b714c89d595a0694f5150a3b3c28e544c51aba04584e8e3928006c6b2569"} Apr 22 16:21:35.083197 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:35.083173 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-vphs5" 
event={"ID":"4098aef1-2fba-4928-ac90-0a2b8fdc8510","Type":"ContainerStarted","Data":"f415931060ec5688fb8e3c0c9457b201c69091142a6cdc3629e4f23d7e374ae4"} Apr 22 16:21:35.084658 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:35.084598 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-glprs" event={"ID":"8a355c35-cd73-4888-9d7b-1841477e589c","Type":"ContainerStarted","Data":"d3da8fd468cc15fffdb8d05ff1eb64b09ac1fa7de086bb53d27796bd1f523d1f"} Apr 22 16:21:35.084744 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:35.084664 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-glprs" event={"ID":"8a355c35-cd73-4888-9d7b-1841477e589c","Type":"ContainerStarted","Data":"c69354fce1c11004cf4781a4ca0547fff2f1df19a995d788d1b8774de678b021"} Apr 22 16:21:35.085985 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:35.085964 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-jjbms" event={"ID":"7d4f23e4-772f-4bf2-86a3-25eb1a3cc274","Type":"ContainerStarted","Data":"6bddc39bb3f267317817d795dc007f4639de1c8929c253c7c948d4a3528f7e91"} Apr 22 16:21:35.087217 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:35.087198 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg" event={"ID":"264ba0e2-a7b9-4b2d-9864-8ba2f4727492","Type":"ContainerStarted","Data":"1c1f3d4493125048c21627a66a425433024ecdd323f1013fe102281b14a16b25"} Apr 22 16:21:35.088473 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:35.088450 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" event={"ID":"31675dd5-4b14-424a-bc7e-27f7fe4405e5","Type":"ContainerStarted","Data":"95f93b829aae6c55162989812aca41e90a3eeb0b7378cc068ac14f35f054431a"} Apr 22 16:21:35.089830 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:35.089809 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/node-resolver-dl86z" event={"ID":"b918f41f-7884-40cb-ac36-9c716d27d92f","Type":"ContainerStarted","Data":"223bc315f536efde4a7e64e0306cad2795806fd1241bf5d4879731f655eb25c7"} Apr 22 16:21:35.089942 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:35.089839 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-dl86z" event={"ID":"b918f41f-7884-40cb-ac36-9c716d27d92f","Type":"ContainerStarted","Data":"4d85a1b14eb502b139fc258091c2a913840178421f07fa0ed35af3508fb339ce"} Apr 22 16:21:35.108979 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:35.108927 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-xzk9d" podStartSLOduration=7.970671275 podStartE2EDuration="20.108908093s" podCreationTimestamp="2026-04-22 16:21:15 +0000 UTC" firstStartedPulling="2026-04-22 16:21:17.891139668 +0000 UTC m=+3.427870358" lastFinishedPulling="2026-04-22 16:21:30.029376486 +0000 UTC m=+15.566107176" observedRunningTime="2026-04-22 16:21:35.108734556 +0000 UTC m=+20.645465292" watchObservedRunningTime="2026-04-22 16:21:35.108908093 +0000 UTC m=+20.645638801" Apr 22 16:21:35.119797 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:35.119614 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-vphs5" podStartSLOduration=3.394577935 podStartE2EDuration="20.119600612s" podCreationTimestamp="2026-04-22 16:21:15 +0000 UTC" firstStartedPulling="2026-04-22 16:21:17.887947584 +0000 UTC m=+3.424678275" lastFinishedPulling="2026-04-22 16:21:34.612970248 +0000 UTC m=+20.149700952" observedRunningTime="2026-04-22 16:21:35.119282862 +0000 UTC m=+20.656013573" watchObservedRunningTime="2026-04-22 16:21:35.119600612 +0000 UTC m=+20.656331324" Apr 22 16:21:35.132580 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:35.132542 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-node-tuning-operator/tuned-kjcg9" podStartSLOduration=3.361310164 podStartE2EDuration="20.132504802s" podCreationTimestamp="2026-04-22 16:21:15 +0000 UTC" firstStartedPulling="2026-04-22 16:21:17.881759313 +0000 UTC m=+3.418490007" lastFinishedPulling="2026-04-22 16:21:34.652953934 +0000 UTC m=+20.189684645" observedRunningTime="2026-04-22 16:21:35.132141514 +0000 UTC m=+20.668872227" watchObservedRunningTime="2026-04-22 16:21:35.132504802 +0000 UTC m=+20.669235514" Apr 22 16:21:35.146394 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:35.146352 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-dl86z" podStartSLOduration=8.146338514 podStartE2EDuration="8.146338514s" podCreationTimestamp="2026-04-22 16:21:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 16:21:35.14620285 +0000 UTC m=+20.682933573" watchObservedRunningTime="2026-04-22 16:21:35.146338514 +0000 UTC m=+20.683069226" Apr 22 16:21:35.161431 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:35.161388 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-jjbms" podStartSLOduration=3.391116067 podStartE2EDuration="20.161373293s" podCreationTimestamp="2026-04-22 16:21:15 +0000 UTC" firstStartedPulling="2026-04-22 16:21:17.885113585 +0000 UTC m=+3.421844280" lastFinishedPulling="2026-04-22 16:21:34.655370802 +0000 UTC m=+20.192101506" observedRunningTime="2026-04-22 16:21:35.161049679 +0000 UTC m=+20.697780391" watchObservedRunningTime="2026-04-22 16:21:35.161373293 +0000 UTC m=+20.698104023" Apr 22 16:21:36.063827 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:36.063675 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 22 16:21:36.094436 ip-10-0-137-144 
kubenswrapper[2578]: I0422 16:21:36.094408 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/0.log" Apr 22 16:21:36.094728 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:36.094705 2578 generic.go:358] "Generic (PLEG): container finished" podID="8a355c35-cd73-4888-9d7b-1841477e589c" containerID="d3da8fd468cc15fffdb8d05ff1eb64b09ac1fa7de086bb53d27796bd1f523d1f" exitCode=1 Apr 22 16:21:36.094805 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:36.094770 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-glprs" event={"ID":"8a355c35-cd73-4888-9d7b-1841477e589c","Type":"ContainerDied","Data":"d3da8fd468cc15fffdb8d05ff1eb64b09ac1fa7de086bb53d27796bd1f523d1f"} Apr 22 16:21:36.094881 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:36.094805 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-glprs" event={"ID":"8a355c35-cd73-4888-9d7b-1841477e589c","Type":"ContainerStarted","Data":"33bb1a202621b7ec68c284f5334f3450c22a7b0fad75053abe78522ba4f38a31"} Apr 22 16:21:36.094881 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:36.094814 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-glprs" event={"ID":"8a355c35-cd73-4888-9d7b-1841477e589c","Type":"ContainerStarted","Data":"b6baccebde47e32487324b259f35677b2d8e01f421f73b0af49f9dafd7b2c345"} Apr 22 16:21:36.094881 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:36.094825 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-glprs" event={"ID":"8a355c35-cd73-4888-9d7b-1841477e589c","Type":"ContainerStarted","Data":"efe8510d6a36ab578749b97fd5fa4857450373b85bcc8b382cb363b1f3cdc614"} Apr 22 16:21:36.094881 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:36.094837 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-glprs" event={"ID":"8a355c35-cd73-4888-9d7b-1841477e589c","Type":"ContainerStarted","Data":"1cb0cd3dd90b99884185f292b73ca6fe44506d6a9bf7c62dbdd2c00c5ceb337f"} Apr 22 16:21:36.096220 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:36.096197 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg" event={"ID":"264ba0e2-a7b9-4b2d-9864-8ba2f4727492","Type":"ContainerStarted","Data":"ac16163760d6d2fee3537697c0e3ce5634cddcd2639f98de16ee5aaf4bcddddd"} Apr 22 16:21:36.097437 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:36.097415 2578 generic.go:358] "Generic (PLEG): container finished" podID="308b982f-cb30-4a62-92f1-e88ca12b210e" containerID="56b2b714c89d595a0694f5150a3b3c28e544c51aba04584e8e3928006c6b2569" exitCode=0 Apr 22 16:21:36.097571 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:36.097537 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s5rnp" event={"ID":"308b982f-cb30-4a62-92f1-e88ca12b210e","Type":"ContainerDied","Data":"56b2b714c89d595a0694f5150a3b3c28e544c51aba04584e8e3928006c6b2569"} Apr 22 16:21:36.300477 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:36.300445 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-vphs5" Apr 22 16:21:36.300995 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:36.300978 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-vphs5" Apr 22 16:21:36.810926 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:36.810899 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-vzf4h"] Apr 22 16:21:36.817907 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:36.817881 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vzf4h" Apr 22 16:21:36.818056 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:36.817962 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vzf4h" podUID="05520239-9b93-4ae7-abd6-fd7042ec092f" Apr 22 16:21:36.965062 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:36.965029 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/05520239-9b93-4ae7-abd6-fd7042ec092f-kubelet-config\") pod \"global-pull-secret-syncer-vzf4h\" (UID: \"05520239-9b93-4ae7-abd6-fd7042ec092f\") " pod="kube-system/global-pull-secret-syncer-vzf4h" Apr 22 16:21:36.965298 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:36.965074 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05520239-9b93-4ae7-abd6-fd7042ec092f-original-pull-secret\") pod \"global-pull-secret-syncer-vzf4h\" (UID: \"05520239-9b93-4ae7-abd6-fd7042ec092f\") " pod="kube-system/global-pull-secret-syncer-vzf4h" Apr 22 16:21:36.965298 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:36.965192 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/05520239-9b93-4ae7-abd6-fd7042ec092f-dbus\") pod \"global-pull-secret-syncer-vzf4h\" (UID: \"05520239-9b93-4ae7-abd6-fd7042ec092f\") " pod="kube-system/global-pull-secret-syncer-vzf4h" Apr 22 16:21:36.972108 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:36.972001 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" 
plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T16:21:36.063815144Z","UUID":"399db35e-9e42-4ce5-b9ab-6149e9ba5ac2","Handler":null,"Name":"","Endpoint":""} Apr 22 16:21:36.975574 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:36.975551 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 22 16:21:36.975686 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:36.975583 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 22 16:21:37.016266 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:37.016230 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l44cq" Apr 22 16:21:37.016430 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:37.016411 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzttm" Apr 22 16:21:37.016495 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:37.016472 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zzttm" podUID="01d73bcf-a30e-4dfb-ab2d-863123f999c7" Apr 22 16:21:37.016546 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:37.016507 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l44cq" podUID="cc090485-3533-4014-9e56-2a28800c3d78" Apr 22 16:21:37.066350 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:37.066277 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/05520239-9b93-4ae7-abd6-fd7042ec092f-kubelet-config\") pod \"global-pull-secret-syncer-vzf4h\" (UID: \"05520239-9b93-4ae7-abd6-fd7042ec092f\") " pod="kube-system/global-pull-secret-syncer-vzf4h" Apr 22 16:21:37.066350 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:37.066313 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05520239-9b93-4ae7-abd6-fd7042ec092f-original-pull-secret\") pod \"global-pull-secret-syncer-vzf4h\" (UID: \"05520239-9b93-4ae7-abd6-fd7042ec092f\") " pod="kube-system/global-pull-secret-syncer-vzf4h" Apr 22 16:21:37.066946 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:37.066371 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/05520239-9b93-4ae7-abd6-fd7042ec092f-dbus\") pod \"global-pull-secret-syncer-vzf4h\" (UID: \"05520239-9b93-4ae7-abd6-fd7042ec092f\") " pod="kube-system/global-pull-secret-syncer-vzf4h" Apr 22 16:21:37.066946 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:37.066454 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/05520239-9b93-4ae7-abd6-fd7042ec092f-kubelet-config\") pod \"global-pull-secret-syncer-vzf4h\" (UID: \"05520239-9b93-4ae7-abd6-fd7042ec092f\") " pod="kube-system/global-pull-secret-syncer-vzf4h" Apr 22 16:21:37.066946 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:37.066500 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not 
registered Apr 22 16:21:37.066946 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:37.066576 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05520239-9b93-4ae7-abd6-fd7042ec092f-original-pull-secret podName:05520239-9b93-4ae7-abd6-fd7042ec092f nodeName:}" failed. No retries permitted until 2026-04-22 16:21:37.566555465 +0000 UTC m=+23.103286155 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/05520239-9b93-4ae7-abd6-fd7042ec092f-original-pull-secret") pod "global-pull-secret-syncer-vzf4h" (UID: "05520239-9b93-4ae7-abd6-fd7042ec092f") : object "kube-system"/"original-pull-secret" not registered Apr 22 16:21:37.066946 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:37.066630 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/05520239-9b93-4ae7-abd6-fd7042ec092f-dbus\") pod \"global-pull-secret-syncer-vzf4h\" (UID: \"05520239-9b93-4ae7-abd6-fd7042ec092f\") " pod="kube-system/global-pull-secret-syncer-vzf4h" Apr 22 16:21:37.100262 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:37.100234 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-k7mlb" event={"ID":"c216f795-9679-4c21-86f2-d69c0900b7f2","Type":"ContainerStarted","Data":"362542a49c222091a441a3b7fe40dc925c78b62308601bb2dbc0fb2e8295f2d8"} Apr 22 16:21:37.100460 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:37.100445 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-vphs5" Apr 22 16:21:37.100868 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:37.100832 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-vphs5" Apr 22 16:21:37.114013 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:37.113973 2578 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-network-operator/iptables-alerter-k7mlb" podStartSLOduration=5.387259626 podStartE2EDuration="22.113960179s" podCreationTimestamp="2026-04-22 16:21:15 +0000 UTC" firstStartedPulling="2026-04-22 16:21:17.886325938 +0000 UTC m=+3.423056630" lastFinishedPulling="2026-04-22 16:21:34.613026475 +0000 UTC m=+20.149757183" observedRunningTime="2026-04-22 16:21:37.113724835 +0000 UTC m=+22.650455557" watchObservedRunningTime="2026-04-22 16:21:37.113960179 +0000 UTC m=+22.650690933" Apr 22 16:21:37.569784 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:37.569749 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05520239-9b93-4ae7-abd6-fd7042ec092f-original-pull-secret\") pod \"global-pull-secret-syncer-vzf4h\" (UID: \"05520239-9b93-4ae7-abd6-fd7042ec092f\") " pod="kube-system/global-pull-secret-syncer-vzf4h" Apr 22 16:21:37.569960 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:37.569938 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 16:21:37.570032 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:37.570020 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05520239-9b93-4ae7-abd6-fd7042ec092f-original-pull-secret podName:05520239-9b93-4ae7-abd6-fd7042ec092f nodeName:}" failed. No retries permitted until 2026-04-22 16:21:38.569999045 +0000 UTC m=+24.106729739 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/05520239-9b93-4ae7-abd6-fd7042ec092f-original-pull-secret") pod "global-pull-secret-syncer-vzf4h" (UID: "05520239-9b93-4ae7-abd6-fd7042ec092f") : object "kube-system"/"original-pull-secret" not registered Apr 22 16:21:38.104695 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:38.104653 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg" event={"ID":"264ba0e2-a7b9-4b2d-9864-8ba2f4727492","Type":"ContainerStarted","Data":"a45a838493b9ffc5624091ba846ded33f3346651ff6105dac8fd644e2b671bcb"} Apr 22 16:21:38.108007 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:38.107982 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/0.log" Apr 22 16:21:38.108403 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:38.108377 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-glprs" event={"ID":"8a355c35-cd73-4888-9d7b-1841477e589c","Type":"ContainerStarted","Data":"96f28c9356f28572cd7196bc9f12d23cc2f5654428b29b13a6cc78366b5e3972"} Apr 22 16:21:38.120811 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:38.120762 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-krctg" podStartSLOduration=3.2842249150000002 podStartE2EDuration="23.120744646s" podCreationTimestamp="2026-04-22 16:21:15 +0000 UTC" firstStartedPulling="2026-04-22 16:21:17.882147994 +0000 UTC m=+3.418878684" lastFinishedPulling="2026-04-22 16:21:37.718667706 +0000 UTC m=+23.255398415" observedRunningTime="2026-04-22 16:21:38.120127015 +0000 UTC m=+23.656857728" watchObservedRunningTime="2026-04-22 16:21:38.120744646 +0000 UTC m=+23.657475359" Apr 22 16:21:38.577788 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:38.577756 
2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05520239-9b93-4ae7-abd6-fd7042ec092f-original-pull-secret\") pod \"global-pull-secret-syncer-vzf4h\" (UID: \"05520239-9b93-4ae7-abd6-fd7042ec092f\") " pod="kube-system/global-pull-secret-syncer-vzf4h" Apr 22 16:21:38.577977 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:38.577907 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 16:21:38.578040 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:38.577978 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05520239-9b93-4ae7-abd6-fd7042ec092f-original-pull-secret podName:05520239-9b93-4ae7-abd6-fd7042ec092f nodeName:}" failed. No retries permitted until 2026-04-22 16:21:40.577958874 +0000 UTC m=+26.114689572 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/05520239-9b93-4ae7-abd6-fd7042ec092f-original-pull-secret") pod "global-pull-secret-syncer-vzf4h" (UID: "05520239-9b93-4ae7-abd6-fd7042ec092f") : object "kube-system"/"original-pull-secret" not registered Apr 22 16:21:39.016714 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:39.016629 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l44cq" Apr 22 16:21:39.016714 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:39.016655 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vzf4h" Apr 22 16:21:39.016714 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:39.016630 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzttm" Apr 22 16:21:39.017016 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:39.016777 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vzf4h" podUID="05520239-9b93-4ae7-abd6-fd7042ec092f" Apr 22 16:21:39.017016 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:39.016865 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zzttm" podUID="01d73bcf-a30e-4dfb-ab2d-863123f999c7" Apr 22 16:21:39.017016 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:39.016938 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l44cq" podUID="cc090485-3533-4014-9e56-2a28800c3d78" Apr 22 16:21:40.595391 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:40.595356 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05520239-9b93-4ae7-abd6-fd7042ec092f-original-pull-secret\") pod \"global-pull-secret-syncer-vzf4h\" (UID: \"05520239-9b93-4ae7-abd6-fd7042ec092f\") " pod="kube-system/global-pull-secret-syncer-vzf4h" Apr 22 16:21:40.596103 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:40.595514 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 22 16:21:40.596103 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:40.595593 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05520239-9b93-4ae7-abd6-fd7042ec092f-original-pull-secret podName:05520239-9b93-4ae7-abd6-fd7042ec092f nodeName:}" failed. No retries permitted until 2026-04-22 16:21:44.595572782 +0000 UTC m=+30.132303472 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/05520239-9b93-4ae7-abd6-fd7042ec092f-original-pull-secret") pod "global-pull-secret-syncer-vzf4h" (UID: "05520239-9b93-4ae7-abd6-fd7042ec092f") : object "kube-system"/"original-pull-secret" not registered Apr 22 16:21:41.016134 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:41.015917 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-vzf4h" Apr 22 16:21:41.016251 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:41.016228 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vzf4h" podUID="05520239-9b93-4ae7-abd6-fd7042ec092f" Apr 22 16:21:41.016318 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:41.015917 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzttm" Apr 22 16:21:41.016374 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:41.015921 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l44cq" Apr 22 16:21:41.016445 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:41.016420 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zzttm" podUID="01d73bcf-a30e-4dfb-ab2d-863123f999c7" Apr 22 16:21:41.016562 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:41.016512 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-l44cq" podUID="cc090485-3533-4014-9e56-2a28800c3d78" Apr 22 16:21:41.116143 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:41.116106 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s5rnp" event={"ID":"308b982f-cb30-4a62-92f1-e88ca12b210e","Type":"ContainerStarted","Data":"0ad9e1739c49547a88dcd15ccf1411f2c87e71be49ed33f2a3712c76de78eb54"} Apr 22 16:21:41.119475 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:41.119443 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/0.log" Apr 22 16:21:41.119800 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:41.119774 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-glprs" event={"ID":"8a355c35-cd73-4888-9d7b-1841477e589c","Type":"ContainerStarted","Data":"ad9979d9fdf4e6f1118915f2a8db80ebecab6b82b133fa0ae3631503b7cbb53d"} Apr 22 16:21:41.120194 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:41.120172 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:41.120466 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:41.120447 2578 scope.go:117] "RemoveContainer" containerID="d3da8fd468cc15fffdb8d05ff1eb64b09ac1fa7de086bb53d27796bd1f523d1f" Apr 22 16:21:41.136809 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:41.136785 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:21:42.122700 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:42.122666 2578 generic.go:358] "Generic (PLEG): container finished" podID="308b982f-cb30-4a62-92f1-e88ca12b210e" containerID="0ad9e1739c49547a88dcd15ccf1411f2c87e71be49ed33f2a3712c76de78eb54" exitCode=0 Apr 22 16:21:42.123186 ip-10-0-137-144 kubenswrapper[2578]: 
I0422 16:21:42.122741 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s5rnp" event={"ID":"308b982f-cb30-4a62-92f1-e88ca12b210e","Type":"ContainerDied","Data":"0ad9e1739c49547a88dcd15ccf1411f2c87e71be49ed33f2a3712c76de78eb54"}
Apr 22 16:21:42.126025 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:42.126006 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/0.log"
Apr 22 16:21:42.126307 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:42.126290 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-glprs" event={"ID":"8a355c35-cd73-4888-9d7b-1841477e589c","Type":"ContainerStarted","Data":"a5021bcd306015535156864407da8d60601882b0533fb041eb4ebb45fb80ba28"}
Apr 22 16:21:42.126540 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:42.126521 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-glprs"
Apr 22 16:21:42.126540 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:42.126543 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-glprs"
Apr 22 16:21:42.140729 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:42.140709 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-glprs"
Apr 22 16:21:42.167371 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:42.167308 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-glprs" podStartSLOduration=10.350346157 podStartE2EDuration="27.167289061s" podCreationTimestamp="2026-04-22 16:21:15 +0000 UTC" firstStartedPulling="2026-04-22 16:21:17.880465508 +0000 UTC m=+3.417196213" lastFinishedPulling="2026-04-22 16:21:34.697408425 +0000 UTC m=+20.234139117" observedRunningTime="2026-04-22 16:21:42.166192014 +0000 UTC m=+27.702922727" watchObservedRunningTime="2026-04-22 16:21:42.167289061 +0000 UTC m=+27.704019775"
Apr 22 16:21:42.829497 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:42.829444 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vzf4h"]
Apr 22 16:21:42.829673 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:42.829631 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vzf4h"
Apr 22 16:21:42.829788 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:42.829757 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vzf4h" podUID="05520239-9b93-4ae7-abd6-fd7042ec092f"
Apr 22 16:21:42.830172 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:42.830148 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zzttm"]
Apr 22 16:21:42.830286 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:42.830273 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzttm"
Apr 22 16:21:42.830413 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:42.830391 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zzttm" podUID="01d73bcf-a30e-4dfb-ab2d-863123f999c7"
Apr 22 16:21:42.830693 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:42.830661 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-l44cq"]
Apr 22 16:21:42.830799 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:42.830782 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l44cq"
Apr 22 16:21:42.830920 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:42.830896 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l44cq" podUID="cc090485-3533-4014-9e56-2a28800c3d78"
Apr 22 16:21:44.016804 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:44.016566 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l44cq"
Apr 22 16:21:44.017183 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:44.016823 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l44cq" podUID="cc090485-3533-4014-9e56-2a28800c3d78"
Apr 22 16:21:44.131656 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:44.131624 2578 generic.go:358] "Generic (PLEG): container finished" podID="308b982f-cb30-4a62-92f1-e88ca12b210e" containerID="9b9d510a560283643e94e8086f9e3dcb679e860087569452f0ccc250c2993ec4" exitCode=0
Apr 22 16:21:44.131804 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:44.131718 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s5rnp" event={"ID":"308b982f-cb30-4a62-92f1-e88ca12b210e","Type":"ContainerDied","Data":"9b9d510a560283643e94e8086f9e3dcb679e860087569452f0ccc250c2993ec4"}
Apr 22 16:21:44.627817 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:44.627775 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05520239-9b93-4ae7-abd6-fd7042ec092f-original-pull-secret\") pod \"global-pull-secret-syncer-vzf4h\" (UID: \"05520239-9b93-4ae7-abd6-fd7042ec092f\") " pod="kube-system/global-pull-secret-syncer-vzf4h"
Apr 22 16:21:44.627983 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:44.627942 2578 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 22 16:21:44.628021 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:44.628006 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/05520239-9b93-4ae7-abd6-fd7042ec092f-original-pull-secret podName:05520239-9b93-4ae7-abd6-fd7042ec092f nodeName:}" failed. No retries permitted until 2026-04-22 16:21:52.627990682 +0000 UTC m=+38.164721372 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/05520239-9b93-4ae7-abd6-fd7042ec092f-original-pull-secret") pod "global-pull-secret-syncer-vzf4h" (UID: "05520239-9b93-4ae7-abd6-fd7042ec092f") : object "kube-system"/"original-pull-secret" not registered
Apr 22 16:21:45.017391 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:45.017363 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzttm"
Apr 22 16:21:45.018205 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:45.017454 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zzttm" podUID="01d73bcf-a30e-4dfb-ab2d-863123f999c7"
Apr 22 16:21:45.018205 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:45.017551 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vzf4h"
Apr 22 16:21:45.018205 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:45.017740 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vzf4h" podUID="05520239-9b93-4ae7-abd6-fd7042ec092f"
Apr 22 16:21:45.134833 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:45.134797 2578 generic.go:358] "Generic (PLEG): container finished" podID="308b982f-cb30-4a62-92f1-e88ca12b210e" containerID="2652d448d066d0b7e9c1900d65f83e7a9dcc7c4ba93617b1bc2736dcdad9bb1e" exitCode=0
Apr 22 16:21:45.134996 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:45.134883 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s5rnp" event={"ID":"308b982f-cb30-4a62-92f1-e88ca12b210e","Type":"ContainerDied","Data":"2652d448d066d0b7e9c1900d65f83e7a9dcc7c4ba93617b1bc2736dcdad9bb1e"}
Apr 22 16:21:46.016390 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:46.016306 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l44cq"
Apr 22 16:21:46.016546 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:46.016436 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-l44cq" podUID="cc090485-3533-4014-9e56-2a28800c3d78"
Apr 22 16:21:47.016796 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.016762 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzttm"
Apr 22 16:21:47.017267 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.016809 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vzf4h"
Apr 22 16:21:47.017267 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:47.016926 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-zzttm" podUID="01d73bcf-a30e-4dfb-ab2d-863123f999c7"
Apr 22 16:21:47.017267 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:47.017060 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-vzf4h" podUID="05520239-9b93-4ae7-abd6-fd7042ec092f"
Apr 22 16:21:47.769366 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.769296 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-144.ec2.internal" event="NodeReady"
Apr 22 16:21:47.769586 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.769447 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 16:21:47.801697 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.801659 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54"]
Apr 22 16:21:47.806139 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.806112 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54"
Apr 22 16:21:47.808054 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.807999 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-6mkdh"]
Apr 22 16:21:47.810616 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.810591 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 22 16:21:47.812377 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.812331 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 22 16:21:47.812480 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.812396 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 22 16:21:47.812480 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.812409 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 22 16:21:47.812592 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.812540 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 22 16:21:47.812649 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.812337 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 22 16:21:47.812787 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.812763 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78697c9f4f-82zjx"]
Apr 22 16:21:47.812907 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.812897 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 22 16:21:47.812977 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.812946 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6mkdh"
Apr 22 16:21:47.814916 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.814896 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 22 16:21:47.815206 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.815183 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-xxqn8\""
Apr 22 16:21:47.815317 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.815185 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 22 16:21:47.815649 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.815621 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-7bcbd5754d-txtx9"]
Apr 22 16:21:47.815910 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.815837 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78697c9f4f-82zjx"
Apr 22 16:21:47.821137 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.818911 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 22 16:21:47.821137 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.819173 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-lszhs\""
Apr 22 16:21:47.821137 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.819597 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f66c9b7f6-rpfq2"]
Apr 22 16:21:47.821137 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.819686 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9"
Apr 22 16:21:47.822681 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.822661 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 22 16:21:47.822781 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.822759 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54"]
Apr 22 16:21:47.822859 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.822786 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-6mkdh"]
Apr 22 16:21:47.822859 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.822800 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78697c9f4f-82zjx"]
Apr 22 16:21:47.822859 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.822823 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-2fjmx"]
Apr 22 16:21:47.823228 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.823210 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 22 16:21:47.823416 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.823399 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-r8rzs\""
Apr 22 16:21:47.823752 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.823737 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 22 16:21:47.825958 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.825938 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f66c9b7f6-rpfq2"]
Apr 22 16:21:47.826043 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.825964 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-7bcbd5754d-txtx9"]
Apr 22 16:21:47.826043 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.825979 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wt7t4"]
Apr 22 16:21:47.826043 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.826006 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f66c9b7f6-rpfq2"
Apr 22 16:21:47.826208 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.826103 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2fjmx"
Apr 22 16:21:47.828230 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.828213 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 22 16:21:47.828465 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.828448 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 16:21:47.828549 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.828478 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 16:21:47.828949 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.828926 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wt7t4"
Apr 22 16:21:47.829100 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.829066 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fsrq7\""
Apr 22 16:21:47.829717 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.829692 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 22 16:21:47.831315 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.831296 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 16:21:47.831630 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.831615 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 16:21:47.831945 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.831926 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 16:21:47.832063 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.832041 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pdvf2\""
Apr 22 16:21:47.837789 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.837765 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2fjmx"]
Apr 22 16:21:47.839423 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.838623 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wt7t4"]
Apr 22 16:21:47.954590 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.954548 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqlm7\" (UniqueName: \"kubernetes.io/projected/7d557b69-3d0f-445b-85b2-5427dc4c6f4a-kube-api-access-lqlm7\") pod \"cluster-proxy-proxy-agent-5b5746cdf4-q5j54\" (UID: \"7d557b69-3d0f-445b-85b2-5427dc4c6f4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54"
Apr 22 16:21:47.954590 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.954593 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9"
Apr 22 16:21:47.954866 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.954620 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-installation-pull-secrets\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9"
Apr 22 16:21:47.954866 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.954706 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7sbgp\" (UniqueName: \"kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-kube-api-access-7sbgp\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9"
Apr 22 16:21:47.954866 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.954758 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls\") pod \"dns-default-2fjmx\" (UID: \"b427d2da-7345-4266-8029-a5e4953ca8db\") " pod="openshift-dns/dns-default-2fjmx"
Apr 22 16:21:47.954866 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.954793 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/7d557b69-3d0f-445b-85b2-5427dc4c6f4a-hub\") pod \"cluster-proxy-proxy-agent-5b5746cdf4-q5j54\" (UID: \"7d557b69-3d0f-445b-85b2-5427dc4c6f4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54"
Apr 22 16:21:47.954866 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.954816 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7d557b69-3d0f-445b-85b2-5427dc4c6f4a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5b5746cdf4-q5j54\" (UID: \"7d557b69-3d0f-445b-85b2-5427dc4c6f4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54"
Apr 22 16:21:47.955153 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.954889 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/de2e09dd-d655-4750-a773-af55bcb94210-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-6mkdh\" (UID: \"de2e09dd-d655-4750-a773-af55bcb94210\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6mkdh"
Apr 22 16:21:47.955153 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.954938 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hkjrp\" (UniqueName: \"kubernetes.io/projected/ff0b5af2-329b-48fe-9749-12adef9c10ea-kube-api-access-hkjrp\") pod \"klusterlet-addon-workmgr-f66c9b7f6-rpfq2\" (UID: \"ff0b5af2-329b-48fe-9749-12adef9c10ea\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f66c9b7f6-rpfq2"
Apr 22 16:21:47.955153 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.955006 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-ca-trust-extracted\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9"
Apr 22 16:21:47.955153 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.955038 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert\") pod \"ingress-canary-wt7t4\" (UID: \"a17fc401-3295-45ad-8e4e-b5c7a99047fd\") " pod="openshift-ingress-canary/ingress-canary-wt7t4"
Apr 22 16:21:47.955153 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.955094 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-trusted-ca\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9"
Apr 22 16:21:47.955153 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.955123 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-certificates\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9"
Apr 22 16:21:47.955153 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.955149 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qfqw\" (UniqueName: \"kubernetes.io/projected/c9eaf512-b7e4-4868-82bc-2b3036304589-kube-api-access-7qfqw\") pod \"managed-serviceaccount-addon-agent-78697c9f4f-82zjx\" (UID: \"c9eaf512-b7e4-4868-82bc-2b3036304589\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78697c9f4f-82zjx"
Apr 22 16:21:47.955429 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.955190 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-bound-sa-token\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9"
Apr 22 16:21:47.955429 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.955216 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b427d2da-7345-4266-8029-a5e4953ca8db-config-volume\") pod \"dns-default-2fjmx\" (UID: \"b427d2da-7345-4266-8029-a5e4953ca8db\") " pod="openshift-dns/dns-default-2fjmx"
Apr 22 16:21:47.955429 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.955240 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/7d557b69-3d0f-445b-85b2-5427dc4c6f4a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5b5746cdf4-q5j54\" (UID: \"7d557b69-3d0f-445b-85b2-5427dc4c6f4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54"
Apr 22 16:21:47.955429 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.955282 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/7d557b69-3d0f-445b-85b2-5427dc4c6f4a-ca\") pod \"cluster-proxy-proxy-agent-5b5746cdf4-q5j54\" (UID: \"7d557b69-3d0f-445b-85b2-5427dc4c6f4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54"
Apr 22 16:21:47.955429 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.955330 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b427d2da-7345-4266-8029-a5e4953ca8db-tmp-dir\") pod \"dns-default-2fjmx\" (UID: \"b427d2da-7345-4266-8029-a5e4953ca8db\") " pod="openshift-dns/dns-default-2fjmx"
Apr 22 16:21:47.955429 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.955373 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pbsj\" (UniqueName: \"kubernetes.io/projected/a17fc401-3295-45ad-8e4e-b5c7a99047fd-kube-api-access-9pbsj\") pod \"ingress-canary-wt7t4\" (UID: \"a17fc401-3295-45ad-8e4e-b5c7a99047fd\") " pod="openshift-ingress-canary/ingress-canary-wt7t4"
Apr 22 16:21:47.955429 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.955394 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ff0b5af2-329b-48fe-9749-12adef9c10ea-klusterlet-config\") pod \"klusterlet-addon-workmgr-f66c9b7f6-rpfq2\" (UID: \"ff0b5af2-329b-48fe-9749-12adef9c10ea\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f66c9b7f6-rpfq2"
Apr 22 16:21:47.955429 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.955426 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-image-registry-private-configuration\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9"
Apr 22 16:21:47.955733 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.955455 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/7d557b69-3d0f-445b-85b2-5427dc4c6f4a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5b5746cdf4-q5j54\" (UID: \"7d557b69-3d0f-445b-85b2-5427dc4c6f4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54"
Apr 22 16:21:47.955733 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.955488 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6mkdh\" (UID: \"de2e09dd-d655-4750-a773-af55bcb94210\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6mkdh"
Apr 22 16:21:47.955733 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.955518 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c9eaf512-b7e4-4868-82bc-2b3036304589-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-78697c9f4f-82zjx\" (UID: \"c9eaf512-b7e4-4868-82bc-2b3036304589\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78697c9f4f-82zjx"
Apr 22 16:21:47.955733 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.955545 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff0b5af2-329b-48fe-9749-12adef9c10ea-tmp\") pod \"klusterlet-addon-workmgr-f66c9b7f6-rpfq2\" (UID: \"ff0b5af2-329b-48fe-9749-12adef9c10ea\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f66c9b7f6-rpfq2"
Apr 22 16:21:47.955733 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:47.955568 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7f5lc\" (UniqueName: \"kubernetes.io/projected/b427d2da-7345-4266-8029-a5e4953ca8db-kube-api-access-7f5lc\") pod \"dns-default-2fjmx\" (UID: \"b427d2da-7345-4266-8029-a5e4953ca8db\") " pod="openshift-dns/dns-default-2fjmx"
Apr 22 16:21:48.016013 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.015975 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l44cq"
Apr 22 16:21:48.018534 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.018505 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 22 16:21:48.018534 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.018514 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-phtvm\""
Apr 22 16:21:48.018998 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.018568 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 22 16:21:48.056713 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.056680 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9"
Apr 22 16:21:48.056713 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.056718 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-installation-pull-secrets\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9"
Apr 22 16:21:48.056992 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.056747 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7sbgp\" (UniqueName: \"kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-kube-api-access-7sbgp\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9"
Apr 22 16:21:48.056992 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:48.056833 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 16:21:48.056992 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:48.056863 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bcbd5754d-txtx9: secret "image-registry-tls" not found
Apr 22 16:21:48.056992 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.056933 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls\") pod \"dns-default-2fjmx\" (UID: \"b427d2da-7345-4266-8029-a5e4953ca8db\") " pod="openshift-dns/dns-default-2fjmx"
Apr 22 16:21:48.056992 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:48.056942 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls podName:c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3 nodeName:}" failed. No retries permitted until 2026-04-22 16:21:48.556922826 +0000 UTC m=+34.093653520 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls") pod "image-registry-7bcbd5754d-txtx9" (UID: "c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3") : secret "image-registry-tls" not found
Apr 22 16:21:48.057235 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.056993 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/7d557b69-3d0f-445b-85b2-5427dc4c6f4a-hub\") pod \"cluster-proxy-proxy-agent-5b5746cdf4-q5j54\" (UID: \"7d557b69-3d0f-445b-85b2-5427dc4c6f4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54"
Apr 22 16:21:48.057235 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.057023 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7d557b69-3d0f-445b-85b2-5427dc4c6f4a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5b5746cdf4-q5j54\" (UID: \"7d557b69-3d0f-445b-85b2-5427dc4c6f4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54"
Apr 22 16:21:48.057235 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.057050 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/de2e09dd-d655-4750-a773-af55bcb94210-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-6mkdh\" (UID: \"de2e09dd-d655-4750-a773-af55bcb94210\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6mkdh"
Apr 22 16:21:48.057235 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.057079 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hkjrp\" (UniqueName: \"kubernetes.io/projected/ff0b5af2-329b-48fe-9749-12adef9c10ea-kube-api-access-hkjrp\") pod \"klusterlet-addon-workmgr-f66c9b7f6-rpfq2\" (UID: \"ff0b5af2-329b-48fe-9749-12adef9c10ea\") "
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f66c9b7f6-rpfq2" Apr 22 16:21:48.057235 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.057108 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-ca-trust-extracted\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9" Apr 22 16:21:48.057235 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:48.057108 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 16:21:48.057235 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:48.057186 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls podName:b427d2da-7345-4266-8029-a5e4953ca8db nodeName:}" failed. No retries permitted until 2026-04-22 16:21:48.557171096 +0000 UTC m=+34.093901790 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls") pod "dns-default-2fjmx" (UID: "b427d2da-7345-4266-8029-a5e4953ca8db") : secret "dns-default-metrics-tls" not found Apr 22 16:21:48.057565 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.057337 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert\") pod \"ingress-canary-wt7t4\" (UID: \"a17fc401-3295-45ad-8e4e-b5c7a99047fd\") " pod="openshift-ingress-canary/ingress-canary-wt7t4" Apr 22 16:21:48.057565 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.057366 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-trusted-ca\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9" Apr 22 16:21:48.057565 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.057391 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-certificates\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9" Apr 22 16:21:48.057565 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.057409 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7qfqw\" (UniqueName: \"kubernetes.io/projected/c9eaf512-b7e4-4868-82bc-2b3036304589-kube-api-access-7qfqw\") pod \"managed-serviceaccount-addon-agent-78697c9f4f-82zjx\" (UID: \"c9eaf512-b7e4-4868-82bc-2b3036304589\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78697c9f4f-82zjx" 
Apr 22 16:21:48.057565 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.057436 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-bound-sa-token\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9" Apr 22 16:21:48.057565 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.057460 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b427d2da-7345-4266-8029-a5e4953ca8db-config-volume\") pod \"dns-default-2fjmx\" (UID: \"b427d2da-7345-4266-8029-a5e4953ca8db\") " pod="openshift-dns/dns-default-2fjmx" Apr 22 16:21:48.057565 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.057487 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/7d557b69-3d0f-445b-85b2-5427dc4c6f4a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5b5746cdf4-q5j54\" (UID: \"7d557b69-3d0f-445b-85b2-5427dc4c6f4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54" Apr 22 16:21:48.057565 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.057513 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/7d557b69-3d0f-445b-85b2-5427dc4c6f4a-ca\") pod \"cluster-proxy-proxy-agent-5b5746cdf4-q5j54\" (UID: \"7d557b69-3d0f-445b-85b2-5427dc4c6f4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54" Apr 22 16:21:48.057565 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.057515 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-ca-trust-extracted\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9" Apr 22 16:21:48.057958 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:48.057594 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 16:21:48.057958 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:48.057637 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert podName:a17fc401-3295-45ad-8e4e-b5c7a99047fd nodeName:}" failed. No retries permitted until 2026-04-22 16:21:48.557621881 +0000 UTC m=+34.094352577 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert") pod "ingress-canary-wt7t4" (UID: "a17fc401-3295-45ad-8e4e-b5c7a99047fd") : secret "canary-serving-cert" not found Apr 22 16:21:48.057958 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.057663 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b427d2da-7345-4266-8029-a5e4953ca8db-tmp-dir\") pod \"dns-default-2fjmx\" (UID: \"b427d2da-7345-4266-8029-a5e4953ca8db\") " pod="openshift-dns/dns-default-2fjmx" Apr 22 16:21:48.057958 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.057706 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9pbsj\" (UniqueName: \"kubernetes.io/projected/a17fc401-3295-45ad-8e4e-b5c7a99047fd-kube-api-access-9pbsj\") pod \"ingress-canary-wt7t4\" (UID: \"a17fc401-3295-45ad-8e4e-b5c7a99047fd\") " pod="openshift-ingress-canary/ingress-canary-wt7t4" Apr 22 16:21:48.057958 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.057735 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ff0b5af2-329b-48fe-9749-12adef9c10ea-klusterlet-config\") pod \"klusterlet-addon-workmgr-f66c9b7f6-rpfq2\" (UID: \"ff0b5af2-329b-48fe-9749-12adef9c10ea\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f66c9b7f6-rpfq2" Apr 22 16:21:48.057958 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.057767 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-image-registry-private-configuration\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9" Apr 22 16:21:48.057958 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.057801 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/de2e09dd-d655-4750-a773-af55bcb94210-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-6mkdh\" (UID: \"de2e09dd-d655-4750-a773-af55bcb94210\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6mkdh" Apr 22 16:21:48.058235 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.058065 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-certificates\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9" Apr 22 16:21:48.058235 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.058118 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/b427d2da-7345-4266-8029-a5e4953ca8db-tmp-dir\") pod \"dns-default-2fjmx\" (UID: 
\"b427d2da-7345-4266-8029-a5e4953ca8db\") " pod="openshift-dns/dns-default-2fjmx" Apr 22 16:21:48.058364 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.057805 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/7d557b69-3d0f-445b-85b2-5427dc4c6f4a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5b5746cdf4-q5j54\" (UID: \"7d557b69-3d0f-445b-85b2-5427dc4c6f4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54" Apr 22 16:21:48.058499 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.058364 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/7d557b69-3d0f-445b-85b2-5427dc4c6f4a-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-5b5746cdf4-q5j54\" (UID: \"7d557b69-3d0f-445b-85b2-5427dc4c6f4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54" Apr 22 16:21:48.059700 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.058342 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-trusted-ca\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9" Apr 22 16:21:48.059700 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.058786 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b427d2da-7345-4266-8029-a5e4953ca8db-config-volume\") pod \"dns-default-2fjmx\" (UID: \"b427d2da-7345-4266-8029-a5e4953ca8db\") " pod="openshift-dns/dns-default-2fjmx" Apr 22 16:21:48.059700 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.058817 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: 
\"kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6mkdh\" (UID: \"de2e09dd-d655-4750-a773-af55bcb94210\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6mkdh" Apr 22 16:21:48.059700 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.058867 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c9eaf512-b7e4-4868-82bc-2b3036304589-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-78697c9f4f-82zjx\" (UID: \"c9eaf512-b7e4-4868-82bc-2b3036304589\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78697c9f4f-82zjx" Apr 22 16:21:48.059700 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.058900 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff0b5af2-329b-48fe-9749-12adef9c10ea-tmp\") pod \"klusterlet-addon-workmgr-f66c9b7f6-rpfq2\" (UID: \"ff0b5af2-329b-48fe-9749-12adef9c10ea\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f66c9b7f6-rpfq2" Apr 22 16:21:48.059700 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.058929 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7f5lc\" (UniqueName: \"kubernetes.io/projected/b427d2da-7345-4266-8029-a5e4953ca8db-kube-api-access-7f5lc\") pod \"dns-default-2fjmx\" (UID: \"b427d2da-7345-4266-8029-a5e4953ca8db\") " pod="openshift-dns/dns-default-2fjmx" Apr 22 16:21:48.059700 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.058977 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lqlm7\" (UniqueName: \"kubernetes.io/projected/7d557b69-3d0f-445b-85b2-5427dc4c6f4a-kube-api-access-lqlm7\") pod \"cluster-proxy-proxy-agent-5b5746cdf4-q5j54\" (UID: \"7d557b69-3d0f-445b-85b2-5427dc4c6f4a\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54" Apr 22 16:21:48.059700 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:48.059238 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 16:21:48.059700 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:48.059454 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert podName:de2e09dd-d655-4750-a773-af55bcb94210 nodeName:}" failed. No retries permitted until 2026-04-22 16:21:48.559439065 +0000 UTC m=+34.096169754 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6mkdh" (UID: "de2e09dd-d655-4750-a773-af55bcb94210") : secret "networking-console-plugin-cert" not found Apr 22 16:21:48.059700 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.059486 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ff0b5af2-329b-48fe-9749-12adef9c10ea-tmp\") pod \"klusterlet-addon-workmgr-f66c9b7f6-rpfq2\" (UID: \"ff0b5af2-329b-48fe-9749-12adef9c10ea\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f66c9b7f6-rpfq2" Apr 22 16:21:48.062919 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.062871 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ff0b5af2-329b-48fe-9749-12adef9c10ea-klusterlet-config\") pod \"klusterlet-addon-workmgr-f66c9b7f6-rpfq2\" (UID: \"ff0b5af2-329b-48fe-9749-12adef9c10ea\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f66c9b7f6-rpfq2" Apr 22 16:21:48.063131 ip-10-0-137-144 kubenswrapper[2578]: 
I0422 16:21:48.063105 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/7d557b69-3d0f-445b-85b2-5427dc4c6f4a-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-5b5746cdf4-q5j54\" (UID: \"7d557b69-3d0f-445b-85b2-5427dc4c6f4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54" Apr 22 16:21:48.063429 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.063402 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/7d557b69-3d0f-445b-85b2-5427dc4c6f4a-ca\") pod \"cluster-proxy-proxy-agent-5b5746cdf4-q5j54\" (UID: \"7d557b69-3d0f-445b-85b2-5427dc4c6f4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54" Apr 22 16:21:48.063535 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.063441 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-installation-pull-secrets\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9" Apr 22 16:21:48.063535 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.063479 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-image-registry-private-configuration\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9" Apr 22 16:21:48.064483 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.064458 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/7d557b69-3d0f-445b-85b2-5427dc4c6f4a-hub\") pod 
\"cluster-proxy-proxy-agent-5b5746cdf4-q5j54\" (UID: \"7d557b69-3d0f-445b-85b2-5427dc4c6f4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54" Apr 22 16:21:48.065835 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.065807 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/7d557b69-3d0f-445b-85b2-5427dc4c6f4a-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-5b5746cdf4-q5j54\" (UID: \"7d557b69-3d0f-445b-85b2-5427dc4c6f4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54" Apr 22 16:21:48.066274 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.066227 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hkjrp\" (UniqueName: \"kubernetes.io/projected/ff0b5af2-329b-48fe-9749-12adef9c10ea-kube-api-access-hkjrp\") pod \"klusterlet-addon-workmgr-f66c9b7f6-rpfq2\" (UID: \"ff0b5af2-329b-48fe-9749-12adef9c10ea\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f66c9b7f6-rpfq2" Apr 22 16:21:48.066548 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.066522 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7sbgp\" (UniqueName: \"kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-kube-api-access-7sbgp\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9" Apr 22 16:21:48.066948 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.066929 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-bound-sa-token\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9" Apr 22 16:21:48.067489 
ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.067466 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pbsj\" (UniqueName: \"kubernetes.io/projected/a17fc401-3295-45ad-8e4e-b5c7a99047fd-kube-api-access-9pbsj\") pod \"ingress-canary-wt7t4\" (UID: \"a17fc401-3295-45ad-8e4e-b5c7a99047fd\") " pod="openshift-ingress-canary/ingress-canary-wt7t4" Apr 22 16:21:48.068392 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.068371 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7f5lc\" (UniqueName: \"kubernetes.io/projected/b427d2da-7345-4266-8029-a5e4953ca8db-kube-api-access-7f5lc\") pod \"dns-default-2fjmx\" (UID: \"b427d2da-7345-4266-8029-a5e4953ca8db\") " pod="openshift-dns/dns-default-2fjmx" Apr 22 16:21:48.068515 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.068481 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qfqw\" (UniqueName: \"kubernetes.io/projected/c9eaf512-b7e4-4868-82bc-2b3036304589-kube-api-access-7qfqw\") pod \"managed-serviceaccount-addon-agent-78697c9f4f-82zjx\" (UID: \"c9eaf512-b7e4-4868-82bc-2b3036304589\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78697c9f4f-82zjx" Apr 22 16:21:48.068709 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.068690 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c9eaf512-b7e4-4868-82bc-2b3036304589-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-78697c9f4f-82zjx\" (UID: \"c9eaf512-b7e4-4868-82bc-2b3036304589\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78697c9f4f-82zjx" Apr 22 16:21:48.068787 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.068768 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqlm7\" (UniqueName: 
\"kubernetes.io/projected/7d557b69-3d0f-445b-85b2-5427dc4c6f4a-kube-api-access-lqlm7\") pod \"cluster-proxy-proxy-agent-5b5746cdf4-q5j54\" (UID: \"7d557b69-3d0f-445b-85b2-5427dc4c6f4a\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54" Apr 22 16:21:48.131394 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.131354 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54" Apr 22 16:21:48.150150 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.150117 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78697c9f4f-82zjx" Apr 22 16:21:48.176859 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.176380 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f66c9b7f6-rpfq2" Apr 22 16:21:48.307922 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.307872 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78697c9f4f-82zjx"] Apr 22 16:21:48.311860 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.311808 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54"] Apr 22 16:21:48.313795 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:48.313763 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc9eaf512_b7e4_4868_82bc_2b3036304589.slice/crio-1584879d41911a2efc7400fb1c948827f845202649e21c61354eb01ca0d882ed WatchSource:0}: Error finding container 1584879d41911a2efc7400fb1c948827f845202649e21c61354eb01ca0d882ed: Status 404 returned error can't find the container with id 
1584879d41911a2efc7400fb1c948827f845202649e21c61354eb01ca0d882ed Apr 22 16:21:48.315987 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:48.315823 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7d557b69_3d0f_445b_85b2_5427dc4c6f4a.slice/crio-d75c1fdbb12c573c88d67b21f3b8e7fc87a4a565d8e6695fdcee23b0e864c3cb WatchSource:0}: Error finding container d75c1fdbb12c573c88d67b21f3b8e7fc87a4a565d8e6695fdcee23b0e864c3cb: Status 404 returned error can't find the container with id d75c1fdbb12c573c88d67b21f3b8e7fc87a4a565d8e6695fdcee23b0e864c3cb Apr 22 16:21:48.329175 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.329126 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f66c9b7f6-rpfq2"] Apr 22 16:21:48.334431 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:48.334395 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff0b5af2_329b_48fe_9749_12adef9c10ea.slice/crio-e7e520ab651a96af7e3339a678cede1ab7c37a9841f60e0b191a10a6bfdd872d WatchSource:0}: Error finding container e7e520ab651a96af7e3339a678cede1ab7c37a9841f60e0b191a10a6bfdd872d: Status 404 returned error can't find the container with id e7e520ab651a96af7e3339a678cede1ab7c37a9841f60e0b191a10a6bfdd872d Apr 22 16:21:48.564394 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.564308 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6mkdh\" (UID: \"de2e09dd-d655-4750-a773-af55bcb94210\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6mkdh" Apr 22 16:21:48.564394 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.564372 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9" Apr 22 16:21:48.564608 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.564416 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls\") pod \"dns-default-2fjmx\" (UID: \"b427d2da-7345-4266-8029-a5e4953ca8db\") " pod="openshift-dns/dns-default-2fjmx" Apr 22 16:21:48.564608 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.564459 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert\") pod \"ingress-canary-wt7t4\" (UID: \"a17fc401-3295-45ad-8e4e-b5c7a99047fd\") " pod="openshift-ingress-canary/ingress-canary-wt7t4" Apr 22 16:21:48.564608 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:48.564470 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 16:21:48.564608 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:48.564547 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert podName:de2e09dd-d655-4750-a773-af55bcb94210 nodeName:}" failed. No retries permitted until 2026-04-22 16:21:49.564526947 +0000 UTC m=+35.101257657 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6mkdh" (UID: "de2e09dd-d655-4750-a773-af55bcb94210") : secret "networking-console-plugin-cert" not found
Apr 22 16:21:48.564608 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:48.564548 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 16:21:48.564608 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:48.564585 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls podName:b427d2da-7345-4266-8029-a5e4953ca8db nodeName:}" failed. No retries permitted until 2026-04-22 16:21:49.564573427 +0000 UTC m=+35.101304117 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls") pod "dns-default-2fjmx" (UID: "b427d2da-7345-4266-8029-a5e4953ca8db") : secret "dns-default-metrics-tls" not found
Apr 22 16:21:48.564608 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:48.564548 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 16:21:48.564608 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:48.564614 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert podName:a17fc401-3295-45ad-8e4e-b5c7a99047fd nodeName:}" failed. No retries permitted until 2026-04-22 16:21:49.564606531 +0000 UTC m=+35.101337223 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert") pod "ingress-canary-wt7t4" (UID: "a17fc401-3295-45ad-8e4e-b5c7a99047fd") : secret "canary-serving-cert" not found
Apr 22 16:21:48.565002 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:48.564633 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 16:21:48.565002 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:48.564653 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bcbd5754d-txtx9: secret "image-registry-tls" not found
Apr 22 16:21:48.565002 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:48.564735 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls podName:c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3 nodeName:}" failed. No retries permitted until 2026-04-22 16:21:49.564691704 +0000 UTC m=+35.101422401 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls") pod "image-registry-7bcbd5754d-txtx9" (UID: "c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3") : secret "image-registry-tls" not found
Apr 22 16:21:48.665946 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.665903 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs\") pod \"network-metrics-daemon-zzttm\" (UID: \"01d73bcf-a30e-4dfb-ab2d-863123f999c7\") " pod="openshift-multus/network-metrics-daemon-zzttm"
Apr 22 16:21:48.666121 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.665974 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-99cfl\" (UniqueName: \"kubernetes.io/projected/cc090485-3533-4014-9e56-2a28800c3d78-kube-api-access-99cfl\") pod \"network-check-target-l44cq\" (UID: \"cc090485-3533-4014-9e56-2a28800c3d78\") " pod="openshift-network-diagnostics/network-check-target-l44cq"
Apr 22 16:21:48.666121 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:48.666072 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 16:21:48.666214 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:48.666150 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs podName:01d73bcf-a30e-4dfb-ab2d-863123f999c7 nodeName:}" failed. No retries permitted until 2026-04-22 16:22:20.666130719 +0000 UTC m=+66.202861412 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs") pod "network-metrics-daemon-zzttm" (UID: "01d73bcf-a30e-4dfb-ab2d-863123f999c7") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 16:21:48.669349 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.669326 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-99cfl\" (UniqueName: \"kubernetes.io/projected/cc090485-3533-4014-9e56-2a28800c3d78-kube-api-access-99cfl\") pod \"network-check-target-l44cq\" (UID: \"cc090485-3533-4014-9e56-2a28800c3d78\") " pod="openshift-network-diagnostics/network-check-target-l44cq"
Apr 22 16:21:48.927224 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:48.927181 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-l44cq"
Apr 22 16:21:49.016293 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:49.016261 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vzf4h"
Apr 22 16:21:49.016508 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:49.016488 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzttm"
Apr 22 16:21:49.018916 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:49.018891 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 22 16:21:49.019297 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:49.018948 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 22 16:21:49.019864 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:49.019825 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dv4l7\""
Apr 22 16:21:49.144007 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:49.143968 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78697c9f4f-82zjx" event={"ID":"c9eaf512-b7e4-4868-82bc-2b3036304589","Type":"ContainerStarted","Data":"1584879d41911a2efc7400fb1c948827f845202649e21c61354eb01ca0d882ed"}
Apr 22 16:21:49.144918 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:49.144893 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54" event={"ID":"7d557b69-3d0f-445b-85b2-5427dc4c6f4a","Type":"ContainerStarted","Data":"d75c1fdbb12c573c88d67b21f3b8e7fc87a4a565d8e6695fdcee23b0e864c3cb"}
Apr 22 16:21:49.145855 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:49.145825 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f66c9b7f6-rpfq2" event={"ID":"ff0b5af2-329b-48fe-9749-12adef9c10ea","Type":"ContainerStarted","Data":"e7e520ab651a96af7e3339a678cede1ab7c37a9841f60e0b191a10a6bfdd872d"}
Apr 22 16:21:49.575215 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:49.575144 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert\") pod \"ingress-canary-wt7t4\" (UID: \"a17fc401-3295-45ad-8e4e-b5c7a99047fd\") " pod="openshift-ingress-canary/ingress-canary-wt7t4"
Apr 22 16:21:49.575408 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:49.575249 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6mkdh\" (UID: \"de2e09dd-d655-4750-a773-af55bcb94210\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6mkdh"
Apr 22 16:21:49.575408 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:49.575295 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9"
Apr 22 16:21:49.575408 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:49.575311 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 16:21:49.575408 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:49.575335 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls\") pod \"dns-default-2fjmx\" (UID: \"b427d2da-7345-4266-8029-a5e4953ca8db\") " pod="openshift-dns/dns-default-2fjmx"
Apr 22 16:21:49.575408 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:49.575384 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert podName:a17fc401-3295-45ad-8e4e-b5c7a99047fd nodeName:}" failed. No retries permitted until 2026-04-22 16:21:51.575365831 +0000 UTC m=+37.112096660 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert") pod "ingress-canary-wt7t4" (UID: "a17fc401-3295-45ad-8e4e-b5c7a99047fd") : secret "canary-serving-cert" not found
Apr 22 16:21:49.578731 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:49.575940 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 16:21:49.578731 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:49.576032 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert podName:de2e09dd-d655-4750-a773-af55bcb94210 nodeName:}" failed. No retries permitted until 2026-04-22 16:21:51.576017912 +0000 UTC m=+37.112748602 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6mkdh" (UID: "de2e09dd-d655-4750-a773-af55bcb94210") : secret "networking-console-plugin-cert" not found
Apr 22 16:21:49.578731 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:49.576049 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 16:21:49.578731 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:49.576070 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bcbd5754d-txtx9: secret "image-registry-tls" not found
Apr 22 16:21:49.578731 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:49.576120 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 16:21:49.578731 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:49.576140 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls podName:c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3 nodeName:}" failed. No retries permitted until 2026-04-22 16:21:51.576122773 +0000 UTC m=+37.112853467 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls") pod "image-registry-7bcbd5754d-txtx9" (UID: "c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3") : secret "image-registry-tls" not found
Apr 22 16:21:49.578731 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:49.576164 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls podName:b427d2da-7345-4266-8029-a5e4953ca8db nodeName:}" failed. No retries permitted until 2026-04-22 16:21:51.576153075 +0000 UTC m=+37.112883768 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls") pod "dns-default-2fjmx" (UID: "b427d2da-7345-4266-8029-a5e4953ca8db") : secret "dns-default-metrics-tls" not found
Apr 22 16:21:50.899365 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:50.899190 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-l44cq"]
Apr 22 16:21:50.908189 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:50.908144 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc090485_3533_4014_9e56_2a28800c3d78.slice/crio-6c54564936ede54b058853825c8fa15e78620642430c0e9ff79851eefb181a42 WatchSource:0}: Error finding container 6c54564936ede54b058853825c8fa15e78620642430c0e9ff79851eefb181a42: Status 404 returned error can't find the container with id 6c54564936ede54b058853825c8fa15e78620642430c0e9ff79851eefb181a42
Apr 22 16:21:51.151306 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:51.151275 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-l44cq" event={"ID":"cc090485-3533-4014-9e56-2a28800c3d78","Type":"ContainerStarted","Data":"6c54564936ede54b058853825c8fa15e78620642430c0e9ff79851eefb181a42"}
Apr 22 16:21:51.595590 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:51.595552 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert\") pod \"ingress-canary-wt7t4\" (UID: \"a17fc401-3295-45ad-8e4e-b5c7a99047fd\") " pod="openshift-ingress-canary/ingress-canary-wt7t4"
Apr 22 16:21:51.595857 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:51.595820 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 16:21:51.595934 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:51.595922 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert podName:a17fc401-3295-45ad-8e4e-b5c7a99047fd nodeName:}" failed. No retries permitted until 2026-04-22 16:21:55.595900846 +0000 UTC m=+41.132631540 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert") pod "ingress-canary-wt7t4" (UID: "a17fc401-3295-45ad-8e4e-b5c7a99047fd") : secret "canary-serving-cert" not found
Apr 22 16:21:51.596383 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:51.596357 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6mkdh\" (UID: \"de2e09dd-d655-4750-a773-af55bcb94210\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6mkdh"
Apr 22 16:21:51.596455 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:51.596427 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9"
Apr 22 16:21:51.596520 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:51.596476 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls\") pod \"dns-default-2fjmx\" (UID: \"b427d2da-7345-4266-8029-a5e4953ca8db\") " pod="openshift-dns/dns-default-2fjmx"
Apr 22 16:21:51.596520 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:51.596510 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 16:21:51.596628 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:51.596561 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert podName:de2e09dd-d655-4750-a773-af55bcb94210 nodeName:}" failed. No retries permitted until 2026-04-22 16:21:55.596546053 +0000 UTC m=+41.133276760 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6mkdh" (UID: "de2e09dd-d655-4750-a773-af55bcb94210") : secret "networking-console-plugin-cert" not found
Apr 22 16:21:51.596699 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:51.596642 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 16:21:51.596699 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:51.596654 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bcbd5754d-txtx9: secret "image-registry-tls" not found
Apr 22 16:21:51.596699 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:51.596690 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls podName:c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3 nodeName:}" failed. No retries permitted until 2026-04-22 16:21:55.596677885 +0000 UTC m=+41.133408578 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls") pod "image-registry-7bcbd5754d-txtx9" (UID: "c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3") : secret "image-registry-tls" not found
Apr 22 16:21:51.596699 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:51.596691 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 16:21:51.596974 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:51.596729 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls podName:b427d2da-7345-4266-8029-a5e4953ca8db nodeName:}" failed. No retries permitted until 2026-04-22 16:21:55.596718178 +0000 UTC m=+41.133448869 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls") pod "dns-default-2fjmx" (UID: "b427d2da-7345-4266-8029-a5e4953ca8db") : secret "dns-default-metrics-tls" not found
Apr 22 16:21:52.158435 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:52.157831 2578 generic.go:358] "Generic (PLEG): container finished" podID="308b982f-cb30-4a62-92f1-e88ca12b210e" containerID="c129302cb9cecb47c83cdb9fa16ca6ed38e7bc556531ddfebe3afeaba582262c" exitCode=0
Apr 22 16:21:52.158435 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:52.157933 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s5rnp" event={"ID":"308b982f-cb30-4a62-92f1-e88ca12b210e","Type":"ContainerDied","Data":"c129302cb9cecb47c83cdb9fa16ca6ed38e7bc556531ddfebe3afeaba582262c"}
Apr 22 16:21:52.709577 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:52.709488 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05520239-9b93-4ae7-abd6-fd7042ec092f-original-pull-secret\") pod \"global-pull-secret-syncer-vzf4h\" (UID: \"05520239-9b93-4ae7-abd6-fd7042ec092f\") " pod="kube-system/global-pull-secret-syncer-vzf4h"
Apr 22 16:21:52.714945 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:52.714911 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/05520239-9b93-4ae7-abd6-fd7042ec092f-original-pull-secret\") pod \"global-pull-secret-syncer-vzf4h\" (UID: \"05520239-9b93-4ae7-abd6-fd7042ec092f\") " pod="kube-system/global-pull-secret-syncer-vzf4h"
Apr 22 16:21:52.929909 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:52.929869 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-vzf4h"
Apr 22 16:21:53.165206 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:53.165103 2578 generic.go:358] "Generic (PLEG): container finished" podID="308b982f-cb30-4a62-92f1-e88ca12b210e" containerID="059779510a5f69795703b91da1713135e0fe5d358d4fe331860477b343647325" exitCode=0
Apr 22 16:21:53.165206 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:53.165158 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s5rnp" event={"ID":"308b982f-cb30-4a62-92f1-e88ca12b210e","Type":"ContainerDied","Data":"059779510a5f69795703b91da1713135e0fe5d358d4fe331860477b343647325"}
Apr 22 16:21:55.637003 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:55.636966 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6mkdh\" (UID: \"de2e09dd-d655-4750-a773-af55bcb94210\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6mkdh"
Apr 22 16:21:55.637513 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:55.637016 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9"
Apr 22 16:21:55.637513 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:55.637048 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls\") pod \"dns-default-2fjmx\" (UID: \"b427d2da-7345-4266-8029-a5e4953ca8db\") " pod="openshift-dns/dns-default-2fjmx"
Apr 22 16:21:55.637513 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:55.637073 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert\") pod \"ingress-canary-wt7t4\" (UID: \"a17fc401-3295-45ad-8e4e-b5c7a99047fd\") " pod="openshift-ingress-canary/ingress-canary-wt7t4"
Apr 22 16:21:55.637513 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:55.637120 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 16:21:55.637513 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:55.637175 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 16:21:55.637513 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:55.637193 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert podName:de2e09dd-d655-4750-a773-af55bcb94210 nodeName:}" failed. No retries permitted until 2026-04-22 16:22:03.637173945 +0000 UTC m=+49.173904635 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6mkdh" (UID: "de2e09dd-d655-4750-a773-af55bcb94210") : secret "networking-console-plugin-cert" not found
Apr 22 16:21:55.637513 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:55.637194 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bcbd5754d-txtx9: secret "image-registry-tls" not found
Apr 22 16:21:55.637513 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:55.637223 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls podName:c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3 nodeName:}" failed. No retries permitted until 2026-04-22 16:22:03.637216186 +0000 UTC m=+49.173946877 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls") pod "image-registry-7bcbd5754d-txtx9" (UID: "c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3") : secret "image-registry-tls" not found
Apr 22 16:21:55.637513 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:55.637182 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 16:21:55.637513 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:55.637245 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls podName:b427d2da-7345-4266-8029-a5e4953ca8db nodeName:}" failed. No retries permitted until 2026-04-22 16:22:03.637239576 +0000 UTC m=+49.173970265 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls") pod "dns-default-2fjmx" (UID: "b427d2da-7345-4266-8029-a5e4953ca8db") : secret "dns-default-metrics-tls" not found
Apr 22 16:21:55.637513 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:55.637179 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 16:21:55.637513 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:21:55.637268 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert podName:a17fc401-3295-45ad-8e4e-b5c7a99047fd nodeName:}" failed. No retries permitted until 2026-04-22 16:22:03.637263876 +0000 UTC m=+49.173994565 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert") pod "ingress-canary-wt7t4" (UID: "a17fc401-3295-45ad-8e4e-b5c7a99047fd") : secret "canary-serving-cert" not found
Apr 22 16:21:58.603216 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:58.602496 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-vzf4h"]
Apr 22 16:21:58.608470 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:21:58.608442 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod05520239_9b93_4ae7_abd6_fd7042ec092f.slice/crio-60caee145b561f7f7a22bacd80bbbfd4ef8c35a429e0c2527af1d68eb45e68be WatchSource:0}: Error finding container 60caee145b561f7f7a22bacd80bbbfd4ef8c35a429e0c2527af1d68eb45e68be: Status 404 returned error can't find the container with id 60caee145b561f7f7a22bacd80bbbfd4ef8c35a429e0c2527af1d68eb45e68be
Apr 22 16:21:59.177727 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:59.177665 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54" event={"ID":"7d557b69-3d0f-445b-85b2-5427dc4c6f4a","Type":"ContainerStarted","Data":"6efae753ad1e938e622f1cd6c7feca9a2f5d7b0126e45d10368f1868d154f5c0"}
Apr 22 16:21:59.181071 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:59.181025 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-s5rnp" event={"ID":"308b982f-cb30-4a62-92f1-e88ca12b210e","Type":"ContainerStarted","Data":"7a50f62d2113cd4b6cebaa1874db89c4911db403893967bd601ee51fda30ffe6"}
Apr 22 16:21:59.182701 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:59.182544 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f66c9b7f6-rpfq2" event={"ID":"ff0b5af2-329b-48fe-9749-12adef9c10ea","Type":"ContainerStarted","Data":"04c0e1df73658ea1c4dbf9d02fc1c1f517db90eba5190523ad86b29ee0f1d052"}
Apr 22 16:21:59.182863 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:59.182795 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f66c9b7f6-rpfq2"
Apr 22 16:21:59.184007 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:59.183978 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78697c9f4f-82zjx" event={"ID":"c9eaf512-b7e4-4868-82bc-2b3036304589","Type":"ContainerStarted","Data":"38aaeb8053d526f25b355aad2a118dadde16b4f42bff5c0b95ccf5ece0a37f07"}
Apr 22 16:21:59.184623 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:59.184603 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f66c9b7f6-rpfq2"
Apr 22 16:21:59.185613 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:59.185590 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-l44cq" event={"ID":"cc090485-3533-4014-9e56-2a28800c3d78","Type":"ContainerStarted","Data":"48332cf07a0fc59468ab72410fc31fd4901d7a6a1645a646f8bc230d892feede"}
Apr 22 16:21:59.185731 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:59.185717 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-l44cq"
Apr 22 16:21:59.186672 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:59.186646 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vzf4h" event={"ID":"05520239-9b93-4ae7-abd6-fd7042ec092f","Type":"ContainerStarted","Data":"60caee145b561f7f7a22bacd80bbbfd4ef8c35a429e0c2527af1d68eb45e68be"}
Apr 22 16:21:59.204134 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:59.204088 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-s5rnp" podStartSLOduration=10.946220863 podStartE2EDuration="44.204076265s" podCreationTimestamp="2026-04-22 16:21:15 +0000 UTC" firstStartedPulling="2026-04-22 16:21:17.889670611 +0000 UTC m=+3.426401309" lastFinishedPulling="2026-04-22 16:21:51.147526007 +0000 UTC m=+36.684256711" observedRunningTime="2026-04-22 16:21:59.203633114 +0000 UTC m=+44.740363829" watchObservedRunningTime="2026-04-22 16:21:59.204076265 +0000 UTC m=+44.740806974"
Apr 22 16:21:59.219158 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:59.219111 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f66c9b7f6-rpfq2" podStartSLOduration=17.075013374 podStartE2EDuration="27.219097773s" podCreationTimestamp="2026-04-22 16:21:32 +0000 UTC" firstStartedPulling="2026-04-22 16:21:48.336340158 +0000 UTC m=+33.873070848" lastFinishedPulling="2026-04-22 16:21:58.480424554 +0000 UTC m=+44.017155247" observedRunningTime="2026-04-22 16:21:59.218054201 +0000 UTC m=+44.754784912" watchObservedRunningTime="2026-04-22 16:21:59.219097773 +0000 UTC m=+44.755828503"
Apr 22 16:21:59.232727 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:21:59.232666 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-l44cq" podStartSLOduration=36.677544686 podStartE2EDuration="44.232648531s" podCreationTimestamp="2026-04-22 16:21:15 +0000 UTC" firstStartedPulling="2026-04-22 16:21:50.91004703 +0000 UTC m=+36.446777724" lastFinishedPulling="2026-04-22 16:21:58.465150873 +0000 UTC m=+44.001881569" observedRunningTime="2026-04-22 16:21:59.232424364 +0000 UTC m=+44.769155076" watchObservedRunningTime="2026-04-22 16:21:59.232648531 +0000 UTC m=+44.769379244"
Apr 22 16:22:02.196863 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:22:02.196811 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54" event={"ID":"7d557b69-3d0f-445b-85b2-5427dc4c6f4a","Type":"ContainerStarted","Data":"48c807eabfe8292315be70294247491e2503b37abed56d60c94ddb529680bdc3"}
Apr 22 16:22:02.197334 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:22:02.196874 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54" event={"ID":"7d557b69-3d0f-445b-85b2-5427dc4c6f4a","Type":"ContainerStarted","Data":"ba4efccfa66e3578c42a2dd661ebc0e4e228bcdea633760a1ea7266cf79f7b7e"}
Apr 22 16:22:02.215512 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:22:02.215458 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54" podStartSLOduration=17.156681035 podStartE2EDuration="30.215442608s" podCreationTimestamp="2026-04-22 16:21:32 +0000 UTC" firstStartedPulling="2026-04-22 16:21:48.317910506 +0000 UTC m=+33.854641199" lastFinishedPulling="2026-04-22 16:22:01.376672078 +0000 UTC m=+46.913402772" observedRunningTime="2026-04-22 16:22:02.213932305 +0000 UTC m=+47.750663031" watchObservedRunningTime="2026-04-22 16:22:02.215442608 +0000 UTC m=+47.752173319"
Apr 22 16:22:02.215776 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:22:02.215750 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78697c9f4f-82zjx" podStartSLOduration=20.069503704 podStartE2EDuration="30.215739576s" podCreationTimestamp="2026-04-22 16:21:32 +0000 UTC" firstStartedPulling="2026-04-22 16:21:48.316169511 +0000 UTC m=+33.852900200" lastFinishedPulling="2026-04-22 16:21:58.462405376 +0000 UTC m=+43.999136072" observedRunningTime="2026-04-22 16:21:59.246207465 +0000 UTC m=+44.782938178" watchObservedRunningTime="2026-04-22 16:22:02.215739576 +0000 UTC m=+47.752470287"
Apr 22 16:22:03.709166 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:22:03.709123 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6mkdh\" (UID: \"de2e09dd-d655-4750-a773-af55bcb94210\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6mkdh"
Apr 22 16:22:03.709624 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:22:03.709185 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9"
Apr 22 16:22:03.709624 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:03.709356 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 22 16:22:03.709624 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:03.709439 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert podName:de2e09dd-d655-4750-a773-af55bcb94210 nodeName:}" failed. No retries permitted until 2026-04-22 16:22:19.709414251 +0000 UTC m=+65.246145040 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6mkdh" (UID: "de2e09dd-d655-4750-a773-af55bcb94210") : secret "networking-console-plugin-cert" not found
Apr 22 16:22:03.709928 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:22:03.709904 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls\") pod \"dns-default-2fjmx\" (UID: \"b427d2da-7345-4266-8029-a5e4953ca8db\") " pod="openshift-dns/dns-default-2fjmx"
Apr 22 16:22:03.710213 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:03.709955 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 16:22:03.710303 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:03.710267 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls podName:b427d2da-7345-4266-8029-a5e4953ca8db nodeName:}" failed. No retries permitted until 2026-04-22 16:22:19.710247659 +0000 UTC m=+65.246978351 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls") pod "dns-default-2fjmx" (UID: "b427d2da-7345-4266-8029-a5e4953ca8db") : secret "dns-default-metrics-tls" not found
Apr 22 16:22:03.710303 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:22:03.710187 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert\") pod \"ingress-canary-wt7t4\" (UID: \"a17fc401-3295-45ad-8e4e-b5c7a99047fd\") " pod="openshift-ingress-canary/ingress-canary-wt7t4"
Apr 22 16:22:03.710421 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:03.709995 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 16:22:03.710421 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:03.710347 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bcbd5754d-txtx9: secret "image-registry-tls" not found
Apr 22 16:22:03.710421 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:03.710378 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls podName:c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3 nodeName:}" failed. No retries permitted until 2026-04-22 16:22:19.710367582 +0000 UTC m=+65.247098277 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls") pod "image-registry-7bcbd5754d-txtx9" (UID: "c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3") : secret "image-registry-tls" not found Apr 22 16:22:03.710651 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:03.710636 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 16:22:03.710766 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:03.710755 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert podName:a17fc401-3295-45ad-8e4e-b5c7a99047fd nodeName:}" failed. No retries permitted until 2026-04-22 16:22:19.710741191 +0000 UTC m=+65.247471881 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert") pod "ingress-canary-wt7t4" (UID: "a17fc401-3295-45ad-8e4e-b5c7a99047fd") : secret "canary-serving-cert" not found Apr 22 16:22:04.202405 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:22:04.202368 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-vzf4h" event={"ID":"05520239-9b93-4ae7-abd6-fd7042ec092f","Type":"ContainerStarted","Data":"46d217a5a7dff76ea12203407bd3cd32d7b77f09ba794708e9290525980413d1"} Apr 22 16:22:04.217722 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:22:04.217671 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-vzf4h" podStartSLOduration=23.315438942 podStartE2EDuration="28.217657338s" podCreationTimestamp="2026-04-22 16:21:36 +0000 UTC" firstStartedPulling="2026-04-22 16:21:58.610875353 +0000 UTC m=+44.147606066" lastFinishedPulling="2026-04-22 16:22:03.513093767 +0000 UTC m=+49.049824462" observedRunningTime="2026-04-22 16:22:04.216972948 +0000 UTC 
m=+49.753703660" watchObservedRunningTime="2026-04-22 16:22:04.217657338 +0000 UTC m=+49.754388049" Apr 22 16:22:14.142830 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:22:14.142800 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-glprs" Apr 22 16:22:19.747781 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:22:19.747743 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6mkdh\" (UID: \"de2e09dd-d655-4750-a773-af55bcb94210\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6mkdh" Apr 22 16:22:19.747781 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:22:19.747786 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9" Apr 22 16:22:19.748302 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:22:19.747813 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls\") pod \"dns-default-2fjmx\" (UID: \"b427d2da-7345-4266-8029-a5e4953ca8db\") " pod="openshift-dns/dns-default-2fjmx" Apr 22 16:22:19.748302 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:19.747909 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 16:22:19.748302 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:19.747913 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not 
found Apr 22 16:22:19.748302 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:19.747933 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bcbd5754d-txtx9: secret "image-registry-tls" not found Apr 22 16:22:19.748302 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:22:19.747933 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert\") pod \"ingress-canary-wt7t4\" (UID: \"a17fc401-3295-45ad-8e4e-b5c7a99047fd\") " pod="openshift-ingress-canary/ingress-canary-wt7t4" Apr 22 16:22:19.748302 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:19.747955 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls podName:b427d2da-7345-4266-8029-a5e4953ca8db nodeName:}" failed. No retries permitted until 2026-04-22 16:22:51.747941667 +0000 UTC m=+97.284672357 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls") pod "dns-default-2fjmx" (UID: "b427d2da-7345-4266-8029-a5e4953ca8db") : secret "dns-default-metrics-tls" not found Apr 22 16:22:19.748302 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:19.747969 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls podName:c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3 nodeName:}" failed. No retries permitted until 2026-04-22 16:22:51.747963266 +0000 UTC m=+97.284693956 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls") pod "image-registry-7bcbd5754d-txtx9" (UID: "c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3") : secret "image-registry-tls" not found Apr 22 16:22:19.748302 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:19.748022 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 16:22:19.748302 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:19.748034 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 16:22:19.748302 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:19.748074 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert podName:a17fc401-3295-45ad-8e4e-b5c7a99047fd nodeName:}" failed. No retries permitted until 2026-04-22 16:22:51.74805853 +0000 UTC m=+97.284789238 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert") pod "ingress-canary-wt7t4" (UID: "a17fc401-3295-45ad-8e4e-b5c7a99047fd") : secret "canary-serving-cert" not found Apr 22 16:22:19.748302 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:19.748095 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert podName:de2e09dd-d655-4750-a773-af55bcb94210 nodeName:}" failed. No retries permitted until 2026-04-22 16:22:51.748084581 +0000 UTC m=+97.284815277 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6mkdh" (UID: "de2e09dd-d655-4750-a773-af55bcb94210") : secret "networking-console-plugin-cert" not found Apr 22 16:22:20.755189 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:22:20.755153 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs\") pod \"network-metrics-daemon-zzttm\" (UID: \"01d73bcf-a30e-4dfb-ab2d-863123f999c7\") " pod="openshift-multus/network-metrics-daemon-zzttm" Apr 22 16:22:20.757640 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:22:20.757623 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 16:22:20.765389 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:20.765369 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 16:22:20.765457 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:20.765434 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs podName:01d73bcf-a30e-4dfb-ab2d-863123f999c7 nodeName:}" failed. No retries permitted until 2026-04-22 16:23:24.765412782 +0000 UTC m=+130.302143473 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs") pod "network-metrics-daemon-zzttm" (UID: "01d73bcf-a30e-4dfb-ab2d-863123f999c7") : secret "metrics-daemon-secret" not found Apr 22 16:22:30.191641 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:22:30.191604 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-l44cq" Apr 22 16:22:51.792678 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:22:51.792516 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert\") pod \"ingress-canary-wt7t4\" (UID: \"a17fc401-3295-45ad-8e4e-b5c7a99047fd\") " pod="openshift-ingress-canary/ingress-canary-wt7t4" Apr 22 16:22:51.792678 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:22:51.792583 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6mkdh\" (UID: \"de2e09dd-d655-4750-a773-af55bcb94210\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6mkdh" Apr 22 16:22:51.792678 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:22:51.792613 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9" Apr 22 16:22:51.792678 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:22:51.792645 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls\") pod \"dns-default-2fjmx\" (UID: \"b427d2da-7345-4266-8029-a5e4953ca8db\") " pod="openshift-dns/dns-default-2fjmx" Apr 22 16:22:51.793402 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:51.792682 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 16:22:51.793402 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:51.792729 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 16:22:51.793402 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:51.792730 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 16:22:51.793402 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:51.792761 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 16:22:51.793402 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:51.792792 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bcbd5754d-txtx9: secret "image-registry-tls" not found Apr 22 16:22:51.793402 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:51.792766 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert podName:a17fc401-3295-45ad-8e4e-b5c7a99047fd nodeName:}" failed. No retries permitted until 2026-04-22 16:23:55.792743896 +0000 UTC m=+161.329474588 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert") pod "ingress-canary-wt7t4" (UID: "a17fc401-3295-45ad-8e4e-b5c7a99047fd") : secret "canary-serving-cert" not found Apr 22 16:22:51.793402 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:51.792822 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls podName:b427d2da-7345-4266-8029-a5e4953ca8db nodeName:}" failed. No retries permitted until 2026-04-22 16:23:55.792811338 +0000 UTC m=+161.329542029 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls") pod "dns-default-2fjmx" (UID: "b427d2da-7345-4266-8029-a5e4953ca8db") : secret "dns-default-metrics-tls" not found Apr 22 16:22:51.793402 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:51.792832 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert podName:de2e09dd-d655-4750-a773-af55bcb94210 nodeName:}" failed. No retries permitted until 2026-04-22 16:23:55.792826262 +0000 UTC m=+161.329556951 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6mkdh" (UID: "de2e09dd-d655-4750-a773-af55bcb94210") : secret "networking-console-plugin-cert" not found Apr 22 16:22:51.793402 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:22:51.792893 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls podName:c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3 nodeName:}" failed. 
No retries permitted until 2026-04-22 16:23:55.792836455 +0000 UTC m=+161.329567145 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls") pod "image-registry-7bcbd5754d-txtx9" (UID: "c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3") : secret "image-registry-tls" not found Apr 22 16:23:24.857121 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:23:24.857068 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs\") pod \"network-metrics-daemon-zzttm\" (UID: \"01d73bcf-a30e-4dfb-ab2d-863123f999c7\") " pod="openshift-multus/network-metrics-daemon-zzttm" Apr 22 16:23:24.857622 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:23:24.857214 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 16:23:24.857622 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:23:24.857282 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs podName:01d73bcf-a30e-4dfb-ab2d-863123f999c7 nodeName:}" failed. No retries permitted until 2026-04-22 16:25:26.857264499 +0000 UTC m=+252.393995189 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs") pod "network-metrics-daemon-zzttm" (UID: "01d73bcf-a30e-4dfb-ab2d-863123f999c7") : secret "metrics-daemon-secret" not found Apr 22 16:23:45.937825 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:23:45.937795 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-dl86z_b918f41f-7884-40cb-ac36-9c716d27d92f/dns-node-resolver/0.log" Apr 22 16:23:46.938113 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:23:46.938083 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xzk9d_3e6d4c51-d843-4eca-9406-7639d52380a0/node-ca/0.log" Apr 22 16:23:50.840902 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:23:50.840827 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6mkdh" podUID="de2e09dd-d655-4750-a773-af55bcb94210" Apr 22 16:23:50.856099 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:23:50.856070 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9" podUID="c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3" Apr 22 16:23:50.882447 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:23:50.882412 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-2fjmx" podUID="b427d2da-7345-4266-8029-a5e4953ca8db" Apr 22 16:23:50.889546 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:23:50.889523 2578 pod_workers.go:1301] "Error syncing pod, skipping" 
err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-wt7t4" podUID="a17fc401-3295-45ad-8e4e-b5c7a99047fd" Apr 22 16:23:51.448325 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:23:51.448290 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9" Apr 22 16:23:51.448502 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:23:51.448337 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wt7t4" Apr 22 16:23:51.448556 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:23:51.448540 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6mkdh" Apr 22 16:23:51.448690 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:23:51.448679 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-2fjmx" Apr 22 16:23:52.034600 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:23:52.034563 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-zzttm" podUID="01d73bcf-a30e-4dfb-ab2d-863123f999c7" Apr 22 16:23:55.888138 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:23:55.888083 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls\") pod \"image-registry-7bcbd5754d-txtx9\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9" Apr 22 16:23:55.888138 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:23:55.888147 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls\") pod \"dns-default-2fjmx\" (UID: \"b427d2da-7345-4266-8029-a5e4953ca8db\") " pod="openshift-dns/dns-default-2fjmx" Apr 22 16:23:55.888833 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:23:55.888171 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert\") pod \"ingress-canary-wt7t4\" (UID: \"a17fc401-3295-45ad-8e4e-b5c7a99047fd\") " pod="openshift-ingress-canary/ingress-canary-wt7t4" Apr 22 16:23:55.888833 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:23:55.888204 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6mkdh\" (UID: 
\"de2e09dd-d655-4750-a773-af55bcb94210\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6mkdh" Apr 22 16:23:55.888833 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:23:55.888255 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 16:23:55.888833 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:23:55.888281 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-7bcbd5754d-txtx9: secret "image-registry-tls" not found Apr 22 16:23:55.888833 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:23:55.888295 2578 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 22 16:23:55.888833 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:23:55.888306 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 16:23:55.888833 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:23:55.888296 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 16:23:55.888833 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:23:55.888347 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls podName:c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3 nodeName:}" failed. No retries permitted until 2026-04-22 16:25:57.888325669 +0000 UTC m=+283.425056390 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls") pod "image-registry-7bcbd5754d-txtx9" (UID: "c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3") : secret "image-registry-tls" not found Apr 22 16:23:55.888833 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:23:55.888383 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert podName:de2e09dd-d655-4750-a773-af55bcb94210 nodeName:}" failed. No retries permitted until 2026-04-22 16:25:57.88837139 +0000 UTC m=+283.425102079 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-6mkdh" (UID: "de2e09dd-d655-4750-a773-af55bcb94210") : secret "networking-console-plugin-cert" not found Apr 22 16:23:55.888833 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:23:55.888393 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert podName:a17fc401-3295-45ad-8e4e-b5c7a99047fd nodeName:}" failed. No retries permitted until 2026-04-22 16:25:57.888387276 +0000 UTC m=+283.425117966 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert") pod "ingress-canary-wt7t4" (UID: "a17fc401-3295-45ad-8e4e-b5c7a99047fd") : secret "canary-serving-cert" not found Apr 22 16:23:55.888833 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:23:55.888406 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls podName:b427d2da-7345-4266-8029-a5e4953ca8db nodeName:}" failed. 
No retries permitted until 2026-04-22 16:25:57.888397814 +0000 UTC m=+283.425128503 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls") pod "dns-default-2fjmx" (UID: "b427d2da-7345-4266-8029-a5e4953ca8db") : secret "dns-default-metrics-tls" not found Apr 22 16:23:59.183432 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:23:59.183362 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f66c9b7f6-rpfq2" podUID="ff0b5af2-329b-48fe-9749-12adef9c10ea" containerName="acm-agent" probeResult="failure" output="Get \"http://10.132.0.10:8000/readyz\": dial tcp 10.132.0.10:8000: connect: connection refused" Apr 22 16:23:59.467292 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:23:59.467206 2578 generic.go:358] "Generic (PLEG): container finished" podID="ff0b5af2-329b-48fe-9749-12adef9c10ea" containerID="04c0e1df73658ea1c4dbf9d02fc1c1f517db90eba5190523ad86b29ee0f1d052" exitCode=1 Apr 22 16:23:59.467292 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:23:59.467266 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f66c9b7f6-rpfq2" event={"ID":"ff0b5af2-329b-48fe-9749-12adef9c10ea","Type":"ContainerDied","Data":"04c0e1df73658ea1c4dbf9d02fc1c1f517db90eba5190523ad86b29ee0f1d052"} Apr 22 16:23:59.467641 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:23:59.467620 2578 scope.go:117] "RemoveContainer" containerID="04c0e1df73658ea1c4dbf9d02fc1c1f517db90eba5190523ad86b29ee0f1d052" Apr 22 16:23:59.468577 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:23:59.468551 2578 generic.go:358] "Generic (PLEG): container finished" podID="c9eaf512-b7e4-4868-82bc-2b3036304589" containerID="38aaeb8053d526f25b355aad2a118dadde16b4f42bff5c0b95ccf5ece0a37f07" exitCode=255 Apr 22 16:23:59.468653 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:23:59.468626 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78697c9f4f-82zjx" event={"ID":"c9eaf512-b7e4-4868-82bc-2b3036304589","Type":"ContainerDied","Data":"38aaeb8053d526f25b355aad2a118dadde16b4f42bff5c0b95ccf5ece0a37f07"} Apr 22 16:23:59.468985 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:23:59.468949 2578 scope.go:117] "RemoveContainer" containerID="38aaeb8053d526f25b355aad2a118dadde16b4f42bff5c0b95ccf5ece0a37f07" Apr 22 16:24:00.472497 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:00.472460 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f66c9b7f6-rpfq2" event={"ID":"ff0b5af2-329b-48fe-9749-12adef9c10ea","Type":"ContainerStarted","Data":"ab15983f4abd3af321361d895546a17600cdbca1060d9540044f8399b43ef3c1"} Apr 22 16:24:00.472921 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:00.472759 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f66c9b7f6-rpfq2" Apr 22 16:24:00.473441 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:00.473420 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f66c9b7f6-rpfq2" Apr 22 16:24:00.474190 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:00.474172 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-78697c9f4f-82zjx" event={"ID":"c9eaf512-b7e4-4868-82bc-2b3036304589","Type":"ContainerStarted","Data":"7b7e6b9700a5e0dab93a3700066c10e1a067a1d7b957d7cd6006410af54aa0d9"} Apr 22 16:24:03.016366 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:03.016328 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzttm" Apr 22 16:24:16.056004 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:16.055965 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-hjb78"] Apr 22 16:24:16.059207 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:16.059183 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hjb78" Apr 22 16:24:16.061687 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:16.061664 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 22 16:24:16.063305 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:16.063287 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 22 16:24:16.063534 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:16.063515 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-llksk\"" Apr 22 16:24:16.063588 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:16.063535 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 22 16:24:16.063654 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:16.063636 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 22 16:24:16.075929 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:16.075900 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hjb78"] Apr 22 16:24:16.146122 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:16.146084 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/a657956e-d988-4229-8de7-4484bebd1818-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hjb78\" (UID: \"a657956e-d988-4229-8de7-4484bebd1818\") " pod="openshift-insights/insights-runtime-extractor-hjb78" Apr 22 16:24:16.146122 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:16.146121 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vgtb\" (UniqueName: \"kubernetes.io/projected/a657956e-d988-4229-8de7-4484bebd1818-kube-api-access-4vgtb\") pod \"insights-runtime-extractor-hjb78\" (UID: \"a657956e-d988-4229-8de7-4484bebd1818\") " pod="openshift-insights/insights-runtime-extractor-hjb78" Apr 22 16:24:16.146349 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:16.146146 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a657956e-d988-4229-8de7-4484bebd1818-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hjb78\" (UID: \"a657956e-d988-4229-8de7-4484bebd1818\") " pod="openshift-insights/insights-runtime-extractor-hjb78" Apr 22 16:24:16.146349 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:16.146203 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a657956e-d988-4229-8de7-4484bebd1818-data-volume\") pod \"insights-runtime-extractor-hjb78\" (UID: \"a657956e-d988-4229-8de7-4484bebd1818\") " pod="openshift-insights/insights-runtime-extractor-hjb78" Apr 22 16:24:16.146349 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:16.146241 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a657956e-d988-4229-8de7-4484bebd1818-crio-socket\") pod \"insights-runtime-extractor-hjb78\" (UID: \"a657956e-d988-4229-8de7-4484bebd1818\") " 
pod="openshift-insights/insights-runtime-extractor-hjb78" Apr 22 16:24:16.247528 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:16.247490 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a657956e-d988-4229-8de7-4484bebd1818-data-volume\") pod \"insights-runtime-extractor-hjb78\" (UID: \"a657956e-d988-4229-8de7-4484bebd1818\") " pod="openshift-insights/insights-runtime-extractor-hjb78" Apr 22 16:24:16.247699 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:16.247544 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a657956e-d988-4229-8de7-4484bebd1818-crio-socket\") pod \"insights-runtime-extractor-hjb78\" (UID: \"a657956e-d988-4229-8de7-4484bebd1818\") " pod="openshift-insights/insights-runtime-extractor-hjb78" Apr 22 16:24:16.247699 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:16.247625 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a657956e-d988-4229-8de7-4484bebd1818-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hjb78\" (UID: \"a657956e-d988-4229-8de7-4484bebd1818\") " pod="openshift-insights/insights-runtime-extractor-hjb78" Apr 22 16:24:16.247699 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:16.247641 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vgtb\" (UniqueName: \"kubernetes.io/projected/a657956e-d988-4229-8de7-4484bebd1818-kube-api-access-4vgtb\") pod \"insights-runtime-extractor-hjb78\" (UID: \"a657956e-d988-4229-8de7-4484bebd1818\") " pod="openshift-insights/insights-runtime-extractor-hjb78" Apr 22 16:24:16.247699 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:16.247662 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/a657956e-d988-4229-8de7-4484bebd1818-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hjb78\" (UID: \"a657956e-d988-4229-8de7-4484bebd1818\") " pod="openshift-insights/insights-runtime-extractor-hjb78" Apr 22 16:24:16.247699 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:16.247678 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a657956e-d988-4229-8de7-4484bebd1818-crio-socket\") pod \"insights-runtime-extractor-hjb78\" (UID: \"a657956e-d988-4229-8de7-4484bebd1818\") " pod="openshift-insights/insights-runtime-extractor-hjb78" Apr 22 16:24:16.247912 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:16.247888 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a657956e-d988-4229-8de7-4484bebd1818-data-volume\") pod \"insights-runtime-extractor-hjb78\" (UID: \"a657956e-d988-4229-8de7-4484bebd1818\") " pod="openshift-insights/insights-runtime-extractor-hjb78" Apr 22 16:24:16.248150 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:16.248132 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a657956e-d988-4229-8de7-4484bebd1818-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-hjb78\" (UID: \"a657956e-d988-4229-8de7-4484bebd1818\") " pod="openshift-insights/insights-runtime-extractor-hjb78" Apr 22 16:24:16.249998 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:16.249980 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a657956e-d988-4229-8de7-4484bebd1818-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-hjb78\" (UID: \"a657956e-d988-4229-8de7-4484bebd1818\") " pod="openshift-insights/insights-runtime-extractor-hjb78" Apr 22 16:24:16.266860 ip-10-0-137-144 kubenswrapper[2578]: 
I0422 16:24:16.266829 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vgtb\" (UniqueName: \"kubernetes.io/projected/a657956e-d988-4229-8de7-4484bebd1818-kube-api-access-4vgtb\") pod \"insights-runtime-extractor-hjb78\" (UID: \"a657956e-d988-4229-8de7-4484bebd1818\") " pod="openshift-insights/insights-runtime-extractor-hjb78" Apr 22 16:24:16.368375 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:16.368344 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-hjb78" Apr 22 16:24:16.482794 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:16.482762 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-hjb78"] Apr 22 16:24:16.485786 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:24:16.485744 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda657956e_d988_4229_8de7_4484bebd1818.slice/crio-a51b079144e048af6bd91d2104e414e9fdf9e5521e03e2f272db56ef57853e8c WatchSource:0}: Error finding container a51b079144e048af6bd91d2104e414e9fdf9e5521e03e2f272db56ef57853e8c: Status 404 returned error can't find the container with id a51b079144e048af6bd91d2104e414e9fdf9e5521e03e2f272db56ef57853e8c Apr 22 16:24:16.514402 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:16.514373 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hjb78" event={"ID":"a657956e-d988-4229-8de7-4484bebd1818","Type":"ContainerStarted","Data":"a51b079144e048af6bd91d2104e414e9fdf9e5521e03e2f272db56ef57853e8c"} Apr 22 16:24:17.518326 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:17.518289 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hjb78" 
event={"ID":"a657956e-d988-4229-8de7-4484bebd1818","Type":"ContainerStarted","Data":"2cc49d30f19f43998e9cb36bfe6b5e0f584f7bef59f6e1b8fcb1eff14cc0ade9"} Apr 22 16:24:17.518326 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:17.518328 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hjb78" event={"ID":"a657956e-d988-4229-8de7-4484bebd1818","Type":"ContainerStarted","Data":"ac117a5f7eeaf2fa02a5ca4fe886da8acfd3fe2ab699690a24e4b4e7d31862f6"} Apr 22 16:24:19.525122 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:19.525028 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-hjb78" event={"ID":"a657956e-d988-4229-8de7-4484bebd1818","Type":"ContainerStarted","Data":"22ca1011f4fb382140b2ae5c904e4161205724bfa18c129f8e2dbc7fb6bece68"} Apr 22 16:24:19.543461 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:19.543404 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-hjb78" podStartSLOduration=0.9900837 podStartE2EDuration="3.543390235s" podCreationTimestamp="2026-04-22 16:24:16 +0000 UTC" firstStartedPulling="2026-04-22 16:24:16.548596903 +0000 UTC m=+182.085327597" lastFinishedPulling="2026-04-22 16:24:19.101903442 +0000 UTC m=+184.638634132" observedRunningTime="2026-04-22 16:24:19.542222244 +0000 UTC m=+185.078952966" watchObservedRunningTime="2026-04-22 16:24:19.543390235 +0000 UTC m=+185.080120947" Apr 22 16:24:25.493280 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.493244 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-c6xcj"] Apr 22 16:24:25.497325 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.497301 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.501179 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.501141 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 16:24:25.501287 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.501152 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 16:24:25.501368 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.501349 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 16:24:25.501496 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.501481 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 16:24:25.502154 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.502127 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-q77pt\"" Apr 22 16:24:25.502244 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.502187 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 16:24:25.502244 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.502191 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 16:24:25.628926 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.628888 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cfd67519-226c-41ad-a582-51e6b68d30cc-node-exporter-accelerators-collector-config\") pod \"node-exporter-c6xcj\" (UID: 
\"cfd67519-226c-41ad-a582-51e6b68d30cc\") " pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.628926 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.628934 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cfd67519-226c-41ad-a582-51e6b68d30cc-metrics-client-ca\") pod \"node-exporter-c6xcj\" (UID: \"cfd67519-226c-41ad-a582-51e6b68d30cc\") " pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.629156 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.628967 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7q4v\" (UniqueName: \"kubernetes.io/projected/cfd67519-226c-41ad-a582-51e6b68d30cc-kube-api-access-w7q4v\") pod \"node-exporter-c6xcj\" (UID: \"cfd67519-226c-41ad-a582-51e6b68d30cc\") " pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.629156 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.629020 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cfd67519-226c-41ad-a582-51e6b68d30cc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-c6xcj\" (UID: \"cfd67519-226c-41ad-a582-51e6b68d30cc\") " pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.629156 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.629095 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cfd67519-226c-41ad-a582-51e6b68d30cc-node-exporter-wtmp\") pod \"node-exporter-c6xcj\" (UID: \"cfd67519-226c-41ad-a582-51e6b68d30cc\") " pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.629156 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.629129 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cfd67519-226c-41ad-a582-51e6b68d30cc-root\") pod \"node-exporter-c6xcj\" (UID: \"cfd67519-226c-41ad-a582-51e6b68d30cc\") " pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.629289 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.629158 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cfd67519-226c-41ad-a582-51e6b68d30cc-sys\") pod \"node-exporter-c6xcj\" (UID: \"cfd67519-226c-41ad-a582-51e6b68d30cc\") " pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.629289 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.629194 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cfd67519-226c-41ad-a582-51e6b68d30cc-node-exporter-tls\") pod \"node-exporter-c6xcj\" (UID: \"cfd67519-226c-41ad-a582-51e6b68d30cc\") " pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.629289 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.629234 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cfd67519-226c-41ad-a582-51e6b68d30cc-node-exporter-textfile\") pod \"node-exporter-c6xcj\" (UID: \"cfd67519-226c-41ad-a582-51e6b68d30cc\") " pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.730319 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.730276 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cfd67519-226c-41ad-a582-51e6b68d30cc-node-exporter-tls\") pod \"node-exporter-c6xcj\" (UID: \"cfd67519-226c-41ad-a582-51e6b68d30cc\") " pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.730492 ip-10-0-137-144 
kubenswrapper[2578]: I0422 16:24:25.730333 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cfd67519-226c-41ad-a582-51e6b68d30cc-node-exporter-textfile\") pod \"node-exporter-c6xcj\" (UID: \"cfd67519-226c-41ad-a582-51e6b68d30cc\") " pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.730492 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.730364 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cfd67519-226c-41ad-a582-51e6b68d30cc-node-exporter-accelerators-collector-config\") pod \"node-exporter-c6xcj\" (UID: \"cfd67519-226c-41ad-a582-51e6b68d30cc\") " pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.730492 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.730385 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cfd67519-226c-41ad-a582-51e6b68d30cc-metrics-client-ca\") pod \"node-exporter-c6xcj\" (UID: \"cfd67519-226c-41ad-a582-51e6b68d30cc\") " pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.730492 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.730410 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7q4v\" (UniqueName: \"kubernetes.io/projected/cfd67519-226c-41ad-a582-51e6b68d30cc-kube-api-access-w7q4v\") pod \"node-exporter-c6xcj\" (UID: \"cfd67519-226c-41ad-a582-51e6b68d30cc\") " pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.730492 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.730436 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cfd67519-226c-41ad-a582-51e6b68d30cc-node-exporter-kube-rbac-proxy-config\") pod 
\"node-exporter-c6xcj\" (UID: \"cfd67519-226c-41ad-a582-51e6b68d30cc\") " pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.730492 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.730470 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cfd67519-226c-41ad-a582-51e6b68d30cc-node-exporter-wtmp\") pod \"node-exporter-c6xcj\" (UID: \"cfd67519-226c-41ad-a582-51e6b68d30cc\") " pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.730769 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.730505 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cfd67519-226c-41ad-a582-51e6b68d30cc-root\") pod \"node-exporter-c6xcj\" (UID: \"cfd67519-226c-41ad-a582-51e6b68d30cc\") " pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.730769 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.730536 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cfd67519-226c-41ad-a582-51e6b68d30cc-sys\") pod \"node-exporter-c6xcj\" (UID: \"cfd67519-226c-41ad-a582-51e6b68d30cc\") " pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.730769 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.730591 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cfd67519-226c-41ad-a582-51e6b68d30cc-sys\") pod \"node-exporter-c6xcj\" (UID: \"cfd67519-226c-41ad-a582-51e6b68d30cc\") " pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.730769 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.730634 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/cfd67519-226c-41ad-a582-51e6b68d30cc-root\") pod \"node-exporter-c6xcj\" (UID: \"cfd67519-226c-41ad-a582-51e6b68d30cc\") " 
pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.730769 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.730661 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/cfd67519-226c-41ad-a582-51e6b68d30cc-node-exporter-wtmp\") pod \"node-exporter-c6xcj\" (UID: \"cfd67519-226c-41ad-a582-51e6b68d30cc\") " pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.731140 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.730785 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/cfd67519-226c-41ad-a582-51e6b68d30cc-node-exporter-textfile\") pod \"node-exporter-c6xcj\" (UID: \"cfd67519-226c-41ad-a582-51e6b68d30cc\") " pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.731140 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.731078 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/cfd67519-226c-41ad-a582-51e6b68d30cc-metrics-client-ca\") pod \"node-exporter-c6xcj\" (UID: \"cfd67519-226c-41ad-a582-51e6b68d30cc\") " pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.731210 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.731149 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/cfd67519-226c-41ad-a582-51e6b68d30cc-node-exporter-accelerators-collector-config\") pod \"node-exporter-c6xcj\" (UID: \"cfd67519-226c-41ad-a582-51e6b68d30cc\") " pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.732733 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.732710 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/cfd67519-226c-41ad-a582-51e6b68d30cc-node-exporter-tls\") pod 
\"node-exporter-c6xcj\" (UID: \"cfd67519-226c-41ad-a582-51e6b68d30cc\") " pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.733015 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.732997 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/cfd67519-226c-41ad-a582-51e6b68d30cc-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-c6xcj\" (UID: \"cfd67519-226c-41ad-a582-51e6b68d30cc\") " pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.738742 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.738721 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7q4v\" (UniqueName: \"kubernetes.io/projected/cfd67519-226c-41ad-a582-51e6b68d30cc-kube-api-access-w7q4v\") pod \"node-exporter-c6xcj\" (UID: \"cfd67519-226c-41ad-a582-51e6b68d30cc\") " pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.806596 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:25.806526 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-c6xcj" Apr 22 16:24:25.814649 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:24:25.814612 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfd67519_226c_41ad_a582_51e6b68d30cc.slice/crio-7051c6a6e3d375995fd8adefea3eb20b054bdaf130f8b30b0da93321191acc60 WatchSource:0}: Error finding container 7051c6a6e3d375995fd8adefea3eb20b054bdaf130f8b30b0da93321191acc60: Status 404 returned error can't find the container with id 7051c6a6e3d375995fd8adefea3eb20b054bdaf130f8b30b0da93321191acc60 Apr 22 16:24:26.543446 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:26.543406 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-c6xcj" event={"ID":"cfd67519-226c-41ad-a582-51e6b68d30cc","Type":"ContainerStarted","Data":"7051c6a6e3d375995fd8adefea3eb20b054bdaf130f8b30b0da93321191acc60"} Apr 22 16:24:27.546815 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:27.546782 2578 generic.go:358] "Generic (PLEG): container finished" podID="cfd67519-226c-41ad-a582-51e6b68d30cc" containerID="60a3e06bd67e2677b053b34de060b9291e687d05632bf02d6f249e115200aef1" exitCode=0 Apr 22 16:24:27.547275 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:27.546830 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-c6xcj" event={"ID":"cfd67519-226c-41ad-a582-51e6b68d30cc","Type":"ContainerDied","Data":"60a3e06bd67e2677b053b34de060b9291e687d05632bf02d6f249e115200aef1"} Apr 22 16:24:28.551454 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:28.551418 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-c6xcj" event={"ID":"cfd67519-226c-41ad-a582-51e6b68d30cc","Type":"ContainerStarted","Data":"c365c3b3e76cc78fca5ddcaa37f58eb990dc3e30f0999fd7bf0018d03c854965"} Apr 22 16:24:28.551454 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:28.551454 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-c6xcj" event={"ID":"cfd67519-226c-41ad-a582-51e6b68d30cc","Type":"ContainerStarted","Data":"7d5cdb8e4b168fd36e3aef43ff6d732833352c3b4b0290e0dff1298c882d7015"} Apr 22 16:24:28.571505 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:28.571448 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-c6xcj" podStartSLOduration=2.711602665 podStartE2EDuration="3.571433338s" podCreationTimestamp="2026-04-22 16:24:25 +0000 UTC" firstStartedPulling="2026-04-22 16:24:25.816405765 +0000 UTC m=+191.353136454" lastFinishedPulling="2026-04-22 16:24:26.676236438 +0000 UTC m=+192.212967127" observedRunningTime="2026-04-22 16:24:28.569608444 +0000 UTC m=+194.106339167" watchObservedRunningTime="2026-04-22 16:24:28.571433338 +0000 UTC m=+194.108164049" Apr 22 16:24:38.353275 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:38.353241 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7bcbd5754d-txtx9"] Apr 22 16:24:38.353660 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:24:38.353439 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9" podUID="c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3" Apr 22 16:24:38.574241 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:38.574211 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9" Apr 22 16:24:38.578223 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:38.578205 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9" Apr 22 16:24:38.636027 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:38.635957 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-trusted-ca\") pod \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " Apr 22 16:24:38.636027 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:38.635995 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-installation-pull-secrets\") pod \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " Apr 22 16:24:38.636027 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:38.636022 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-ca-trust-extracted\") pod \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " Apr 22 16:24:38.636208 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:38.636057 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7sbgp\" (UniqueName: \"kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-kube-api-access-7sbgp\") pod \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " Apr 22 16:24:38.636208 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:38.636085 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-bound-sa-token\") pod \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " Apr 22 
16:24:38.636208 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:38.636134 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-image-registry-private-configuration\") pod \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " Apr 22 16:24:38.636208 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:38.636175 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-certificates\") pod \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\" (UID: \"c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3\") " Apr 22 16:24:38.636410 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:38.636341 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3" (UID: "c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 16:24:38.636467 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:38.636436 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3" (UID: "c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 16:24:38.636608 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:38.636587 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-trusted-ca\") on node \"ip-10-0-137-144.ec2.internal\" DevicePath \"\"" Apr 22 16:24:38.636608 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:38.636609 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-ca-trust-extracted\") on node \"ip-10-0-137-144.ec2.internal\" DevicePath \"\"" Apr 22 16:24:38.636783 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:38.636631 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3" (UID: "c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 16:24:38.638399 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:38.638365 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3" (UID: "c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3"). InnerVolumeSpecName "image-registry-private-configuration". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:24:38.638487 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:38.638427 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3" (UID: "c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:24:38.638487 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:38.638467 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3" (UID: "c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:24:38.638559 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:38.638514 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-kube-api-access-7sbgp" (OuterVolumeSpecName: "kube-api-access-7sbgp") pod "c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3" (UID: "c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3"). InnerVolumeSpecName "kube-api-access-7sbgp". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:24:38.737161 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:38.737113 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-installation-pull-secrets\") on node \"ip-10-0-137-144.ec2.internal\" DevicePath \"\"" Apr 22 16:24:38.737161 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:38.737155 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7sbgp\" (UniqueName: \"kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-kube-api-access-7sbgp\") on node \"ip-10-0-137-144.ec2.internal\" DevicePath \"\"" Apr 22 16:24:38.737161 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:38.737166 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-bound-sa-token\") on node \"ip-10-0-137-144.ec2.internal\" DevicePath \"\"" Apr 22 16:24:38.737161 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:38.737176 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-image-registry-private-configuration\") on node \"ip-10-0-137-144.ec2.internal\" DevicePath \"\"" Apr 22 16:24:38.737426 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:38.737187 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-certificates\") on node \"ip-10-0-137-144.ec2.internal\" DevicePath \"\"" Apr 22 16:24:39.578807 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:39.578778 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-7bcbd5754d-txtx9" Apr 22 16:24:39.614448 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:39.614412 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-7bcbd5754d-txtx9"] Apr 22 16:24:39.618472 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:39.618441 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-7bcbd5754d-txtx9"] Apr 22 16:24:39.643803 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:39.643762 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3-registry-tls\") on node \"ip-10-0-137-144.ec2.internal\" DevicePath \"\"" Apr 22 16:24:41.019592 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:41.019556 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3" path="/var/lib/kubelet/pods/c2c4a4e0-66e7-4ea8-9d46-3bffd1411cd3/volumes" Apr 22 16:24:48.133096 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:48.133053 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54" podUID="7d557b69-3d0f-445b-85b2-5427dc4c6f4a" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 16:24:51.914541 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:51.914510 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-dl86z_b918f41f-7884-40cb-ac36-9c716d27d92f/dns-node-resolver/0.log" Apr 22 16:24:58.132823 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:24:58.132784 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54" podUID="7d557b69-3d0f-445b-85b2-5427dc4c6f4a" containerName="service-proxy" probeResult="failure" 
output="HTTP probe failed with statuscode: 500" Apr 22 16:25:08.132994 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:08.132944 2578 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54" podUID="7d557b69-3d0f-445b-85b2-5427dc4c6f4a" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 22 16:25:08.133462 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:08.133033 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54" Apr 22 16:25:08.133507 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:08.133480 2578 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"48c807eabfe8292315be70294247491e2503b37abed56d60c94ddb529680bdc3"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 22 16:25:08.133555 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:08.133539 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54" podUID="7d557b69-3d0f-445b-85b2-5427dc4c6f4a" containerName="service-proxy" containerID="cri-o://48c807eabfe8292315be70294247491e2503b37abed56d60c94ddb529680bdc3" gracePeriod=30 Apr 22 16:25:08.648631 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:08.648599 2578 generic.go:358] "Generic (PLEG): container finished" podID="7d557b69-3d0f-445b-85b2-5427dc4c6f4a" containerID="48c807eabfe8292315be70294247491e2503b37abed56d60c94ddb529680bdc3" exitCode=2 Apr 22 16:25:08.648801 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:08.648655 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54" event={"ID":"7d557b69-3d0f-445b-85b2-5427dc4c6f4a","Type":"ContainerDied","Data":"48c807eabfe8292315be70294247491e2503b37abed56d60c94ddb529680bdc3"} Apr 22 16:25:08.648801 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:08.648680 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-5b5746cdf4-q5j54" event={"ID":"7d557b69-3d0f-445b-85b2-5427dc4c6f4a","Type":"ContainerStarted","Data":"8325ca570a82dc44268da182dd39b2e707a250fc17a1607510cd13f88f5a4779"} Apr 22 16:25:26.920052 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:26.920009 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs\") pod \"network-metrics-daemon-zzttm\" (UID: \"01d73bcf-a30e-4dfb-ab2d-863123f999c7\") " pod="openshift-multus/network-metrics-daemon-zzttm" Apr 22 16:25:26.922257 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:26.922240 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/01d73bcf-a30e-4dfb-ab2d-863123f999c7-metrics-certs\") pod \"network-metrics-daemon-zzttm\" (UID: \"01d73bcf-a30e-4dfb-ab2d-863123f999c7\") " pod="openshift-multus/network-metrics-daemon-zzttm" Apr 22 16:25:27.019831 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:27.019799 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-dv4l7\"" Apr 22 16:25:27.027990 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:27.027970 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-zzttm" Apr 22 16:25:27.145356 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:27.145327 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-zzttm"] Apr 22 16:25:27.148258 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:25:27.148227 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01d73bcf_a30e_4dfb_ab2d_863123f999c7.slice/crio-8c86b98d93a8fefbe18b103abb3a85fae3f0bd0c9950eb21f164df52d5b87889 WatchSource:0}: Error finding container 8c86b98d93a8fefbe18b103abb3a85fae3f0bd0c9950eb21f164df52d5b87889: Status 404 returned error can't find the container with id 8c86b98d93a8fefbe18b103abb3a85fae3f0bd0c9950eb21f164df52d5b87889 Apr 22 16:25:27.697093 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:27.697056 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zzttm" event={"ID":"01d73bcf-a30e-4dfb-ab2d-863123f999c7","Type":"ContainerStarted","Data":"8c86b98d93a8fefbe18b103abb3a85fae3f0bd0c9950eb21f164df52d5b87889"} Apr 22 16:25:28.701060 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:28.701025 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zzttm" event={"ID":"01d73bcf-a30e-4dfb-ab2d-863123f999c7","Type":"ContainerStarted","Data":"1aa64c23f70208e069a3c1d54ae248f1c966657528c29da4363228cfbfea0138"} Apr 22 16:25:28.701060 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:28.701061 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-zzttm" event={"ID":"01d73bcf-a30e-4dfb-ab2d-863123f999c7","Type":"ContainerStarted","Data":"f092ea9ff4af13f92f34dfb579edea16adeb2de2d18a415ed4443fb19e9a65a9"} Apr 22 16:25:28.717166 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:28.717118 2578 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-multus/network-metrics-daemon-zzttm" podStartSLOduration=252.641437557 podStartE2EDuration="4m13.717103058s" podCreationTimestamp="2026-04-22 16:21:15 +0000 UTC" firstStartedPulling="2026-04-22 16:25:27.150168377 +0000 UTC m=+252.686899067" lastFinishedPulling="2026-04-22 16:25:28.225833876 +0000 UTC m=+253.762564568" observedRunningTime="2026-04-22 16:25:28.715759392 +0000 UTC m=+254.252490104" watchObservedRunningTime="2026-04-22 16:25:28.717103058 +0000 UTC m=+254.253833773" Apr 22 16:25:54.449541 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:25:54.449482 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6mkdh" podUID="de2e09dd-d655-4750-a773-af55bcb94210" Apr 22 16:25:54.449541 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:25:54.449482 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-2fjmx" podUID="b427d2da-7345-4266-8029-a5e4953ca8db" Apr 22 16:25:54.449541 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:25:54.449482 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-wt7t4" podUID="a17fc401-3295-45ad-8e4e-b5c7a99047fd" Apr 22 16:25:54.772063 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:54.771975 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6mkdh" Apr 22 16:25:54.772063 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:54.772010 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wt7t4" Apr 22 16:25:54.772237 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:54.772118 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2fjmx" Apr 22 16:25:57.940907 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:57.940834 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls\") pod \"dns-default-2fjmx\" (UID: \"b427d2da-7345-4266-8029-a5e4953ca8db\") " pod="openshift-dns/dns-default-2fjmx" Apr 22 16:25:57.941399 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:57.940926 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert\") pod \"ingress-canary-wt7t4\" (UID: \"a17fc401-3295-45ad-8e4e-b5c7a99047fd\") " pod="openshift-ingress-canary/ingress-canary-wt7t4" Apr 22 16:25:57.941399 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:57.940975 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6mkdh\" (UID: \"de2e09dd-d655-4750-a773-af55bcb94210\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6mkdh" Apr 22 16:25:57.943328 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:57.943304 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/b427d2da-7345-4266-8029-a5e4953ca8db-metrics-tls\") pod \"dns-default-2fjmx\" (UID: \"b427d2da-7345-4266-8029-a5e4953ca8db\") " pod="openshift-dns/dns-default-2fjmx" Apr 22 16:25:57.943328 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:57.943317 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/de2e09dd-d655-4750-a773-af55bcb94210-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-6mkdh\" (UID: \"de2e09dd-d655-4750-a773-af55bcb94210\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-6mkdh" Apr 22 16:25:57.943499 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:57.943359 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a17fc401-3295-45ad-8e4e-b5c7a99047fd-cert\") pod \"ingress-canary-wt7t4\" (UID: \"a17fc401-3295-45ad-8e4e-b5c7a99047fd\") " pod="openshift-ingress-canary/ingress-canary-wt7t4" Apr 22 16:25:58.076508 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:58.076470 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pdvf2\"" Apr 22 16:25:58.076508 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:58.076494 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-fsrq7\"" Apr 22 16:25:58.076716 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:58.076481 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-xxqn8\"" Apr 22 16:25:58.083534 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:58.083512 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6mkdh" Apr 22 16:25:58.083534 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:58.083527 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-2fjmx" Apr 22 16:25:58.083625 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:58.083609 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wt7t4" Apr 22 16:25:58.222680 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:58.222629 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-2fjmx"] Apr 22 16:25:58.228567 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:25:58.228493 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb427d2da_7345_4266_8029_a5e4953ca8db.slice/crio-e530da76772cad062e67ad4ca88465f8197e1018e1e41a759e80fc6a48e6b836 WatchSource:0}: Error finding container e530da76772cad062e67ad4ca88465f8197e1018e1e41a759e80fc6a48e6b836: Status 404 returned error can't find the container with id e530da76772cad062e67ad4ca88465f8197e1018e1e41a759e80fc6a48e6b836 Apr 22 16:25:58.239989 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:58.239965 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-6mkdh"] Apr 22 16:25:58.243005 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:25:58.242981 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde2e09dd_d655_4750_a773_af55bcb94210.slice/crio-595ce9df393e7b4fa8f968a60a41e543a2174dd649c94b121471456b26b69596 WatchSource:0}: Error finding container 595ce9df393e7b4fa8f968a60a41e543a2174dd649c94b121471456b26b69596: Status 404 returned error can't find the container with id 595ce9df393e7b4fa8f968a60a41e543a2174dd649c94b121471456b26b69596 Apr 22 16:25:58.254959 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:58.254933 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wt7t4"] Apr 22 16:25:58.257749 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:25:58.257714 2578 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda17fc401_3295_45ad_8e4e_b5c7a99047fd.slice/crio-4567dd80968ceafd2ca0398d5730e338fd00c3fd2f37eff71e311ad42d57c22b WatchSource:0}: Error finding container 4567dd80968ceafd2ca0398d5730e338fd00c3fd2f37eff71e311ad42d57c22b: Status 404 returned error can't find the container with id 4567dd80968ceafd2ca0398d5730e338fd00c3fd2f37eff71e311ad42d57c22b Apr 22 16:25:58.783315 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:58.783275 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6mkdh" event={"ID":"de2e09dd-d655-4750-a773-af55bcb94210","Type":"ContainerStarted","Data":"595ce9df393e7b4fa8f968a60a41e543a2174dd649c94b121471456b26b69596"} Apr 22 16:25:58.784273 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:58.784238 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wt7t4" event={"ID":"a17fc401-3295-45ad-8e4e-b5c7a99047fd","Type":"ContainerStarted","Data":"4567dd80968ceafd2ca0398d5730e338fd00c3fd2f37eff71e311ad42d57c22b"} Apr 22 16:25:58.785132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:25:58.785108 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2fjmx" event={"ID":"b427d2da-7345-4266-8029-a5e4953ca8db","Type":"ContainerStarted","Data":"e530da76772cad062e67ad4ca88465f8197e1018e1e41a759e80fc6a48e6b836"} Apr 22 16:26:00.794761 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:26:00.794723 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wt7t4" event={"ID":"a17fc401-3295-45ad-8e4e-b5c7a99047fd","Type":"ContainerStarted","Data":"ae8c85271ca7d05459ffd0a60c67a6654744eda62ead8b345e7fdac4f0769b0b"} Apr 22 16:26:00.796284 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:26:00.796257 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2fjmx" 
event={"ID":"b427d2da-7345-4266-8029-a5e4953ca8db","Type":"ContainerStarted","Data":"9e66de10505a1fbf6b8a846103dc81ea3209ece8ea998fe755887fa9ac7bd18b"} Apr 22 16:26:00.796406 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:26:00.796291 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-2fjmx" event={"ID":"b427d2da-7345-4266-8029-a5e4953ca8db","Type":"ContainerStarted","Data":"2a25bb87a03fc7e651c459a40256204b15f56f535a2b32cea901a0f6315e1ed2"} Apr 22 16:26:00.796406 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:26:00.796390 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-2fjmx" Apr 22 16:26:00.797527 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:26:00.797508 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6mkdh" event={"ID":"de2e09dd-d655-4750-a773-af55bcb94210","Type":"ContainerStarted","Data":"ee380028c101ecd8aed1becdcd8b48c0e05334eb535cfffd6e6d69e2a203a6d7"} Apr 22 16:26:00.809899 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:26:00.809837 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wt7t4" podStartSLOduration=251.668479644 podStartE2EDuration="4m13.809825271s" podCreationTimestamp="2026-04-22 16:21:47 +0000 UTC" firstStartedPulling="2026-04-22 16:25:58.259564812 +0000 UTC m=+283.796295503" lastFinishedPulling="2026-04-22 16:26:00.400910437 +0000 UTC m=+285.937641130" observedRunningTime="2026-04-22 16:26:00.809241091 +0000 UTC m=+286.345971840" watchObservedRunningTime="2026-04-22 16:26:00.809825271 +0000 UTC m=+286.346555987" Apr 22 16:26:00.823621 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:26:00.823577 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-6mkdh" podStartSLOduration=265.671927704 podStartE2EDuration="4m27.823563278s" 
podCreationTimestamp="2026-04-22 16:21:33 +0000 UTC" firstStartedPulling="2026-04-22 16:25:58.244784939 +0000 UTC m=+283.781515631" lastFinishedPulling="2026-04-22 16:26:00.396420515 +0000 UTC m=+285.933151205" observedRunningTime="2026-04-22 16:26:00.822891906 +0000 UTC m=+286.359622619" watchObservedRunningTime="2026-04-22 16:26:00.823563278 +0000 UTC m=+286.360293990" Apr 22 16:26:00.840808 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:26:00.840757 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-2fjmx" podStartSLOduration=251.674073439 podStartE2EDuration="4m13.840740661s" podCreationTimestamp="2026-04-22 16:21:47 +0000 UTC" firstStartedPulling="2026-04-22 16:25:58.230592017 +0000 UTC m=+283.767322707" lastFinishedPulling="2026-04-22 16:26:00.397259234 +0000 UTC m=+285.933989929" observedRunningTime="2026-04-22 16:26:00.839415 +0000 UTC m=+286.376145782" watchObservedRunningTime="2026-04-22 16:26:00.840740661 +0000 UTC m=+286.377471375" Apr 22 16:26:10.802899 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:26:10.802832 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-2fjmx" Apr 22 16:26:14.910650 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:26:14.910618 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/0.log" Apr 22 16:26:14.911186 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:26:14.911166 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/0.log" Apr 22 16:26:14.913307 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:26:14.913282 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 16:31:14.927141 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:31:14.927110 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/0.log" Apr 22 16:31:14.927141 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:31:14.927143 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/0.log" Apr 22 16:36:14.943861 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:36:14.943813 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/0.log" Apr 22 16:36:14.944653 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:36:14.944629 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/0.log" Apr 22 16:41:14.959958 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:41:14.959931 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/0.log" Apr 22 16:41:14.961465 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:41:14.961331 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/0.log" Apr 22 16:46:14.976086 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:46:14.975972 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/0.log" Apr 22 16:46:14.977431 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:46:14.976962 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/0.log" Apr 22 16:49:33.050070 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:33.050037 2578 kubelet.go:2537] "SyncLoop ADD" 
source="api" pods=["cert-manager/cert-manager-79c8d999ff-2kx9x"] Apr 22 16:49:33.051877 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:33.051862 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-2kx9x" Apr 22 16:49:33.054244 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:33.054221 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 22 16:49:33.055257 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:33.055234 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 22 16:49:33.055366 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:33.055272 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-xftd9\"" Apr 22 16:49:33.062121 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:33.062100 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-2kx9x"] Apr 22 16:49:33.152983 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:33.152946 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8b4f5cf-e8c5-4c63-b4e9-3103596fb9f1-bound-sa-token\") pod \"cert-manager-79c8d999ff-2kx9x\" (UID: \"c8b4f5cf-e8c5-4c63-b4e9-3103596fb9f1\") " pod="cert-manager/cert-manager-79c8d999ff-2kx9x" Apr 22 16:49:33.153166 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:33.152995 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hd8d\" (UniqueName: \"kubernetes.io/projected/c8b4f5cf-e8c5-4c63-b4e9-3103596fb9f1-kube-api-access-4hd8d\") pod \"cert-manager-79c8d999ff-2kx9x\" (UID: \"c8b4f5cf-e8c5-4c63-b4e9-3103596fb9f1\") " pod="cert-manager/cert-manager-79c8d999ff-2kx9x" Apr 22 16:49:33.254020 ip-10-0-137-144 
kubenswrapper[2578]: I0422 16:49:33.253963 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4hd8d\" (UniqueName: \"kubernetes.io/projected/c8b4f5cf-e8c5-4c63-b4e9-3103596fb9f1-kube-api-access-4hd8d\") pod \"cert-manager-79c8d999ff-2kx9x\" (UID: \"c8b4f5cf-e8c5-4c63-b4e9-3103596fb9f1\") " pod="cert-manager/cert-manager-79c8d999ff-2kx9x" Apr 22 16:49:33.254194 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:33.254062 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8b4f5cf-e8c5-4c63-b4e9-3103596fb9f1-bound-sa-token\") pod \"cert-manager-79c8d999ff-2kx9x\" (UID: \"c8b4f5cf-e8c5-4c63-b4e9-3103596fb9f1\") " pod="cert-manager/cert-manager-79c8d999ff-2kx9x" Apr 22 16:49:33.262516 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:33.262492 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/c8b4f5cf-e8c5-4c63-b4e9-3103596fb9f1-bound-sa-token\") pod \"cert-manager-79c8d999ff-2kx9x\" (UID: \"c8b4f5cf-e8c5-4c63-b4e9-3103596fb9f1\") " pod="cert-manager/cert-manager-79c8d999ff-2kx9x" Apr 22 16:49:33.262636 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:33.262624 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hd8d\" (UniqueName: \"kubernetes.io/projected/c8b4f5cf-e8c5-4c63-b4e9-3103596fb9f1-kube-api-access-4hd8d\") pod \"cert-manager-79c8d999ff-2kx9x\" (UID: \"c8b4f5cf-e8c5-4c63-b4e9-3103596fb9f1\") " pod="cert-manager/cert-manager-79c8d999ff-2kx9x" Apr 22 16:49:33.361093 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:33.361063 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-2kx9x"
Apr 22 16:49:33.481789 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:33.481754 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-2kx9x"]
Apr 22 16:49:33.484952 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:49:33.484922 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8b4f5cf_e8c5_4c63_b4e9_3103596fb9f1.slice/crio-bec36b47a5b1831044aad5367cf0f0cf819a57cd434b6036622a280b156fe65b WatchSource:0}: Error finding container bec36b47a5b1831044aad5367cf0f0cf819a57cd434b6036622a280b156fe65b: Status 404 returned error can't find the container with id bec36b47a5b1831044aad5367cf0f0cf819a57cd434b6036622a280b156fe65b
Apr 22 16:49:33.486785 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:33.486768 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 16:49:34.358023 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:34.357984 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-2kx9x" event={"ID":"c8b4f5cf-e8c5-4c63-b4e9-3103596fb9f1","Type":"ContainerStarted","Data":"bec36b47a5b1831044aad5367cf0f0cf819a57cd434b6036622a280b156fe65b"}
Apr 22 16:49:36.365233 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:36.365198 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-2kx9x" event={"ID":"c8b4f5cf-e8c5-4c63-b4e9-3103596fb9f1","Type":"ContainerStarted","Data":"e3f9826d51a3038b55eaad26b5ac3521071910fd5098184def24a38258a6a850"}
Apr 22 16:49:36.381252 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:36.381197 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-2kx9x" podStartSLOduration=0.654388947 podStartE2EDuration="3.381183092s" podCreationTimestamp="2026-04-22 16:49:33 +0000 UTC" firstStartedPulling="2026-04-22 16:49:33.486917022 +0000 UTC m=+1699.023647716" lastFinishedPulling="2026-04-22 16:49:36.213711168 +0000 UTC m=+1701.750441861" observedRunningTime="2026-04-22 16:49:36.379721853 +0000 UTC m=+1701.916452576" watchObservedRunningTime="2026-04-22 16:49:36.381183092 +0000 UTC m=+1701.917913803"
Apr 22 16:49:46.855075 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:46.855036 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-57c8d5d679-tbq2t"]
Apr 22 16:49:46.861592 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:46.861560 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-tbq2t"
Apr 22 16:49:46.864245 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:46.864227 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-webhook-cert\""
Apr 22 16:49:46.864636 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:46.864616 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"kube-root-ca.crt\""
Apr 22 16:49:46.865077 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:46.865062 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"openshift-service-ca.crt\""
Apr 22 16:49:46.865283 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:46.865266 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-dockercfg-b4qgt\""
Apr 22 16:49:46.865743 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:46.865728 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"opendatahub-operator-controller-manager-service-cert\""
Apr 22 16:49:46.871914 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:46.871895 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-57c8d5d679-tbq2t"]
Apr 22 16:49:46.953275 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:46.953237 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rszt\" (UniqueName: \"kubernetes.io/projected/9b4f3977-b9fd-4410-bffc-359c193b411d-kube-api-access-9rszt\") pod \"opendatahub-operator-controller-manager-57c8d5d679-tbq2t\" (UID: \"9b4f3977-b9fd-4410-bffc-359c193b411d\") " pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-tbq2t"
Apr 22 16:49:46.953461 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:46.953297 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9b4f3977-b9fd-4410-bffc-359c193b411d-webhook-cert\") pod \"opendatahub-operator-controller-manager-57c8d5d679-tbq2t\" (UID: \"9b4f3977-b9fd-4410-bffc-359c193b411d\") " pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-tbq2t"
Apr 22 16:49:46.953461 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:46.953322 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9b4f3977-b9fd-4410-bffc-359c193b411d-apiservice-cert\") pod \"opendatahub-operator-controller-manager-57c8d5d679-tbq2t\" (UID: \"9b4f3977-b9fd-4410-bffc-359c193b411d\") " pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-tbq2t"
Apr 22 16:49:47.054442 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:47.054401 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9b4f3977-b9fd-4410-bffc-359c193b411d-webhook-cert\") pod \"opendatahub-operator-controller-manager-57c8d5d679-tbq2t\" (UID: \"9b4f3977-b9fd-4410-bffc-359c193b411d\") " pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-tbq2t"
Apr 22 16:49:47.054442 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:47.054442 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9b4f3977-b9fd-4410-bffc-359c193b411d-apiservice-cert\") pod \"opendatahub-operator-controller-manager-57c8d5d679-tbq2t\" (UID: \"9b4f3977-b9fd-4410-bffc-359c193b411d\") " pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-tbq2t"
Apr 22 16:49:47.054707 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:47.054468 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9rszt\" (UniqueName: \"kubernetes.io/projected/9b4f3977-b9fd-4410-bffc-359c193b411d-kube-api-access-9rszt\") pod \"opendatahub-operator-controller-manager-57c8d5d679-tbq2t\" (UID: \"9b4f3977-b9fd-4410-bffc-359c193b411d\") " pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-tbq2t"
Apr 22 16:49:47.056957 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:47.056930 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/9b4f3977-b9fd-4410-bffc-359c193b411d-apiservice-cert\") pod \"opendatahub-operator-controller-manager-57c8d5d679-tbq2t\" (UID: \"9b4f3977-b9fd-4410-bffc-359c193b411d\") " pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-tbq2t"
Apr 22 16:49:47.057054 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:47.056976 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/9b4f3977-b9fd-4410-bffc-359c193b411d-webhook-cert\") pod \"opendatahub-operator-controller-manager-57c8d5d679-tbq2t\" (UID: \"9b4f3977-b9fd-4410-bffc-359c193b411d\") " pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-tbq2t"
Apr 22 16:49:47.062987 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:47.062960 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rszt\" (UniqueName: \"kubernetes.io/projected/9b4f3977-b9fd-4410-bffc-359c193b411d-kube-api-access-9rszt\") pod \"opendatahub-operator-controller-manager-57c8d5d679-tbq2t\" (UID: \"9b4f3977-b9fd-4410-bffc-359c193b411d\") " pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-tbq2t"
Apr 22 16:49:47.171994 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:47.171891 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-tbq2t"
Apr 22 16:49:47.293432 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:47.293400 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/opendatahub-operator-controller-manager-57c8d5d679-tbq2t"]
Apr 22 16:49:47.297611 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:49:47.297579 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b4f3977_b9fd_4410_bffc_359c193b411d.slice/crio-a561854d470e214f38594c2dcab5b3fb63d8f355c09af3a079fad560d92bb0e6 WatchSource:0}: Error finding container a561854d470e214f38594c2dcab5b3fb63d8f355c09af3a079fad560d92bb0e6: Status 404 returned error can't find the container with id a561854d470e214f38594c2dcab5b3fb63d8f355c09af3a079fad560d92bb0e6
Apr 22 16:49:47.394458 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:47.394419 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-tbq2t" event={"ID":"9b4f3977-b9fd-4410-bffc-359c193b411d","Type":"ContainerStarted","Data":"a561854d470e214f38594c2dcab5b3fb63d8f355c09af3a079fad560d92bb0e6"}
Apr 22 16:49:50.405130 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:50.405096 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-tbq2t" event={"ID":"9b4f3977-b9fd-4410-bffc-359c193b411d","Type":"ContainerStarted","Data":"4887a4312059f654adb599dfe1f1d7a78042799acbd62a2f7f709bcff51a7627"}
Apr 22 16:49:50.405547 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:50.405223 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-tbq2t"
Apr 22 16:49:50.426787 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:49:50.426743 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-tbq2t" podStartSLOduration=1.6609189450000001 podStartE2EDuration="4.426726624s" podCreationTimestamp="2026-04-22 16:49:46 +0000 UTC" firstStartedPulling="2026-04-22 16:49:47.299906003 +0000 UTC m=+1712.836636696" lastFinishedPulling="2026-04-22 16:49:50.065713676 +0000 UTC m=+1715.602444375" observedRunningTime="2026-04-22 16:49:50.424622928 +0000 UTC m=+1715.961353679" watchObservedRunningTime="2026-04-22 16:49:50.426726624 +0000 UTC m=+1715.963457336"
Apr 22 16:50:01.409590 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:01.409557 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/opendatahub-operator-controller-manager-57c8d5d679-tbq2t"
Apr 22 16:50:04.661502 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:04.661467 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/kube-auth-proxy-644d48748b-22xlm"]
Apr 22 16:50:04.664621 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:04.664604 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-644d48748b-22xlm"
Apr 22 16:50:04.667236 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:04.667210 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-creds\""
Apr 22 16:50:04.668251 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:04.668231 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 22 16:50:04.668385 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:04.668363 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-tls\""
Apr 22 16:50:04.668454 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:04.668433 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"kube-auth-proxy-dockercfg-6859h\""
Apr 22 16:50:04.668511 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:04.668469 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 22 16:50:04.678495 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:04.678469 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-644d48748b-22xlm"]
Apr 22 16:50:04.792170 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:04.792118 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a4d5a411-7bea-4aa0-8513-5f299420fc6a-tmp\") pod \"kube-auth-proxy-644d48748b-22xlm\" (UID: \"a4d5a411-7bea-4aa0-8513-5f299420fc6a\") " pod="openshift-ingress/kube-auth-proxy-644d48748b-22xlm"
Apr 22 16:50:04.792358 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:04.792189 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsm8z\" (UniqueName: \"kubernetes.io/projected/a4d5a411-7bea-4aa0-8513-5f299420fc6a-kube-api-access-vsm8z\") pod \"kube-auth-proxy-644d48748b-22xlm\" (UID: \"a4d5a411-7bea-4aa0-8513-5f299420fc6a\") " pod="openshift-ingress/kube-auth-proxy-644d48748b-22xlm"
Apr 22 16:50:04.792358 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:04.792252 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a4d5a411-7bea-4aa0-8513-5f299420fc6a-tls-certs\") pod \"kube-auth-proxy-644d48748b-22xlm\" (UID: \"a4d5a411-7bea-4aa0-8513-5f299420fc6a\") " pod="openshift-ingress/kube-auth-proxy-644d48748b-22xlm"
Apr 22 16:50:04.893091 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:04.893039 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vsm8z\" (UniqueName: \"kubernetes.io/projected/a4d5a411-7bea-4aa0-8513-5f299420fc6a-kube-api-access-vsm8z\") pod \"kube-auth-proxy-644d48748b-22xlm\" (UID: \"a4d5a411-7bea-4aa0-8513-5f299420fc6a\") " pod="openshift-ingress/kube-auth-proxy-644d48748b-22xlm"
Apr 22 16:50:04.893091 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:04.893093 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a4d5a411-7bea-4aa0-8513-5f299420fc6a-tls-certs\") pod \"kube-auth-proxy-644d48748b-22xlm\" (UID: \"a4d5a411-7bea-4aa0-8513-5f299420fc6a\") " pod="openshift-ingress/kube-auth-proxy-644d48748b-22xlm"
Apr 22 16:50:04.893312 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:04.893147 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a4d5a411-7bea-4aa0-8513-5f299420fc6a-tmp\") pod \"kube-auth-proxy-644d48748b-22xlm\" (UID: \"a4d5a411-7bea-4aa0-8513-5f299420fc6a\") " pod="openshift-ingress/kube-auth-proxy-644d48748b-22xlm"
Apr 22 16:50:04.895532 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:04.895506 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a4d5a411-7bea-4aa0-8513-5f299420fc6a-tmp\") pod \"kube-auth-proxy-644d48748b-22xlm\" (UID: \"a4d5a411-7bea-4aa0-8513-5f299420fc6a\") " pod="openshift-ingress/kube-auth-proxy-644d48748b-22xlm"
Apr 22 16:50:04.895642 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:04.895611 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a4d5a411-7bea-4aa0-8513-5f299420fc6a-tls-certs\") pod \"kube-auth-proxy-644d48748b-22xlm\" (UID: \"a4d5a411-7bea-4aa0-8513-5f299420fc6a\") " pod="openshift-ingress/kube-auth-proxy-644d48748b-22xlm"
Apr 22 16:50:04.906696 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:04.906671 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsm8z\" (UniqueName: \"kubernetes.io/projected/a4d5a411-7bea-4aa0-8513-5f299420fc6a-kube-api-access-vsm8z\") pod \"kube-auth-proxy-644d48748b-22xlm\" (UID: \"a4d5a411-7bea-4aa0-8513-5f299420fc6a\") " pod="openshift-ingress/kube-auth-proxy-644d48748b-22xlm"
Apr 22 16:50:04.973934 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:04.973860 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/kube-auth-proxy-644d48748b-22xlm"
Apr 22 16:50:05.093170 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:05.093137 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/kube-auth-proxy-644d48748b-22xlm"]
Apr 22 16:50:05.096134 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:50:05.096103 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4d5a411_7bea_4aa0_8513_5f299420fc6a.slice/crio-5edaf189bbf8c7662c5893854974723ea9e81414196b2ed54a6d5c9d8178cea5 WatchSource:0}: Error finding container 5edaf189bbf8c7662c5893854974723ea9e81414196b2ed54a6d5c9d8178cea5: Status 404 returned error can't find the container with id 5edaf189bbf8c7662c5893854974723ea9e81414196b2ed54a6d5c9d8178cea5
Apr 22 16:50:05.444879 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:05.444823 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-644d48748b-22xlm" event={"ID":"a4d5a411-7bea-4aa0-8513-5f299420fc6a","Type":"ContainerStarted","Data":"5edaf189bbf8c7662c5893854974723ea9e81414196b2ed54a6d5c9d8178cea5"}
Apr 22 16:50:07.855889 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:07.855839 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-cl7hc"]
Apr 22 16:50:07.858939 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:07.858922 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-cl7hc"
Apr 22 16:50:07.861331 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:07.861311 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-webhook-cert\""
Apr 22 16:50:07.861466 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:07.861449 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"odh-model-controller-dockercfg-wgf48\""
Apr 22 16:50:07.865226 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:07.865204 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-cl7hc"]
Apr 22 16:50:08.019623 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:08.019581 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xlnpz\" (UniqueName: \"kubernetes.io/projected/10306e03-001a-4b79-9b69-6d29e3cca8d3-kube-api-access-xlnpz\") pod \"odh-model-controller-858dbf95b8-cl7hc\" (UID: \"10306e03-001a-4b79-9b69-6d29e3cca8d3\") " pod="opendatahub/odh-model-controller-858dbf95b8-cl7hc"
Apr 22 16:50:08.019623 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:08.019625 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/10306e03-001a-4b79-9b69-6d29e3cca8d3-cert\") pod \"odh-model-controller-858dbf95b8-cl7hc\" (UID: \"10306e03-001a-4b79-9b69-6d29e3cca8d3\") " pod="opendatahub/odh-model-controller-858dbf95b8-cl7hc"
Apr 22 16:50:08.120125 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:08.120037 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xlnpz\" (UniqueName: \"kubernetes.io/projected/10306e03-001a-4b79-9b69-6d29e3cca8d3-kube-api-access-xlnpz\") pod \"odh-model-controller-858dbf95b8-cl7hc\" (UID: \"10306e03-001a-4b79-9b69-6d29e3cca8d3\") " pod="opendatahub/odh-model-controller-858dbf95b8-cl7hc"
Apr 22 16:50:08.120125 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:08.120077 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/10306e03-001a-4b79-9b69-6d29e3cca8d3-cert\") pod \"odh-model-controller-858dbf95b8-cl7hc\" (UID: \"10306e03-001a-4b79-9b69-6d29e3cca8d3\") " pod="opendatahub/odh-model-controller-858dbf95b8-cl7hc"
Apr 22 16:50:08.120361 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:50:08.120285 2578 secret.go:189] Couldn't get secret opendatahub/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found
Apr 22 16:50:08.120361 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:50:08.120358 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/10306e03-001a-4b79-9b69-6d29e3cca8d3-cert podName:10306e03-001a-4b79-9b69-6d29e3cca8d3 nodeName:}" failed. No retries permitted until 2026-04-22 16:50:08.620338768 +0000 UTC m=+1734.157069481 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/10306e03-001a-4b79-9b69-6d29e3cca8d3-cert") pod "odh-model-controller-858dbf95b8-cl7hc" (UID: "10306e03-001a-4b79-9b69-6d29e3cca8d3") : secret "odh-model-controller-webhook-cert" not found
Apr 22 16:50:08.128965 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:08.128934 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xlnpz\" (UniqueName: \"kubernetes.io/projected/10306e03-001a-4b79-9b69-6d29e3cca8d3-kube-api-access-xlnpz\") pod \"odh-model-controller-858dbf95b8-cl7hc\" (UID: \"10306e03-001a-4b79-9b69-6d29e3cca8d3\") " pod="opendatahub/odh-model-controller-858dbf95b8-cl7hc"
Apr 22 16:50:08.625261 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:08.625217 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/10306e03-001a-4b79-9b69-6d29e3cca8d3-cert\") pod \"odh-model-controller-858dbf95b8-cl7hc\" (UID: \"10306e03-001a-4b79-9b69-6d29e3cca8d3\") " pod="opendatahub/odh-model-controller-858dbf95b8-cl7hc"
Apr 22 16:50:08.627531 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:08.627510 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/10306e03-001a-4b79-9b69-6d29e3cca8d3-cert\") pod \"odh-model-controller-858dbf95b8-cl7hc\" (UID: \"10306e03-001a-4b79-9b69-6d29e3cca8d3\") " pod="opendatahub/odh-model-controller-858dbf95b8-cl7hc"
Apr 22 16:50:08.771094 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:08.771055 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/odh-model-controller-858dbf95b8-cl7hc"
Apr 22 16:50:09.091209 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:09.091176 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/odh-model-controller-858dbf95b8-cl7hc"]
Apr 22 16:50:09.094444 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:50:09.094414 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10306e03_001a_4b79_9b69_6d29e3cca8d3.slice/crio-03eb6ba71c2a1006463c4472d8f374ea9dc236c2f2c58361eabec63285c295b6 WatchSource:0}: Error finding container 03eb6ba71c2a1006463c4472d8f374ea9dc236c2f2c58361eabec63285c295b6: Status 404 returned error can't find the container with id 03eb6ba71c2a1006463c4472d8f374ea9dc236c2f2c58361eabec63285c295b6
Apr 22 16:50:09.456459 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:09.456373 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/kube-auth-proxy-644d48748b-22xlm" event={"ID":"a4d5a411-7bea-4aa0-8513-5f299420fc6a","Type":"ContainerStarted","Data":"39c177522a6af46c4c69abeb6d581dcf38705f08964b962e5847debf403d072b"}
Apr 22 16:50:09.457371 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:09.457348 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-cl7hc" event={"ID":"10306e03-001a-4b79-9b69-6d29e3cca8d3","Type":"ContainerStarted","Data":"03eb6ba71c2a1006463c4472d8f374ea9dc236c2f2c58361eabec63285c295b6"}
Apr 22 16:50:09.472312 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:09.472264 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/kube-auth-proxy-644d48748b-22xlm" podStartSLOduration=1.9178052079999999 podStartE2EDuration="5.472249305s" podCreationTimestamp="2026-04-22 16:50:04 +0000 UTC" firstStartedPulling="2026-04-22 16:50:05.097742387 +0000 UTC m=+1730.634473080" lastFinishedPulling="2026-04-22 16:50:08.652186473 +0000 UTC m=+1734.188917177" observedRunningTime="2026-04-22 16:50:09.471062204 +0000 UTC m=+1735.007792917" watchObservedRunningTime="2026-04-22 16:50:09.472249305 +0000 UTC m=+1735.008980017"
Apr 22 16:50:12.467475 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:12.467436 2578 generic.go:358] "Generic (PLEG): container finished" podID="10306e03-001a-4b79-9b69-6d29e3cca8d3" containerID="9d3a3f66e98f93dd9eafa36ba62f5f8bdff3f2f2ebf18e391d9e69898b0ea733" exitCode=1
Apr 22 16:50:12.467863 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:12.467528 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-cl7hc" event={"ID":"10306e03-001a-4b79-9b69-6d29e3cca8d3","Type":"ContainerDied","Data":"9d3a3f66e98f93dd9eafa36ba62f5f8bdff3f2f2ebf18e391d9e69898b0ea733"}
Apr 22 16:50:12.467863 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:12.467716 2578 scope.go:117] "RemoveContainer" containerID="9d3a3f66e98f93dd9eafa36ba62f5f8bdff3f2f2ebf18e391d9e69898b0ea733"
Apr 22 16:50:13.471885 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:13.471829 2578 generic.go:358] "Generic (PLEG): container finished" podID="10306e03-001a-4b79-9b69-6d29e3cca8d3" containerID="36c0304ecf3501f320a7fee14163e013ce2f79c5ae3cb13de161f291997d170f" exitCode=1
Apr 22 16:50:13.472333 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:13.471919 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/odh-model-controller-858dbf95b8-cl7hc" event={"ID":"10306e03-001a-4b79-9b69-6d29e3cca8d3","Type":"ContainerDied","Data":"36c0304ecf3501f320a7fee14163e013ce2f79c5ae3cb13de161f291997d170f"}
Apr 22 16:50:13.472333 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:13.471965 2578 scope.go:117] "RemoveContainer" containerID="9d3a3f66e98f93dd9eafa36ba62f5f8bdff3f2f2ebf18e391d9e69898b0ea733"
Apr 22 16:50:13.472333 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:13.472139 2578 scope.go:117] "RemoveContainer" containerID="36c0304ecf3501f320a7fee14163e013ce2f79c5ae3cb13de161f291997d170f"
Apr 22 16:50:13.472333 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:50:13.472324 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-cl7hc_opendatahub(10306e03-001a-4b79-9b69-6d29e3cca8d3)\"" pod="opendatahub/odh-model-controller-858dbf95b8-cl7hc" podUID="10306e03-001a-4b79-9b69-6d29e3cca8d3"
Apr 22 16:50:13.552461 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:13.552426 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-n77l2"]
Apr 22 16:50:13.556718 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:13.556701 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-n77l2"
Apr 22 16:50:13.559096 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:13.559073 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-webhook-server-cert\""
Apr 22 16:50:13.559247 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:13.559145 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"kserve-controller-manager-dockercfg-gmz2h\""
Apr 22 16:50:13.564479 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:13.564458 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-n77l2"]
Apr 22 16:50:13.664419 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:13.664380 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82494b1-74f4-4de6-92d9-33d72db1cb2c-cert\") pod \"kserve-controller-manager-856948b99f-n77l2\" (UID: \"f82494b1-74f4-4de6-92d9-33d72db1cb2c\") " pod="opendatahub/kserve-controller-manager-856948b99f-n77l2"
Apr 22 16:50:13.664564 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:13.664429 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zfx2\" (UniqueName: \"kubernetes.io/projected/f82494b1-74f4-4de6-92d9-33d72db1cb2c-kube-api-access-2zfx2\") pod \"kserve-controller-manager-856948b99f-n77l2\" (UID: \"f82494b1-74f4-4de6-92d9-33d72db1cb2c\") " pod="opendatahub/kserve-controller-manager-856948b99f-n77l2"
Apr 22 16:50:13.765052 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:13.764953 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82494b1-74f4-4de6-92d9-33d72db1cb2c-cert\") pod \"kserve-controller-manager-856948b99f-n77l2\" (UID: \"f82494b1-74f4-4de6-92d9-33d72db1cb2c\") " pod="opendatahub/kserve-controller-manager-856948b99f-n77l2"
Apr 22 16:50:13.765052 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:13.765007 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2zfx2\" (UniqueName: \"kubernetes.io/projected/f82494b1-74f4-4de6-92d9-33d72db1cb2c-kube-api-access-2zfx2\") pod \"kserve-controller-manager-856948b99f-n77l2\" (UID: \"f82494b1-74f4-4de6-92d9-33d72db1cb2c\") " pod="opendatahub/kserve-controller-manager-856948b99f-n77l2"
Apr 22 16:50:13.765247 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:50:13.765106 2578 secret.go:189] Couldn't get secret opendatahub/kserve-webhook-server-cert: secret "kserve-webhook-server-cert" not found
Apr 22 16:50:13.765247 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:50:13.765173 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f82494b1-74f4-4de6-92d9-33d72db1cb2c-cert podName:f82494b1-74f4-4de6-92d9-33d72db1cb2c nodeName:}" failed. No retries permitted until 2026-04-22 16:50:14.265156971 +0000 UTC m=+1739.801887661 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f82494b1-74f4-4de6-92d9-33d72db1cb2c-cert") pod "kserve-controller-manager-856948b99f-n77l2" (UID: "f82494b1-74f4-4de6-92d9-33d72db1cb2c") : secret "kserve-webhook-server-cert" not found
Apr 22 16:50:13.778434 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:13.778404 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zfx2\" (UniqueName: \"kubernetes.io/projected/f82494b1-74f4-4de6-92d9-33d72db1cb2c-kube-api-access-2zfx2\") pod \"kserve-controller-manager-856948b99f-n77l2\" (UID: \"f82494b1-74f4-4de6-92d9-33d72db1cb2c\") " pod="opendatahub/kserve-controller-manager-856948b99f-n77l2"
Apr 22 16:50:14.267868 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:14.267790 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82494b1-74f4-4de6-92d9-33d72db1cb2c-cert\") pod \"kserve-controller-manager-856948b99f-n77l2\" (UID: \"f82494b1-74f4-4de6-92d9-33d72db1cb2c\") " pod="opendatahub/kserve-controller-manager-856948b99f-n77l2"
Apr 22 16:50:14.270175 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:14.270150 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f82494b1-74f4-4de6-92d9-33d72db1cb2c-cert\") pod \"kserve-controller-manager-856948b99f-n77l2\" (UID: \"f82494b1-74f4-4de6-92d9-33d72db1cb2c\") " pod="opendatahub/kserve-controller-manager-856948b99f-n77l2"
Apr 22 16:50:14.467614 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:14.467577 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/kserve-controller-manager-856948b99f-n77l2"
Apr 22 16:50:14.477075 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:14.477052 2578 scope.go:117] "RemoveContainer" containerID="36c0304ecf3501f320a7fee14163e013ce2f79c5ae3cb13de161f291997d170f"
Apr 22 16:50:14.477366 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:50:14.477225 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-cl7hc_opendatahub(10306e03-001a-4b79-9b69-6d29e3cca8d3)\"" pod="opendatahub/odh-model-controller-858dbf95b8-cl7hc" podUID="10306e03-001a-4b79-9b69-6d29e3cca8d3"
Apr 22 16:50:14.590451 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:14.590419 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/kserve-controller-manager-856948b99f-n77l2"]
Apr 22 16:50:14.593642 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:50:14.593612 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf82494b1_74f4_4de6_92d9_33d72db1cb2c.slice/crio-a312203174609d7c6383f18a5a84b3e766958a32fe145009a453dcf468024515 WatchSource:0}: Error finding container a312203174609d7c6383f18a5a84b3e766958a32fe145009a453dcf468024515: Status 404 returned error can't find the container with id a312203174609d7c6383f18a5a84b3e766958a32fe145009a453dcf468024515
Apr 22 16:50:15.480548 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:15.480511 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-n77l2" event={"ID":"f82494b1-74f4-4de6-92d9-33d72db1cb2c","Type":"ContainerStarted","Data":"a312203174609d7c6383f18a5a84b3e766958a32fe145009a453dcf468024515"}
Apr 22 16:50:18.490113 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:18.490082 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/kserve-controller-manager-856948b99f-n77l2" event={"ID":"f82494b1-74f4-4de6-92d9-33d72db1cb2c","Type":"ContainerStarted","Data":"56487c832ec6a1ec8b6ddc80d75b8746fe2d816aa87e7a7f0222661524fb781c"}
Apr 22 16:50:18.490602 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:18.490191 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/kserve-controller-manager-856948b99f-n77l2"
Apr 22 16:50:18.771765 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:18.771677 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-cl7hc"
Apr 22 16:50:18.772136 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:18.772119 2578 scope.go:117] "RemoveContainer" containerID="36c0304ecf3501f320a7fee14163e013ce2f79c5ae3cb13de161f291997d170f"
Apr 22 16:50:18.772341 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:50:18.772322 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with CrashLoopBackOff: \"back-off 10s restarting failed container=manager pod=odh-model-controller-858dbf95b8-cl7hc_opendatahub(10306e03-001a-4b79-9b69-6d29e3cca8d3)\"" pod="opendatahub/odh-model-controller-858dbf95b8-cl7hc" podUID="10306e03-001a-4b79-9b69-6d29e3cca8d3"
Apr 22 16:50:19.859334 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:19.859285 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/kserve-controller-manager-856948b99f-n77l2" podStartSLOduration=3.772230621 podStartE2EDuration="6.859270961s" podCreationTimestamp="2026-04-22 16:50:13 +0000 UTC" firstStartedPulling="2026-04-22 16:50:14.595644492 +0000 UTC m=+1740.132375181" lastFinishedPulling="2026-04-22 16:50:17.682684819 +0000 UTC m=+1743.219415521" observedRunningTime="2026-04-22 16:50:18.549228729 +0000 UTC m=+1744.085959442" watchObservedRunningTime="2026-04-22 16:50:19.859270961 +0000 UTC m=+1745.396001673"
Apr 22 16:50:19.860019 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:19.859991 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-g5r7z"]
Apr 22 16:50:19.863232 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:19.863217 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g5r7z"
Apr 22 16:50:19.866048 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:19.866020 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 22 16:50:19.866436 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:19.866418 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 22 16:50:19.866574 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:19.866549 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-fpz79\""
Apr 22 16:50:19.877594 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:19.877566 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-g5r7z"]
Apr 22 16:50:20.017274 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:20.017237 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/583051fe-e046-4f75-85d4-f8647b7c3d04-operator-config\") pod \"servicemesh-operator3-55f49c5f94-g5r7z\" (UID: \"583051fe-e046-4f75-85d4-f8647b7c3d04\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-g5r7z"
Apr 22 16:50:20.017458 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:20.017285 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wmf76\" (UniqueName: \"kubernetes.io/projected/583051fe-e046-4f75-85d4-f8647b7c3d04-kube-api-access-wmf76\") pod \"servicemesh-operator3-55f49c5f94-g5r7z\" (UID: \"583051fe-e046-4f75-85d4-f8647b7c3d04\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-g5r7z"
Apr 22 16:50:20.118329 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:20.118238 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wmf76\" (UniqueName: \"kubernetes.io/projected/583051fe-e046-4f75-85d4-f8647b7c3d04-kube-api-access-wmf76\") pod \"servicemesh-operator3-55f49c5f94-g5r7z\" (UID: \"583051fe-e046-4f75-85d4-f8647b7c3d04\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-g5r7z"
Apr 22 16:50:20.118329 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:20.118318 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/583051fe-e046-4f75-85d4-f8647b7c3d04-operator-config\") pod \"servicemesh-operator3-55f49c5f94-g5r7z\" (UID: \"583051fe-e046-4f75-85d4-f8647b7c3d04\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-g5r7z"
Apr 22 16:50:20.120784 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:20.120763 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/583051fe-e046-4f75-85d4-f8647b7c3d04-operator-config\") pod \"servicemesh-operator3-55f49c5f94-g5r7z\" (UID: \"583051fe-e046-4f75-85d4-f8647b7c3d04\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-g5r7z"
Apr 22 16:50:20.126648 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:20.126622 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wmf76\" (UniqueName: \"kubernetes.io/projected/583051fe-e046-4f75-85d4-f8647b7c3d04-kube-api-access-wmf76\") pod \"servicemesh-operator3-55f49c5f94-g5r7z\" (UID: \"583051fe-e046-4f75-85d4-f8647b7c3d04\") "
pod="openshift-operators/servicemesh-operator3-55f49c5f94-g5r7z" Apr 22 16:50:20.172340 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:20.172308 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g5r7z" Apr 22 16:50:20.291635 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:20.291600 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-g5r7z"] Apr 22 16:50:20.296719 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:50:20.296690 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod583051fe_e046_4f75_85d4_f8647b7c3d04.slice/crio-b2f9b554aada2ae2b4b26171983c770de99e5ee6cc82d8b4b9fd3119f8325852 WatchSource:0}: Error finding container b2f9b554aada2ae2b4b26171983c770de99e5ee6cc82d8b4b9fd3119f8325852: Status 404 returned error can't find the container with id b2f9b554aada2ae2b4b26171983c770de99e5ee6cc82d8b4b9fd3119f8325852 Apr 22 16:50:20.496719 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:20.496635 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g5r7z" event={"ID":"583051fe-e046-4f75-85d4-f8647b7c3d04","Type":"ContainerStarted","Data":"b2f9b554aada2ae2b4b26171983c770de99e5ee6cc82d8b4b9fd3119f8325852"} Apr 22 16:50:28.771393 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:28.771353 2578 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="opendatahub/odh-model-controller-858dbf95b8-cl7hc" Apr 22 16:50:28.771770 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:28.771731 2578 scope.go:117] "RemoveContainer" containerID="36c0304ecf3501f320a7fee14163e013ce2f79c5ae3cb13de161f291997d170f" Apr 22 16:50:29.526390 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:29.526354 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="opendatahub/odh-model-controller-858dbf95b8-cl7hc" event={"ID":"10306e03-001a-4b79-9b69-6d29e3cca8d3","Type":"ContainerStarted","Data":"c697f9a98b90e6aa38fc2f2520679e7335a0991d027f9c69b995af7d2020f175"} Apr 22 16:50:29.526609 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:29.526589 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/odh-model-controller-858dbf95b8-cl7hc" Apr 22 16:50:29.528107 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:29.528077 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g5r7z" event={"ID":"583051fe-e046-4f75-85d4-f8647b7c3d04","Type":"ContainerStarted","Data":"092c0cec19b2d4698c64ea39698782f187bade9d110ea3cee54055003a74cd51"} Apr 22 16:50:29.528221 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:29.528126 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g5r7z" Apr 22 16:50:29.543019 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:29.542963 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/odh-model-controller-858dbf95b8-cl7hc" podStartSLOduration=2.585222203 podStartE2EDuration="22.542944206s" podCreationTimestamp="2026-04-22 16:50:07 +0000 UTC" firstStartedPulling="2026-04-22 16:50:09.09572011 +0000 UTC m=+1734.632450804" lastFinishedPulling="2026-04-22 16:50:29.053442116 +0000 UTC m=+1754.590172807" observedRunningTime="2026-04-22 16:50:29.54215181 +0000 UTC m=+1755.078882523" watchObservedRunningTime="2026-04-22 16:50:29.542944206 +0000 UTC m=+1755.079674921" Apr 22 16:50:29.563154 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:29.563089 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g5r7z" podStartSLOduration=1.998063393 podStartE2EDuration="10.563069805s" podCreationTimestamp="2026-04-22 16:50:19 +0000 UTC" 
firstStartedPulling="2026-04-22 16:50:20.299326852 +0000 UTC m=+1745.836057556" lastFinishedPulling="2026-04-22 16:50:28.864333278 +0000 UTC m=+1754.401063968" observedRunningTime="2026-04-22 16:50:29.561909717 +0000 UTC m=+1755.098640431" watchObservedRunningTime="2026-04-22 16:50:29.563069805 +0000 UTC m=+1755.099800512" Apr 22 16:50:30.066512 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.066476 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6"] Apr 22 16:50:30.069717 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.069694 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" Apr 22 16:50:30.072317 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.072116 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 22 16:50:30.072317 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.072138 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 22 16:50:30.072317 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.072179 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 22 16:50:30.072317 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.072142 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-825qn\"" Apr 22 16:50:30.072317 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.072181 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 22 16:50:30.083494 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.083466 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6"] Apr 22 
16:50:30.083700 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.083680 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/91629e04-1b11-48e1-bd19-5f3bcc2d3cc7-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-krwf6\" (UID: \"91629e04-1b11-48e1-bd19-5f3bcc2d3cc7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" Apr 22 16:50:30.083771 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.083722 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/91629e04-1b11-48e1-bd19-5f3bcc2d3cc7-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-krwf6\" (UID: \"91629e04-1b11-48e1-bd19-5f3bcc2d3cc7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" Apr 22 16:50:30.083771 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.083744 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/91629e04-1b11-48e1-bd19-5f3bcc2d3cc7-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-krwf6\" (UID: \"91629e04-1b11-48e1-bd19-5f3bcc2d3cc7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" Apr 22 16:50:30.083880 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.083834 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/91629e04-1b11-48e1-bd19-5f3bcc2d3cc7-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-krwf6\" (UID: \"91629e04-1b11-48e1-bd19-5f3bcc2d3cc7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" Apr 22 16:50:30.083919 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.083888 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/91629e04-1b11-48e1-bd19-5f3bcc2d3cc7-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-krwf6\" (UID: \"91629e04-1b11-48e1-bd19-5f3bcc2d3cc7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" Apr 22 16:50:30.083966 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.083936 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xk5c\" (UniqueName: \"kubernetes.io/projected/91629e04-1b11-48e1-bd19-5f3bcc2d3cc7-kube-api-access-9xk5c\") pod \"istiod-openshift-gateway-55ff986f96-krwf6\" (UID: \"91629e04-1b11-48e1-bd19-5f3bcc2d3cc7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" Apr 22 16:50:30.084017 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.083996 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/91629e04-1b11-48e1-bd19-5f3bcc2d3cc7-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-krwf6\" (UID: \"91629e04-1b11-48e1-bd19-5f3bcc2d3cc7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" Apr 22 16:50:30.184723 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.184681 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/91629e04-1b11-48e1-bd19-5f3bcc2d3cc7-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-krwf6\" (UID: \"91629e04-1b11-48e1-bd19-5f3bcc2d3cc7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" Apr 22 16:50:30.184943 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.184738 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/91629e04-1b11-48e1-bd19-5f3bcc2d3cc7-istio-csr-ca-configmap\") pod 
\"istiod-openshift-gateway-55ff986f96-krwf6\" (UID: \"91629e04-1b11-48e1-bd19-5f3bcc2d3cc7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" Apr 22 16:50:30.184943 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.184777 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/91629e04-1b11-48e1-bd19-5f3bcc2d3cc7-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-krwf6\" (UID: \"91629e04-1b11-48e1-bd19-5f3bcc2d3cc7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" Apr 22 16:50:30.184943 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.184803 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/91629e04-1b11-48e1-bd19-5f3bcc2d3cc7-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-krwf6\" (UID: \"91629e04-1b11-48e1-bd19-5f3bcc2d3cc7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" Apr 22 16:50:30.184943 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.184865 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/91629e04-1b11-48e1-bd19-5f3bcc2d3cc7-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-krwf6\" (UID: \"91629e04-1b11-48e1-bd19-5f3bcc2d3cc7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" Apr 22 16:50:30.184943 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.184888 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/91629e04-1b11-48e1-bd19-5f3bcc2d3cc7-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-krwf6\" (UID: \"91629e04-1b11-48e1-bd19-5f3bcc2d3cc7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" Apr 22 16:50:30.184943 ip-10-0-137-144 kubenswrapper[2578]: I0422 
16:50:30.184930 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9xk5c\" (UniqueName: \"kubernetes.io/projected/91629e04-1b11-48e1-bd19-5f3bcc2d3cc7-kube-api-access-9xk5c\") pod \"istiod-openshift-gateway-55ff986f96-krwf6\" (UID: \"91629e04-1b11-48e1-bd19-5f3bcc2d3cc7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" Apr 22 16:50:30.185668 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.185640 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/91629e04-1b11-48e1-bd19-5f3bcc2d3cc7-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-krwf6\" (UID: \"91629e04-1b11-48e1-bd19-5f3bcc2d3cc7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" Apr 22 16:50:30.187414 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.187389 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/91629e04-1b11-48e1-bd19-5f3bcc2d3cc7-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-krwf6\" (UID: \"91629e04-1b11-48e1-bd19-5f3bcc2d3cc7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" Apr 22 16:50:30.187550 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.187530 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/91629e04-1b11-48e1-bd19-5f3bcc2d3cc7-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-krwf6\" (UID: \"91629e04-1b11-48e1-bd19-5f3bcc2d3cc7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" Apr 22 16:50:30.187708 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.187660 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/91629e04-1b11-48e1-bd19-5f3bcc2d3cc7-local-certs\") pod 
\"istiod-openshift-gateway-55ff986f96-krwf6\" (UID: \"91629e04-1b11-48e1-bd19-5f3bcc2d3cc7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" Apr 22 16:50:30.187708 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.187693 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/91629e04-1b11-48e1-bd19-5f3bcc2d3cc7-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-krwf6\" (UID: \"91629e04-1b11-48e1-bd19-5f3bcc2d3cc7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" Apr 22 16:50:30.192546 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.192526 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/91629e04-1b11-48e1-bd19-5f3bcc2d3cc7-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-krwf6\" (UID: \"91629e04-1b11-48e1-bd19-5f3bcc2d3cc7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" Apr 22 16:50:30.193045 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.193014 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xk5c\" (UniqueName: \"kubernetes.io/projected/91629e04-1b11-48e1-bd19-5f3bcc2d3cc7-kube-api-access-9xk5c\") pod \"istiod-openshift-gateway-55ff986f96-krwf6\" (UID: \"91629e04-1b11-48e1-bd19-5f3bcc2d3cc7\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" Apr 22 16:50:30.381135 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.381096 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" Apr 22 16:50:30.516774 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.516728 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6"] Apr 22 16:50:30.520032 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:50:30.519999 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91629e04_1b11_48e1_bd19_5f3bcc2d3cc7.slice/crio-a53258e9655c1b2a7e5e902dc59f8eeb497c36ef4429fbc1641073b3a3fe1cb4 WatchSource:0}: Error finding container a53258e9655c1b2a7e5e902dc59f8eeb497c36ef4429fbc1641073b3a3fe1cb4: Status 404 returned error can't find the container with id a53258e9655c1b2a7e5e902dc59f8eeb497c36ef4429fbc1641073b3a3fe1cb4 Apr 22 16:50:30.532133 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:30.532100 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" event={"ID":"91629e04-1b11-48e1-bd19-5f3bcc2d3cc7","Type":"ContainerStarted","Data":"a53258e9655c1b2a7e5e902dc59f8eeb497c36ef4429fbc1641073b3a3fe1cb4"} Apr 22 16:50:33.183983 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:33.183947 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236212Ki","pods":"250"} Apr 22 16:50:33.184260 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:33.184017 2578 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236212Ki","pods":"250"} Apr 22 16:50:33.543991 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:33.543900 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" 
event={"ID":"91629e04-1b11-48e1-bd19-5f3bcc2d3cc7","Type":"ContainerStarted","Data":"15c9270684a4f6e492a2260ba8a9fdfe8d5dde9a322a6a311c0a06a6b16cb643"} Apr 22 16:50:33.544144 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:33.543999 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" Apr 22 16:50:33.566512 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:33.566448 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" podStartSLOduration=0.905115764 podStartE2EDuration="3.566430183s" podCreationTimestamp="2026-04-22 16:50:30 +0000 UTC" firstStartedPulling="2026-04-22 16:50:30.522346742 +0000 UTC m=+1756.059077432" lastFinishedPulling="2026-04-22 16:50:33.18366115 +0000 UTC m=+1758.720391851" observedRunningTime="2026-04-22 16:50:33.564726693 +0000 UTC m=+1759.101457442" watchObservedRunningTime="2026-04-22 16:50:33.566430183 +0000 UTC m=+1759.103160896" Apr 22 16:50:34.549586 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:34.549553 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-krwf6" Apr 22 16:50:40.534404 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:40.534371 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-g5r7z" Apr 22 16:50:40.534814 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:40.534435 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/odh-model-controller-858dbf95b8-cl7hc" Apr 22 16:50:49.498198 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:50:49.498167 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/kserve-controller-manager-856948b99f-n77l2" Apr 22 16:51:14.994347 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:14.994224 
2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/0.log" Apr 22 16:51:15.002064 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:14.996125 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/0.log" Apr 22 16:51:41.155174 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:41.155141 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-68hkw"] Apr 22 16:51:41.158282 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:41.158265 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-68hkw" Apr 22 16:51:41.161859 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:41.161826 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 16:51:41.162000 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:41.161861 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-qjx8r\"" Apr 22 16:51:41.162465 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:41.162381 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 16:51:41.176501 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:41.176478 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-68hkw"] Apr 22 16:51:41.238062 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:41.238023 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p85jd\" (UniqueName: \"kubernetes.io/projected/726ec92c-3d20-4f47-b117-123c1c25d656-kube-api-access-p85jd\") pod 
\"authorino-operator-657f44b778-68hkw\" (UID: \"726ec92c-3d20-4f47-b117-123c1c25d656\") " pod="kuadrant-system/authorino-operator-657f44b778-68hkw" Apr 22 16:51:41.338820 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:41.338778 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p85jd\" (UniqueName: \"kubernetes.io/projected/726ec92c-3d20-4f47-b117-123c1c25d656-kube-api-access-p85jd\") pod \"authorino-operator-657f44b778-68hkw\" (UID: \"726ec92c-3d20-4f47-b117-123c1c25d656\") " pod="kuadrant-system/authorino-operator-657f44b778-68hkw" Apr 22 16:51:41.352679 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:41.352645 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p85jd\" (UniqueName: \"kubernetes.io/projected/726ec92c-3d20-4f47-b117-123c1c25d656-kube-api-access-p85jd\") pod \"authorino-operator-657f44b778-68hkw\" (UID: \"726ec92c-3d20-4f47-b117-123c1c25d656\") " pod="kuadrant-system/authorino-operator-657f44b778-68hkw" Apr 22 16:51:41.470505 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:41.470398 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-657f44b778-68hkw" Apr 22 16:51:41.591279 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:41.591246 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-657f44b778-68hkw"] Apr 22 16:51:41.595036 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:51:41.594997 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod726ec92c_3d20_4f47_b117_123c1c25d656.slice/crio-9952e8271874e5284ed27ec72d140083a572df32a159c7bf5bc974be16616b13 WatchSource:0}: Error finding container 9952e8271874e5284ed27ec72d140083a572df32a159c7bf5bc974be16616b13: Status 404 returned error can't find the container with id 9952e8271874e5284ed27ec72d140083a572df32a159c7bf5bc974be16616b13 Apr 22 16:51:41.753823 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:41.753737 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-68hkw" event={"ID":"726ec92c-3d20-4f47-b117-123c1c25d656","Type":"ContainerStarted","Data":"9952e8271874e5284ed27ec72d140083a572df32a159c7bf5bc974be16616b13"} Apr 22 16:51:43.760531 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:43.760495 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-657f44b778-68hkw" event={"ID":"726ec92c-3d20-4f47-b117-123c1c25d656","Type":"ContainerStarted","Data":"87d5c97fa31174922d7186a62bb3ed25b33e4eb39f358fd21b96563dfcd01ea4"} Apr 22 16:51:43.760992 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:43.760593 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-657f44b778-68hkw" Apr 22 16:51:43.788315 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:43.788243 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-657f44b778-68hkw" podStartSLOduration=0.792673655 
podStartE2EDuration="2.788224338s" podCreationTimestamp="2026-04-22 16:51:41 +0000 UTC" firstStartedPulling="2026-04-22 16:51:41.597323936 +0000 UTC m=+1827.134054628" lastFinishedPulling="2026-04-22 16:51:43.592874607 +0000 UTC m=+1829.129605311" observedRunningTime="2026-04-22 16:51:43.786270148 +0000 UTC m=+1829.323000870" watchObservedRunningTime="2026-04-22 16:51:43.788224338 +0000 UTC m=+1829.324955054" Apr 22 16:51:49.301654 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:49.301621 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-gw5k5"] Apr 22 16:51:49.304674 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:49.304653 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-gw5k5" Apr 22 16:51:49.307418 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:49.307392 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-env\"" Apr 22 16:51:49.308323 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:49.308299 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"dns-operator-controller-manager-dockercfg-xsltq\"" Apr 22 16:51:49.315743 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:49.315721 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-gw5k5"] Apr 22 16:51:49.400106 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:49.400066 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hngl7\" (UniqueName: \"kubernetes.io/projected/4e99c593-4805-4eb9-9a96-92377e26f40c-kube-api-access-hngl7\") pod \"dns-operator-controller-manager-648d5c98bc-gw5k5\" (UID: \"4e99c593-4805-4eb9-9a96-92377e26f40c\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-gw5k5" Apr 22 
16:51:49.501042 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:49.501012 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hngl7\" (UniqueName: \"kubernetes.io/projected/4e99c593-4805-4eb9-9a96-92377e26f40c-kube-api-access-hngl7\") pod \"dns-operator-controller-manager-648d5c98bc-gw5k5\" (UID: \"4e99c593-4805-4eb9-9a96-92377e26f40c\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-gw5k5"
Apr 22 16:51:49.509224 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:49.509191 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hngl7\" (UniqueName: \"kubernetes.io/projected/4e99c593-4805-4eb9-9a96-92377e26f40c-kube-api-access-hngl7\") pod \"dns-operator-controller-manager-648d5c98bc-gw5k5\" (UID: \"4e99c593-4805-4eb9-9a96-92377e26f40c\") " pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-gw5k5"
Apr 22 16:51:49.614948 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:49.614911 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-gw5k5"
Apr 22 16:51:49.732657 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:49.732625 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/dns-operator-controller-manager-648d5c98bc-gw5k5"]
Apr 22 16:51:49.735976 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:51:49.735945 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e99c593_4805_4eb9_9a96_92377e26f40c.slice/crio-6b898d096557f5093c572f57727e6a2e88a2a5a710e14c3ee014bdb3354b40d5 WatchSource:0}: Error finding container 6b898d096557f5093c572f57727e6a2e88a2a5a710e14c3ee014bdb3354b40d5: Status 404 returned error can't find the container with id 6b898d096557f5093c572f57727e6a2e88a2a5a710e14c3ee014bdb3354b40d5
Apr 22 16:51:49.785937 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:49.785902 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-gw5k5" event={"ID":"4e99c593-4805-4eb9-9a96-92377e26f40c","Type":"ContainerStarted","Data":"6b898d096557f5093c572f57727e6a2e88a2a5a710e14c3ee014bdb3354b40d5"}
Apr 22 16:51:52.796885 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:52.796836 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-gw5k5" event={"ID":"4e99c593-4805-4eb9-9a96-92377e26f40c","Type":"ContainerStarted","Data":"0ef8e6db87542659dcd9a151dc3d73d2d1899ee0b2783c8e16ab183e69985b67"}
Apr 22 16:51:52.797265 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:52.796935 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-gw5k5"
Apr 22 16:51:52.816121 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:52.816074 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-gw5k5" podStartSLOduration=1.286851838 podStartE2EDuration="3.816059518s" podCreationTimestamp="2026-04-22 16:51:49 +0000 UTC" firstStartedPulling="2026-04-22 16:51:49.737681181 +0000 UTC m=+1835.274411874" lastFinishedPulling="2026-04-22 16:51:52.266888853 +0000 UTC m=+1837.803619554" observedRunningTime="2026-04-22 16:51:52.814519058 +0000 UTC m=+1838.351249769" watchObservedRunningTime="2026-04-22 16:51:52.816059518 +0000 UTC m=+1838.352790278"
Apr 22 16:51:54.766078 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:51:54.766049 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-657f44b778-68hkw"
Apr 22 16:52:03.802736 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:03.802705 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/dns-operator-controller-manager-648d5c98bc-gw5k5"
Apr 22 16:52:04.956185 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:04.956144 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sz92l"]
Apr 22 16:52:04.959627 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:04.959605 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sz92l"
Apr 22 16:52:04.961916 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:04.961894 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-75bhj\""
Apr 22 16:52:04.976248 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:04.976223 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sz92l"]
Apr 22 16:52:05.025716 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:05.025690 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flb6q\" (UniqueName: \"kubernetes.io/projected/4f3d2b07-5e75-4820-8da0-afd55235c6ed-kube-api-access-flb6q\") pod \"kuadrant-operator-controller-manager-84b657d985-sz92l\" (UID: \"4f3d2b07-5e75-4820-8da0-afd55235c6ed\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sz92l"
Apr 22 16:52:05.025891 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:05.025733 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4f3d2b07-5e75-4820-8da0-afd55235c6ed-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-sz92l\" (UID: \"4f3d2b07-5e75-4820-8da0-afd55235c6ed\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sz92l"
Apr 22 16:52:05.126289 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:05.126243 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-flb6q\" (UniqueName: \"kubernetes.io/projected/4f3d2b07-5e75-4820-8da0-afd55235c6ed-kube-api-access-flb6q\") pod \"kuadrant-operator-controller-manager-84b657d985-sz92l\" (UID: \"4f3d2b07-5e75-4820-8da0-afd55235c6ed\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sz92l"
Apr 22 16:52:05.126662 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:05.126634 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4f3d2b07-5e75-4820-8da0-afd55235c6ed-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-sz92l\" (UID: \"4f3d2b07-5e75-4820-8da0-afd55235c6ed\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sz92l"
Apr 22 16:52:05.127036 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:05.127017 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4f3d2b07-5e75-4820-8da0-afd55235c6ed-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-sz92l\" (UID: \"4f3d2b07-5e75-4820-8da0-afd55235c6ed\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sz92l"
Apr 22 16:52:05.149932 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:05.148285 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-flb6q\" (UniqueName: \"kubernetes.io/projected/4f3d2b07-5e75-4820-8da0-afd55235c6ed-kube-api-access-flb6q\") pod \"kuadrant-operator-controller-manager-84b657d985-sz92l\" (UID: \"4f3d2b07-5e75-4820-8da0-afd55235c6ed\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sz92l"
Apr 22 16:52:05.262747 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:05.262672 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sz92l"]
Apr 22 16:52:05.262940 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:05.262927 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sz92l"
Apr 22 16:52:05.271036 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:05.271007 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sz92l"]
Apr 22 16:52:05.290029 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:05.289998 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-jbskm"]
Apr 22 16:52:05.294581 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:05.294550 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-jbskm"
Apr 22 16:52:05.309329 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:05.309303 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-jbskm"]
Apr 22 16:52:05.327940 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:05.327908 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg4tc\" (UniqueName: \"kubernetes.io/projected/30d54422-9254-4054-ab95-e81ed0c643ad-kube-api-access-vg4tc\") pod \"kuadrant-operator-controller-manager-84b657d985-jbskm\" (UID: \"30d54422-9254-4054-ab95-e81ed0c643ad\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-jbskm"
Apr 22 16:52:05.328092 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:05.327967 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/30d54422-9254-4054-ab95-e81ed0c643ad-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-jbskm\" (UID: \"30d54422-9254-4054-ab95-e81ed0c643ad\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-jbskm"
Apr 22 16:52:05.429090 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:05.429051 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vg4tc\" (UniqueName: \"kubernetes.io/projected/30d54422-9254-4054-ab95-e81ed0c643ad-kube-api-access-vg4tc\") pod \"kuadrant-operator-controller-manager-84b657d985-jbskm\" (UID: \"30d54422-9254-4054-ab95-e81ed0c643ad\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-jbskm"
Apr 22 16:52:05.429261 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:05.429108 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/30d54422-9254-4054-ab95-e81ed0c643ad-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-jbskm\" (UID: \"30d54422-9254-4054-ab95-e81ed0c643ad\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-jbskm"
Apr 22 16:52:05.429446 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:05.429431 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/30d54422-9254-4054-ab95-e81ed0c643ad-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-84b657d985-jbskm\" (UID: \"30d54422-9254-4054-ab95-e81ed0c643ad\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-jbskm"
Apr 22 16:52:05.439637 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:05.439604 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg4tc\" (UniqueName: \"kubernetes.io/projected/30d54422-9254-4054-ab95-e81ed0c643ad-kube-api-access-vg4tc\") pod \"kuadrant-operator-controller-manager-84b657d985-jbskm\" (UID: \"30d54422-9254-4054-ab95-e81ed0c643ad\") " pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-jbskm"
Apr 22 16:52:05.609250 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:05.609197 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-jbskm"
Apr 22 16:52:05.731334 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:05.731300 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-jbskm"]
Apr 22 16:52:05.734305 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:52:05.734281 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30d54422_9254_4054_ab95_e81ed0c643ad.slice/crio-144fa06e85ab8ea0ba23e3e68b6eaf1950d2aefa714b9d43ac94be5d8161c652 WatchSource:0}: Error finding container 144fa06e85ab8ea0ba23e3e68b6eaf1950d2aefa714b9d43ac94be5d8161c652: Status 404 returned error can't find the container with id 144fa06e85ab8ea0ba23e3e68b6eaf1950d2aefa714b9d43ac94be5d8161c652
Apr 22 16:52:05.838396 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:05.838354 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-jbskm" event={"ID":"30d54422-9254-4054-ab95-e81ed0c643ad","Type":"ContainerStarted","Data":"144fa06e85ab8ea0ba23e3e68b6eaf1950d2aefa714b9d43ac94be5d8161c652"}
Apr 22 16:52:07.794214 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:52:07.794174 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f3d2b07_5e75_4820_8da0_afd55235c6ed.slice/crio-7fa190a56bf6e04a5dc5ba6dd4eb3877fc8bd12a0168498da4e13b4306da2016 WatchSource:0}: Error finding container 7fa190a56bf6e04a5dc5ba6dd4eb3877fc8bd12a0168498da4e13b4306da2016: Status 404 returned error can't find the container with id 7fa190a56bf6e04a5dc5ba6dd4eb3877fc8bd12a0168498da4e13b4306da2016
Apr 22 16:52:10.858101 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:10.858065 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-jbskm" event={"ID":"30d54422-9254-4054-ab95-e81ed0c643ad","Type":"ContainerStarted","Data":"adb8ed654838570daa57a1f847790c33c4c9551943fa165bd57400f1fdb539cb"}
Apr 22 16:52:10.858538 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:10.858137 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-jbskm"
Apr 22 16:52:10.859616 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:10.859595 2578 generic.go:358] "Generic (PLEG): container finished" podID="4f3d2b07-5e75-4820-8da0-afd55235c6ed" containerID="38fdf4c06c54177279c38eb6973b546aa3bac1e72eb2205c29d76322d183c925" exitCode=1
Apr 22 16:52:10.877370 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:10.877323 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-jbskm" podStartSLOduration=1.612926176 podStartE2EDuration="5.877305783s" podCreationTimestamp="2026-04-22 16:52:05 +0000 UTC" firstStartedPulling="2026-04-22 16:52:05.736513461 +0000 UTC m=+1851.273244151" lastFinishedPulling="2026-04-22 16:52:10.000893062 +0000 UTC m=+1855.537623758" observedRunningTime="2026-04-22 16:52:10.875874355 +0000 UTC m=+1856.412605062" watchObservedRunningTime="2026-04-22 16:52:10.877305783 +0000 UTC m=+1856.414036497"
Apr 22 16:52:10.890378 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:10.890357 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sz92l"
Apr 22 16:52:10.892401 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:10.892374 2578 status_manager.go:895] "Failed to get status for pod" podUID="4f3d2b07-5e75-4820-8da0-afd55235c6ed" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sz92l" err="pods \"kuadrant-operator-controller-manager-84b657d985-sz92l\" is forbidden: User \"system:node:ip-10-0-137-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-144.ec2.internal' and this object"
Apr 22 16:52:10.980381 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:10.980348 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4f3d2b07-5e75-4820-8da0-afd55235c6ed-extensions-socket-volume\") pod \"4f3d2b07-5e75-4820-8da0-afd55235c6ed\" (UID: \"4f3d2b07-5e75-4820-8da0-afd55235c6ed\") "
Apr 22 16:52:10.980565 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:10.980455 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flb6q\" (UniqueName: \"kubernetes.io/projected/4f3d2b07-5e75-4820-8da0-afd55235c6ed-kube-api-access-flb6q\") pod \"4f3d2b07-5e75-4820-8da0-afd55235c6ed\" (UID: \"4f3d2b07-5e75-4820-8da0-afd55235c6ed\") "
Apr 22 16:52:10.980643 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:10.980618 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f3d2b07-5e75-4820-8da0-afd55235c6ed-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "4f3d2b07-5e75-4820-8da0-afd55235c6ed" (UID: "4f3d2b07-5e75-4820-8da0-afd55235c6ed"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 16:52:10.982566 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:10.982538 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f3d2b07-5e75-4820-8da0-afd55235c6ed-kube-api-access-flb6q" (OuterVolumeSpecName: "kube-api-access-flb6q") pod "4f3d2b07-5e75-4820-8da0-afd55235c6ed" (UID: "4f3d2b07-5e75-4820-8da0-afd55235c6ed"). InnerVolumeSpecName "kube-api-access-flb6q". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:52:11.020442 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:11.020409 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f3d2b07-5e75-4820-8da0-afd55235c6ed" path="/var/lib/kubelet/pods/4f3d2b07-5e75-4820-8da0-afd55235c6ed/volumes"
Apr 22 16:52:11.081429 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:11.081389 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-flb6q\" (UniqueName: \"kubernetes.io/projected/4f3d2b07-5e75-4820-8da0-afd55235c6ed-kube-api-access-flb6q\") on node \"ip-10-0-137-144.ec2.internal\" DevicePath \"\""
Apr 22 16:52:11.081429 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:11.081420 2578 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/4f3d2b07-5e75-4820-8da0-afd55235c6ed-extensions-socket-volume\") on node \"ip-10-0-137-144.ec2.internal\" DevicePath \"\""
Apr 22 16:52:11.864038 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:11.864004 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sz92l"
Apr 22 16:52:11.864523 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:11.864039 2578 scope.go:117] "RemoveContainer" containerID="38fdf4c06c54177279c38eb6973b546aa3bac1e72eb2205c29d76322d183c925"
Apr 22 16:52:11.870271 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:11.870243 2578 status_manager.go:895] "Failed to get status for pod" podUID="4f3d2b07-5e75-4820-8da0-afd55235c6ed" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-sz92l" err="pods \"kuadrant-operator-controller-manager-84b657d985-sz92l\" is forbidden: User \"system:node:ip-10-0-137-144.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kuadrant-system\": no relationship found between node 'ip-10-0-137-144.ec2.internal' and this object"
Apr 22 16:52:21.866766 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:21.866731 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-jbskm"
Apr 22 16:52:34.670946 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:34.670861 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-jbskm"]
Apr 22 16:52:34.671402 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:34.671154 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-jbskm" podUID="30d54422-9254-4054-ab95-e81ed0c643ad" containerName="manager" containerID="cri-o://adb8ed654838570daa57a1f847790c33c4c9551943fa165bd57400f1fdb539cb" gracePeriod=10
Apr 22 16:52:34.917397 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:34.917374 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-jbskm"
Apr 22 16:52:34.939658 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:34.939593 2578 generic.go:358] "Generic (PLEG): container finished" podID="30d54422-9254-4054-ab95-e81ed0c643ad" containerID="adb8ed654838570daa57a1f847790c33c4c9551943fa165bd57400f1fdb539cb" exitCode=0
Apr 22 16:52:34.939766 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:34.939656 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-jbskm"
Apr 22 16:52:34.939766 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:34.939678 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-jbskm" event={"ID":"30d54422-9254-4054-ab95-e81ed0c643ad","Type":"ContainerDied","Data":"adb8ed654838570daa57a1f847790c33c4c9551943fa165bd57400f1fdb539cb"}
Apr 22 16:52:34.939766 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:34.939718 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-84b657d985-jbskm" event={"ID":"30d54422-9254-4054-ab95-e81ed0c643ad","Type":"ContainerDied","Data":"144fa06e85ab8ea0ba23e3e68b6eaf1950d2aefa714b9d43ac94be5d8161c652"}
Apr 22 16:52:34.939766 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:34.939738 2578 scope.go:117] "RemoveContainer" containerID="adb8ed654838570daa57a1f847790c33c4c9551943fa165bd57400f1fdb539cb"
Apr 22 16:52:34.950584 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:34.950558 2578 scope.go:117] "RemoveContainer" containerID="adb8ed654838570daa57a1f847790c33c4c9551943fa165bd57400f1fdb539cb"
Apr 22 16:52:34.950889 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:52:34.950864 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"adb8ed654838570daa57a1f847790c33c4c9551943fa165bd57400f1fdb539cb\": container with ID starting with adb8ed654838570daa57a1f847790c33c4c9551943fa165bd57400f1fdb539cb not found: ID does not exist" containerID="adb8ed654838570daa57a1f847790c33c4c9551943fa165bd57400f1fdb539cb"
Apr 22 16:52:34.950968 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:34.950899 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"adb8ed654838570daa57a1f847790c33c4c9551943fa165bd57400f1fdb539cb"} err="failed to get container status \"adb8ed654838570daa57a1f847790c33c4c9551943fa165bd57400f1fdb539cb\": rpc error: code = NotFound desc = could not find container \"adb8ed654838570daa57a1f847790c33c4c9551943fa165bd57400f1fdb539cb\": container with ID starting with adb8ed654838570daa57a1f847790c33c4c9551943fa165bd57400f1fdb539cb not found: ID does not exist"
Apr 22 16:52:35.074173 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:35.074143 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg4tc\" (UniqueName: \"kubernetes.io/projected/30d54422-9254-4054-ab95-e81ed0c643ad-kube-api-access-vg4tc\") pod \"30d54422-9254-4054-ab95-e81ed0c643ad\" (UID: \"30d54422-9254-4054-ab95-e81ed0c643ad\") "
Apr 22 16:52:35.074357 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:35.074270 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/30d54422-9254-4054-ab95-e81ed0c643ad-extensions-socket-volume\") pod \"30d54422-9254-4054-ab95-e81ed0c643ad\" (UID: \"30d54422-9254-4054-ab95-e81ed0c643ad\") "
Apr 22 16:52:35.074634 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:35.074607 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30d54422-9254-4054-ab95-e81ed0c643ad-extensions-socket-volume" (OuterVolumeSpecName: "extensions-socket-volume") pod "30d54422-9254-4054-ab95-e81ed0c643ad" (UID: "30d54422-9254-4054-ab95-e81ed0c643ad"). InnerVolumeSpecName "extensions-socket-volume". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 16:52:35.076209 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:35.076182 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30d54422-9254-4054-ab95-e81ed0c643ad-kube-api-access-vg4tc" (OuterVolumeSpecName: "kube-api-access-vg4tc") pod "30d54422-9254-4054-ab95-e81ed0c643ad" (UID: "30d54422-9254-4054-ab95-e81ed0c643ad"). InnerVolumeSpecName "kube-api-access-vg4tc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:52:35.174798 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:35.174764 2578 reconciler_common.go:299] "Volume detached for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/30d54422-9254-4054-ab95-e81ed0c643ad-extensions-socket-volume\") on node \"ip-10-0-137-144.ec2.internal\" DevicePath \"\""
Apr 22 16:52:35.174798 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:35.174794 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vg4tc\" (UniqueName: \"kubernetes.io/projected/30d54422-9254-4054-ab95-e81ed0c643ad-kube-api-access-vg4tc\") on node \"ip-10-0-137-144.ec2.internal\" DevicePath \"\""
Apr 22 16:52:35.272622 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:35.272589 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-jbskm"]
Apr 22 16:52:35.279328 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:35.279304 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-84b657d985-jbskm"]
Apr 22 16:52:37.020810 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:37.020775 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30d54422-9254-4054-ab95-e81ed0c643ad" path="/var/lib/kubelet/pods/30d54422-9254-4054-ab95-e81ed0c643ad/volumes"
Apr 22 16:52:42.186339 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:42.186255 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-bnq6l"]
Apr 22 16:52:42.186713 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:42.186541 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4f3d2b07-5e75-4820-8da0-afd55235c6ed" containerName="manager"
Apr 22 16:52:42.186713 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:42.186552 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f3d2b07-5e75-4820-8da0-afd55235c6ed" containerName="manager"
Apr 22 16:52:42.186713 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:42.186570 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30d54422-9254-4054-ab95-e81ed0c643ad" containerName="manager"
Apr 22 16:52:42.186713 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:42.186576 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="30d54422-9254-4054-ab95-e81ed0c643ad" containerName="manager"
Apr 22 16:52:42.186713 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:42.186626 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="30d54422-9254-4054-ab95-e81ed0c643ad" containerName="manager"
Apr 22 16:52:42.186713 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:42.186633 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="4f3d2b07-5e75-4820-8da0-afd55235c6ed" containerName="manager"
Apr 22 16:52:42.193654 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:42.193631 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-bnq6l"
Apr 22 16:52:42.195917 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:42.195888 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 22 16:52:42.195917 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:42.195903 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-9455s\""
Apr 22 16:52:42.196564 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:42.196545 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-bnq6l"]
Apr 22 16:52:42.284889 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:42.284858 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-bnq6l"]
Apr 22 16:52:42.328115 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:42.328084 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/9c00c9a0-ef99-4341-9547-4f147e9961d6-config-file\") pod \"limitador-limitador-7d549b5b-bnq6l\" (UID: \"9c00c9a0-ef99-4341-9547-4f147e9961d6\") " pod="kuadrant-system/limitador-limitador-7d549b5b-bnq6l"
Apr 22 16:52:42.328272 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:42.328128 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmmrr\" (UniqueName: \"kubernetes.io/projected/9c00c9a0-ef99-4341-9547-4f147e9961d6-kube-api-access-fmmrr\") pod \"limitador-limitador-7d549b5b-bnq6l\" (UID: \"9c00c9a0-ef99-4341-9547-4f147e9961d6\") " pod="kuadrant-system/limitador-limitador-7d549b5b-bnq6l"
Apr 22 16:52:42.429176 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:42.429139 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/9c00c9a0-ef99-4341-9547-4f147e9961d6-config-file\") pod \"limitador-limitador-7d549b5b-bnq6l\" (UID: \"9c00c9a0-ef99-4341-9547-4f147e9961d6\") " pod="kuadrant-system/limitador-limitador-7d549b5b-bnq6l"
Apr 22 16:52:42.429355 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:42.429187 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmmrr\" (UniqueName: \"kubernetes.io/projected/9c00c9a0-ef99-4341-9547-4f147e9961d6-kube-api-access-fmmrr\") pod \"limitador-limitador-7d549b5b-bnq6l\" (UID: \"9c00c9a0-ef99-4341-9547-4f147e9961d6\") " pod="kuadrant-system/limitador-limitador-7d549b5b-bnq6l"
Apr 22 16:52:42.429768 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:42.429747 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/9c00c9a0-ef99-4341-9547-4f147e9961d6-config-file\") pod \"limitador-limitador-7d549b5b-bnq6l\" (UID: \"9c00c9a0-ef99-4341-9547-4f147e9961d6\") " pod="kuadrant-system/limitador-limitador-7d549b5b-bnq6l"
Apr 22 16:52:42.436890 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:42.436807 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmmrr\" (UniqueName: \"kubernetes.io/projected/9c00c9a0-ef99-4341-9547-4f147e9961d6-kube-api-access-fmmrr\") pod \"limitador-limitador-7d549b5b-bnq6l\" (UID: \"9c00c9a0-ef99-4341-9547-4f147e9961d6\") " pod="kuadrant-system/limitador-limitador-7d549b5b-bnq6l"
Apr 22 16:52:42.503970 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:42.503937 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-bnq6l"
Apr 22 16:52:42.630095 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:42.630048 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-bnq6l"]
Apr 22 16:52:42.632869 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:52:42.632819 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c00c9a0_ef99_4341_9547_4f147e9961d6.slice/crio-996940402ca584c43e56784be94067b98b61aa50d0b96da249f46febdfbb523f WatchSource:0}: Error finding container 996940402ca584c43e56784be94067b98b61aa50d0b96da249f46febdfbb523f: Status 404 returned error can't find the container with id 996940402ca584c43e56784be94067b98b61aa50d0b96da249f46febdfbb523f
Apr 22 16:52:42.969084 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:42.969041 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-bnq6l" event={"ID":"9c00c9a0-ef99-4341-9547-4f147e9961d6","Type":"ContainerStarted","Data":"996940402ca584c43e56784be94067b98b61aa50d0b96da249f46febdfbb523f"}
Apr 22 16:52:45.980660 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:45.980623 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-bnq6l" event={"ID":"9c00c9a0-ef99-4341-9547-4f147e9961d6","Type":"ContainerStarted","Data":"90e93c162fe41c333a22cfe43ccf2274ffe2708bc24f373cc0be1a04e1542212"}
Apr 22 16:52:45.981056 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:45.980711 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-7d549b5b-bnq6l"
Apr 22 16:52:45.997289 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:45.997236 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-7d549b5b-bnq6l" podStartSLOduration=1.201825581 podStartE2EDuration="3.997221722s" podCreationTimestamp="2026-04-22 16:52:42 +0000 UTC" firstStartedPulling="2026-04-22 16:52:42.634651873 +0000 UTC m=+1888.171382578" lastFinishedPulling="2026-04-22 16:52:45.43004802 +0000 UTC m=+1890.966778719" observedRunningTime="2026-04-22 16:52:45.994950662 +0000 UTC m=+1891.531681393" watchObservedRunningTime="2026-04-22 16:52:45.997221722 +0000 UTC m=+1891.533952434"
Apr 22 16:52:56.984564 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:56.984528 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-7d549b5b-bnq6l"
Apr 22 16:52:57.192427 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:57.192390 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-bnq6l"]
Apr 22 16:52:57.192610 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:57.192588 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-7d549b5b-bnq6l" podUID="9c00c9a0-ef99-4341-9547-4f147e9961d6" containerName="limitador" containerID="cri-o://90e93c162fe41c333a22cfe43ccf2274ffe2708bc24f373cc0be1a04e1542212" gracePeriod=30
Apr 22 16:52:58.020694 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:58.020659 2578 generic.go:358] "Generic (PLEG): container finished" podID="9c00c9a0-ef99-4341-9547-4f147e9961d6" containerID="90e93c162fe41c333a22cfe43ccf2274ffe2708bc24f373cc0be1a04e1542212" exitCode=0
Apr 22 16:52:58.021068 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:58.020706 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-bnq6l" event={"ID":"9c00c9a0-ef99-4341-9547-4f147e9961d6","Type":"ContainerDied","Data":"90e93c162fe41c333a22cfe43ccf2274ffe2708bc24f373cc0be1a04e1542212"}
Apr 22 16:52:58.141914 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:58.141892 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-bnq6l"
Apr 22 16:52:58.186151 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:58.186121 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/postgres-868db5846d-d29b2"]
Apr 22 16:52:58.186438 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:58.186424 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9c00c9a0-ef99-4341-9547-4f147e9961d6" containerName="limitador"
Apr 22 16:52:58.186482 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:58.186439 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c00c9a0-ef99-4341-9547-4f147e9961d6" containerName="limitador"
Apr 22 16:52:58.186515 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:58.186509 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="9c00c9a0-ef99-4341-9547-4f147e9961d6" containerName="limitador"
Apr 22 16:52:58.190088 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:58.190062 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-d29b2"
Apr 22 16:52:58.193801 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:58.193776 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"default-dockercfg-lqjqk\""
Apr 22 16:52:58.194294 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:58.194274 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"postgres-creds\""
Apr 22 16:52:58.203742 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:58.203717 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-d29b2"]
Apr 22 16:52:58.239687 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:58.239653 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/af1ddb96-1d08-448b-92f2-3ac60dc191d9-data\") pod \"postgres-868db5846d-d29b2\" (UID: \"af1ddb96-1d08-448b-92f2-3ac60dc191d9\") " pod="opendatahub/postgres-868db5846d-d29b2"
Apr 22 16:52:58.239869 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:58.239731 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn4xv\" (UniqueName: \"kubernetes.io/projected/af1ddb96-1d08-448b-92f2-3ac60dc191d9-kube-api-access-bn4xv\") pod \"postgres-868db5846d-d29b2\" (UID: \"af1ddb96-1d08-448b-92f2-3ac60dc191d9\") " pod="opendatahub/postgres-868db5846d-d29b2"
Apr 22 16:52:58.340610 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:58.340577 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmmrr\" (UniqueName: \"kubernetes.io/projected/9c00c9a0-ef99-4341-9547-4f147e9961d6-kube-api-access-fmmrr\") pod \"9c00c9a0-ef99-4341-9547-4f147e9961d6\" (UID: \"9c00c9a0-ef99-4341-9547-4f147e9961d6\") "
Apr 22 16:52:58.340768 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:58.340628 2578 reconciler_common.go:162]
"operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/9c00c9a0-ef99-4341-9547-4f147e9961d6-config-file\") pod \"9c00c9a0-ef99-4341-9547-4f147e9961d6\" (UID: \"9c00c9a0-ef99-4341-9547-4f147e9961d6\") " Apr 22 16:52:58.340768 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:58.340744 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/af1ddb96-1d08-448b-92f2-3ac60dc191d9-data\") pod \"postgres-868db5846d-d29b2\" (UID: \"af1ddb96-1d08-448b-92f2-3ac60dc191d9\") " pod="opendatahub/postgres-868db5846d-d29b2" Apr 22 16:52:58.340880 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:58.340779 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bn4xv\" (UniqueName: \"kubernetes.io/projected/af1ddb96-1d08-448b-92f2-3ac60dc191d9-kube-api-access-bn4xv\") pod \"postgres-868db5846d-d29b2\" (UID: \"af1ddb96-1d08-448b-92f2-3ac60dc191d9\") " pod="opendatahub/postgres-868db5846d-d29b2" Apr 22 16:52:58.341068 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:58.341040 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c00c9a0-ef99-4341-9547-4f147e9961d6-config-file" (OuterVolumeSpecName: "config-file") pod "9c00c9a0-ef99-4341-9547-4f147e9961d6" (UID: "9c00c9a0-ef99-4341-9547-4f147e9961d6"). InnerVolumeSpecName "config-file". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 16:52:58.341204 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:58.341185 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/af1ddb96-1d08-448b-92f2-3ac60dc191d9-data\") pod \"postgres-868db5846d-d29b2\" (UID: \"af1ddb96-1d08-448b-92f2-3ac60dc191d9\") " pod="opendatahub/postgres-868db5846d-d29b2" Apr 22 16:52:58.342719 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:58.342695 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c00c9a0-ef99-4341-9547-4f147e9961d6-kube-api-access-fmmrr" (OuterVolumeSpecName: "kube-api-access-fmmrr") pod "9c00c9a0-ef99-4341-9547-4f147e9961d6" (UID: "9c00c9a0-ef99-4341-9547-4f147e9961d6"). InnerVolumeSpecName "kube-api-access-fmmrr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:52:58.350984 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:58.350958 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn4xv\" (UniqueName: \"kubernetes.io/projected/af1ddb96-1d08-448b-92f2-3ac60dc191d9-kube-api-access-bn4xv\") pod \"postgres-868db5846d-d29b2\" (UID: \"af1ddb96-1d08-448b-92f2-3ac60dc191d9\") " pod="opendatahub/postgres-868db5846d-d29b2" Apr 22 16:52:58.441434 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:58.441388 2578 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/9c00c9a0-ef99-4341-9547-4f147e9961d6-config-file\") on node \"ip-10-0-137-144.ec2.internal\" DevicePath \"\"" Apr 22 16:52:58.441434 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:58.441428 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fmmrr\" (UniqueName: \"kubernetes.io/projected/9c00c9a0-ef99-4341-9547-4f147e9961d6-kube-api-access-fmmrr\") on node \"ip-10-0-137-144.ec2.internal\" DevicePath \"\"" Apr 22 16:52:58.501413 ip-10-0-137-144 
kubenswrapper[2578]: I0422 16:52:58.501377 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/postgres-868db5846d-d29b2" Apr 22 16:52:58.623432 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:58.623401 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/postgres-868db5846d-d29b2"] Apr 22 16:52:58.625657 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:52:58.625621 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf1ddb96_1d08_448b_92f2_3ac60dc191d9.slice/crio-60fa93ad730dbb6b0269fb9a4a78a78cf443c704f80ddfe33c65a52275b1c152 WatchSource:0}: Error finding container 60fa93ad730dbb6b0269fb9a4a78a78cf443c704f80ddfe33c65a52275b1c152: Status 404 returned error can't find the container with id 60fa93ad730dbb6b0269fb9a4a78a78cf443c704f80ddfe33c65a52275b1c152 Apr 22 16:52:59.024653 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:59.024569 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-d29b2" event={"ID":"af1ddb96-1d08-448b-92f2-3ac60dc191d9","Type":"ContainerStarted","Data":"60fa93ad730dbb6b0269fb9a4a78a78cf443c704f80ddfe33c65a52275b1c152"} Apr 22 16:52:59.025724 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:59.025697 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-7d549b5b-bnq6l" Apr 22 16:52:59.025877 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:59.025699 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-7d549b5b-bnq6l" event={"ID":"9c00c9a0-ef99-4341-9547-4f147e9961d6","Type":"ContainerDied","Data":"996940402ca584c43e56784be94067b98b61aa50d0b96da249f46febdfbb523f"} Apr 22 16:52:59.025877 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:59.025813 2578 scope.go:117] "RemoveContainer" containerID="90e93c162fe41c333a22cfe43ccf2274ffe2708bc24f373cc0be1a04e1542212" Apr 22 16:52:59.044012 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:59.043987 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-bnq6l"] Apr 22 16:52:59.048468 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:52:59.048446 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-7d549b5b-bnq6l"] Apr 22 16:53:01.021956 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:01.021924 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c00c9a0-ef99-4341-9547-4f147e9961d6" path="/var/lib/kubelet/pods/9c00c9a0-ef99-4341-9547-4f147e9961d6/volumes" Apr 22 16:53:04.048590 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:04.048549 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/postgres-868db5846d-d29b2" event={"ID":"af1ddb96-1d08-448b-92f2-3ac60dc191d9","Type":"ContainerStarted","Data":"43b949958110cdece356c37b6efaadd0d31d862b91f9f20c4d6848f8df7d5047"} Apr 22 16:53:04.049060 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:04.048658 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/postgres-868db5846d-d29b2" Apr 22 16:53:04.065401 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:04.065358 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="opendatahub/postgres-868db5846d-d29b2" podStartSLOduration=1.26792818 podStartE2EDuration="6.065343911s" podCreationTimestamp="2026-04-22 16:52:58 +0000 UTC" firstStartedPulling="2026-04-22 16:52:58.626983972 +0000 UTC m=+1904.163714664" lastFinishedPulling="2026-04-22 16:53:03.424399689 +0000 UTC m=+1908.961130395" observedRunningTime="2026-04-22 16:53:04.063225171 +0000 UTC m=+1909.599955883" watchObservedRunningTime="2026-04-22 16:53:04.065343911 +0000 UTC m=+1909.602074629" Apr 22 16:53:10.080423 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:10.080394 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/postgres-868db5846d-d29b2" Apr 22 16:53:10.599464 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:10.599423 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-8b475cf9f-4rcbc"] Apr 22 16:53:10.605650 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:10.605627 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-4rcbc" Apr 22 16:53:10.607972 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:10.607950 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-wgb6c\"" Apr 22 16:53:10.610867 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:10.610822 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-4rcbc"] Apr 22 16:53:10.641174 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:10.641144 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbv9q\" (UniqueName: \"kubernetes.io/projected/ec410859-46dd-499c-a8a7-266651afbe7d-kube-api-access-sbv9q\") pod \"authorino-8b475cf9f-4rcbc\" (UID: \"ec410859-46dd-499c-a8a7-266651afbe7d\") " pod="kuadrant-system/authorino-8b475cf9f-4rcbc" Apr 22 16:53:10.742352 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:10.742313 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbv9q\" (UniqueName: \"kubernetes.io/projected/ec410859-46dd-499c-a8a7-266651afbe7d-kube-api-access-sbv9q\") pod \"authorino-8b475cf9f-4rcbc\" (UID: \"ec410859-46dd-499c-a8a7-266651afbe7d\") " pod="kuadrant-system/authorino-8b475cf9f-4rcbc" Apr 22 16:53:10.750563 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:10.750536 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sbv9q\" (UniqueName: \"kubernetes.io/projected/ec410859-46dd-499c-a8a7-266651afbe7d-kube-api-access-sbv9q\") pod \"authorino-8b475cf9f-4rcbc\" (UID: \"ec410859-46dd-499c-a8a7-266651afbe7d\") " pod="kuadrant-system/authorino-8b475cf9f-4rcbc" Apr 22 16:53:10.820187 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:10.820152 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-4rcbc"] Apr 22 16:53:10.820379 ip-10-0-137-144 
kubenswrapper[2578]: I0422 16:53:10.820368 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-4rcbc" Apr 22 16:53:10.955421 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:10.955397 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-4rcbc"] Apr 22 16:53:10.958103 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:53:10.958067 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podec410859_46dd_499c_a8a7_266651afbe7d.slice/crio-845e5e0ba2dfed9f53e38cf27591e1f4fc7602938ae29f2ef3b500a638d205ec WatchSource:0}: Error finding container 845e5e0ba2dfed9f53e38cf27591e1f4fc7602938ae29f2ef3b500a638d205ec: Status 404 returned error can't find the container with id 845e5e0ba2dfed9f53e38cf27591e1f4fc7602938ae29f2ef3b500a638d205ec Apr 22 16:53:11.072682 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:11.072642 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-4rcbc" event={"ID":"ec410859-46dd-499c-a8a7-266651afbe7d","Type":"ContainerStarted","Data":"845e5e0ba2dfed9f53e38cf27591e1f4fc7602938ae29f2ef3b500a638d205ec"} Apr 22 16:53:13.621724 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:13.621686 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-44mcg"] Apr 22 16:53:13.654051 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:13.653980 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-44mcg"] Apr 22 16:53:13.654227 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:13.654156 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-44mcg" Apr 22 16:53:13.656711 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:13.656689 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-controller-dockercfg-p2k84\"" Apr 22 16:53:13.768888 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:13.768829 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jx94\" (UniqueName: \"kubernetes.io/projected/01db457c-10b5-4f7c-b804-5f47a1a8b31e-kube-api-access-7jx94\") pod \"maas-controller-6d4c8f55f9-44mcg\" (UID: \"01db457c-10b5-4f7c-b804-5f47a1a8b31e\") " pod="opendatahub/maas-controller-6d4c8f55f9-44mcg" Apr 22 16:53:13.777767 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:13.777733 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-6d67ffbc94-h8zch"] Apr 22 16:53:13.808933 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:13.808900 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d67ffbc94-h8zch"] Apr 22 16:53:13.809104 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:13.809027 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d67ffbc94-h8zch" Apr 22 16:53:13.869436 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:13.869394 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x8kw\" (UniqueName: \"kubernetes.io/projected/b4348053-04df-4546-a7ac-cfa4af7500c9-kube-api-access-8x8kw\") pod \"maas-controller-6d67ffbc94-h8zch\" (UID: \"b4348053-04df-4546-a7ac-cfa4af7500c9\") " pod="opendatahub/maas-controller-6d67ffbc94-h8zch" Apr 22 16:53:13.869634 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:13.869518 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7jx94\" (UniqueName: \"kubernetes.io/projected/01db457c-10b5-4f7c-b804-5f47a1a8b31e-kube-api-access-7jx94\") pod \"maas-controller-6d4c8f55f9-44mcg\" (UID: \"01db457c-10b5-4f7c-b804-5f47a1a8b31e\") " pod="opendatahub/maas-controller-6d4c8f55f9-44mcg" Apr 22 16:53:13.881827 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:13.881761 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jx94\" (UniqueName: \"kubernetes.io/projected/01db457c-10b5-4f7c-b804-5f47a1a8b31e-kube-api-access-7jx94\") pod \"maas-controller-6d4c8f55f9-44mcg\" (UID: \"01db457c-10b5-4f7c-b804-5f47a1a8b31e\") " pod="opendatahub/maas-controller-6d4c8f55f9-44mcg" Apr 22 16:53:13.895123 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:13.895089 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-44mcg"] Apr 22 16:53:13.895389 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:13.895374 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-44mcg" Apr 22 16:53:13.923115 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:13.923079 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-844cfb9854-82djs"] Apr 22 16:53:13.932663 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:13.932633 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-844cfb9854-82djs" Apr 22 16:53:13.934371 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:13.934343 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-844cfb9854-82djs"] Apr 22 16:53:13.970511 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:13.970470 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8x8kw\" (UniqueName: \"kubernetes.io/projected/b4348053-04df-4546-a7ac-cfa4af7500c9-kube-api-access-8x8kw\") pod \"maas-controller-6d67ffbc94-h8zch\" (UID: \"b4348053-04df-4546-a7ac-cfa4af7500c9\") " pod="opendatahub/maas-controller-6d67ffbc94-h8zch" Apr 22 16:53:13.970690 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:13.970531 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5778\" (UniqueName: \"kubernetes.io/projected/a9e01385-9d4f-49a5-9a5d-9db7249b9698-kube-api-access-x5778\") pod \"maas-controller-844cfb9854-82djs\" (UID: \"a9e01385-9d4f-49a5-9a5d-9db7249b9698\") " pod="opendatahub/maas-controller-844cfb9854-82djs" Apr 22 16:53:13.978479 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:13.978453 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x8kw\" (UniqueName: \"kubernetes.io/projected/b4348053-04df-4546-a7ac-cfa4af7500c9-kube-api-access-8x8kw\") pod \"maas-controller-6d67ffbc94-h8zch\" (UID: \"b4348053-04df-4546-a7ac-cfa4af7500c9\") " pod="opendatahub/maas-controller-6d67ffbc94-h8zch" Apr 22 16:53:14.071940 
ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:14.071901 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x5778\" (UniqueName: \"kubernetes.io/projected/a9e01385-9d4f-49a5-9a5d-9db7249b9698-kube-api-access-x5778\") pod \"maas-controller-844cfb9854-82djs\" (UID: \"a9e01385-9d4f-49a5-9a5d-9db7249b9698\") " pod="opendatahub/maas-controller-844cfb9854-82djs" Apr 22 16:53:14.080507 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:14.080476 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5778\" (UniqueName: \"kubernetes.io/projected/a9e01385-9d4f-49a5-9a5d-9db7249b9698-kube-api-access-x5778\") pod \"maas-controller-844cfb9854-82djs\" (UID: \"a9e01385-9d4f-49a5-9a5d-9db7249b9698\") " pod="opendatahub/maas-controller-844cfb9854-82djs" Apr 22 16:53:14.124409 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:14.124367 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d67ffbc94-h8zch" Apr 22 16:53:14.246971 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:14.246891 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-844cfb9854-82djs" Apr 22 16:53:14.664467 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:14.664440 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-844cfb9854-82djs"] Apr 22 16:53:14.667461 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:53:14.667426 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9e01385_9d4f_49a5_9a5d_9db7249b9698.slice/crio-de17b4d33063a852fcaee5462c0d04e6e47d7f7464f4d9d5402442c69e1b54ed WatchSource:0}: Error finding container de17b4d33063a852fcaee5462c0d04e6e47d7f7464f4d9d5402442c69e1b54ed: Status 404 returned error can't find the container with id de17b4d33063a852fcaee5462c0d04e6e47d7f7464f4d9d5402442c69e1b54ed Apr 22 16:53:14.686968 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:14.686938 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-44mcg"] Apr 22 16:53:14.688907 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:53:14.688812 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01db457c_10b5_4f7c_b804_5f47a1a8b31e.slice/crio-c3690ce20fe8c80db796a1091065378baaf315b35b6e81327047a6d3a423419f WatchSource:0}: Error finding container c3690ce20fe8c80db796a1091065378baaf315b35b6e81327047a6d3a423419f: Status 404 returned error can't find the container with id c3690ce20fe8c80db796a1091065378baaf315b35b6e81327047a6d3a423419f Apr 22 16:53:14.708565 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:14.708538 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-6d67ffbc94-h8zch"] Apr 22 16:53:14.711005 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:53:14.710974 2578 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4348053_04df_4546_a7ac_cfa4af7500c9.slice/crio-d8d6ce0f2e72d19eecb94b320aa4bcca2d6efe7ac015482c666cf7525fda4e6f WatchSource:0}: Error finding container d8d6ce0f2e72d19eecb94b320aa4bcca2d6efe7ac015482c666cf7525fda4e6f: Status 404 returned error can't find the container with id d8d6ce0f2e72d19eecb94b320aa4bcca2d6efe7ac015482c666cf7525fda4e6f Apr 22 16:53:15.090520 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:15.090484 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d67ffbc94-h8zch" event={"ID":"b4348053-04df-4546-a7ac-cfa4af7500c9","Type":"ContainerStarted","Data":"d8d6ce0f2e72d19eecb94b320aa4bcca2d6efe7ac015482c666cf7525fda4e6f"} Apr 22 16:53:15.091477 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:15.091447 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-844cfb9854-82djs" event={"ID":"a9e01385-9d4f-49a5-9a5d-9db7249b9698","Type":"ContainerStarted","Data":"de17b4d33063a852fcaee5462c0d04e6e47d7f7464f4d9d5402442c69e1b54ed"} Apr 22 16:53:15.092469 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:15.092442 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-44mcg" event={"ID":"01db457c-10b5-4f7c-b804-5f47a1a8b31e","Type":"ContainerStarted","Data":"c3690ce20fe8c80db796a1091065378baaf315b35b6e81327047a6d3a423419f"} Apr 22 16:53:15.093621 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:15.093601 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-4rcbc" event={"ID":"ec410859-46dd-499c-a8a7-266651afbe7d","Type":"ContainerStarted","Data":"3d9de651a5d691054683c0c341569f6ef9540e11ae31230ac6e81e0460ec77cd"} Apr 22 16:53:15.093724 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:15.093702 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-8b475cf9f-4rcbc" 
podUID="ec410859-46dd-499c-a8a7-266651afbe7d" containerName="authorino" containerID="cri-o://3d9de651a5d691054683c0c341569f6ef9540e11ae31230ac6e81e0460ec77cd" gracePeriod=30 Apr 22 16:53:15.108965 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:15.108921 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-8b475cf9f-4rcbc" podStartSLOduration=1.475521118 podStartE2EDuration="5.108905606s" podCreationTimestamp="2026-04-22 16:53:10 +0000 UTC" firstStartedPulling="2026-04-22 16:53:10.959334947 +0000 UTC m=+1916.496065640" lastFinishedPulling="2026-04-22 16:53:14.592719421 +0000 UTC m=+1920.129450128" observedRunningTime="2026-04-22 16:53:15.1075629 +0000 UTC m=+1920.644293611" watchObservedRunningTime="2026-04-22 16:53:15.108905606 +0000 UTC m=+1920.645636317" Apr 22 16:53:15.394669 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:15.394402 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-4rcbc" Apr 22 16:53:15.486814 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:15.486776 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sbv9q\" (UniqueName: \"kubernetes.io/projected/ec410859-46dd-499c-a8a7-266651afbe7d-kube-api-access-sbv9q\") pod \"ec410859-46dd-499c-a8a7-266651afbe7d\" (UID: \"ec410859-46dd-499c-a8a7-266651afbe7d\") " Apr 22 16:53:15.499424 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:15.499379 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ec410859-46dd-499c-a8a7-266651afbe7d-kube-api-access-sbv9q" (OuterVolumeSpecName: "kube-api-access-sbv9q") pod "ec410859-46dd-499c-a8a7-266651afbe7d" (UID: "ec410859-46dd-499c-a8a7-266651afbe7d"). InnerVolumeSpecName "kube-api-access-sbv9q". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:53:15.587826 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:15.587784 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sbv9q\" (UniqueName: \"kubernetes.io/projected/ec410859-46dd-499c-a8a7-266651afbe7d-kube-api-access-sbv9q\") on node \"ip-10-0-137-144.ec2.internal\" DevicePath \"\"" Apr 22 16:53:16.103033 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:16.102992 2578 generic.go:358] "Generic (PLEG): container finished" podID="ec410859-46dd-499c-a8a7-266651afbe7d" containerID="3d9de651a5d691054683c0c341569f6ef9540e11ae31230ac6e81e0460ec77cd" exitCode=0 Apr 22 16:53:16.103513 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:16.103095 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-4rcbc" event={"ID":"ec410859-46dd-499c-a8a7-266651afbe7d","Type":"ContainerDied","Data":"3d9de651a5d691054683c0c341569f6ef9540e11ae31230ac6e81e0460ec77cd"} Apr 22 16:53:16.103513 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:16.103127 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-8b475cf9f-4rcbc" event={"ID":"ec410859-46dd-499c-a8a7-266651afbe7d","Type":"ContainerDied","Data":"845e5e0ba2dfed9f53e38cf27591e1f4fc7602938ae29f2ef3b500a638d205ec"} Apr 22 16:53:16.103513 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:16.103148 2578 scope.go:117] "RemoveContainer" containerID="3d9de651a5d691054683c0c341569f6ef9540e11ae31230ac6e81e0460ec77cd" Apr 22 16:53:16.103513 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:16.103289 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-8b475cf9f-4rcbc" Apr 22 16:53:16.130870 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:16.130830 2578 scope.go:117] "RemoveContainer" containerID="3d9de651a5d691054683c0c341569f6ef9540e11ae31230ac6e81e0460ec77cd" Apr 22 16:53:16.132076 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:53:16.132045 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d9de651a5d691054683c0c341569f6ef9540e11ae31230ac6e81e0460ec77cd\": container with ID starting with 3d9de651a5d691054683c0c341569f6ef9540e11ae31230ac6e81e0460ec77cd not found: ID does not exist" containerID="3d9de651a5d691054683c0c341569f6ef9540e11ae31230ac6e81e0460ec77cd" Apr 22 16:53:16.132169 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:16.132090 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d9de651a5d691054683c0c341569f6ef9540e11ae31230ac6e81e0460ec77cd"} err="failed to get container status \"3d9de651a5d691054683c0c341569f6ef9540e11ae31230ac6e81e0460ec77cd\": rpc error: code = NotFound desc = could not find container \"3d9de651a5d691054683c0c341569f6ef9540e11ae31230ac6e81e0460ec77cd\": container with ID starting with 3d9de651a5d691054683c0c341569f6ef9540e11ae31230ac6e81e0460ec77cd not found: ID does not exist" Apr 22 16:53:16.138665 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:16.138631 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-4rcbc"] Apr 22 16:53:16.143839 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:16.143812 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-8b475cf9f-4rcbc"] Apr 22 16:53:17.021472 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:17.021434 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ec410859-46dd-499c-a8a7-266651afbe7d" 
path="/var/lib/kubelet/pods/ec410859-46dd-499c-a8a7-266651afbe7d/volumes"
Apr 22 16:53:19.115865 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:19.115805 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d67ffbc94-h8zch" event={"ID":"b4348053-04df-4546-a7ac-cfa4af7500c9","Type":"ContainerStarted","Data":"8a0bf26afb017908d6243898ea53214779502c5d22389b9abe63436abba80429"}
Apr 22 16:53:19.116339 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:19.115914 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d67ffbc94-h8zch"
Apr 22 16:53:19.117151 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:19.117130 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-844cfb9854-82djs" event={"ID":"a9e01385-9d4f-49a5-9a5d-9db7249b9698","Type":"ContainerStarted","Data":"80832f4b6c12a1446cef26f149a13d44374398769181a378b71c756a83622422"}
Apr 22 16:53:19.117250 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:19.117207 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-844cfb9854-82djs"
Apr 22 16:53:19.118275 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:19.118253 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-44mcg" event={"ID":"01db457c-10b5-4f7c-b804-5f47a1a8b31e","Type":"ContainerStarted","Data":"d3270ce7ca15cc3cfa1e2a2a893590349aa54e437e4450adf3dbbf6d7548f30a"}
Apr 22 16:53:19.118345 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:19.118320 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d4c8f55f9-44mcg" podUID="01db457c-10b5-4f7c-b804-5f47a1a8b31e" containerName="manager" containerID="cri-o://d3270ce7ca15cc3cfa1e2a2a893590349aa54e437e4450adf3dbbf6d7548f30a" gracePeriod=10
Apr 22 16:53:19.118392 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:19.118366 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-6d4c8f55f9-44mcg"
Apr 22 16:53:19.131592 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:19.131541 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d67ffbc94-h8zch" podStartSLOduration=2.70337724 podStartE2EDuration="6.131528724s" podCreationTimestamp="2026-04-22 16:53:13 +0000 UTC" firstStartedPulling="2026-04-22 16:53:14.712367478 +0000 UTC m=+1920.249098171" lastFinishedPulling="2026-04-22 16:53:18.140518951 +0000 UTC m=+1923.677249655" observedRunningTime="2026-04-22 16:53:19.130206996 +0000 UTC m=+1924.666937711" watchObservedRunningTime="2026-04-22 16:53:19.131528724 +0000 UTC m=+1924.668259437"
Apr 22 16:53:19.144174 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:19.144126 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-844cfb9854-82djs" podStartSLOduration=2.663352262 podStartE2EDuration="6.144111655s" podCreationTimestamp="2026-04-22 16:53:13 +0000 UTC" firstStartedPulling="2026-04-22 16:53:14.669563042 +0000 UTC m=+1920.206293738" lastFinishedPulling="2026-04-22 16:53:18.150322203 +0000 UTC m=+1923.687053131" observedRunningTime="2026-04-22 16:53:19.143622735 +0000 UTC m=+1924.680353462" watchObservedRunningTime="2026-04-22 16:53:19.144111655 +0000 UTC m=+1924.680842368"
Apr 22 16:53:19.158409 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:19.158288 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-6d4c8f55f9-44mcg" podStartSLOduration=2.708593295 podStartE2EDuration="6.158272964s" podCreationTimestamp="2026-04-22 16:53:13 +0000 UTC" firstStartedPulling="2026-04-22 16:53:14.690393235 +0000 UTC m=+1920.227123940" lastFinishedPulling="2026-04-22 16:53:18.14007292 +0000 UTC m=+1923.676803609" observedRunningTime="2026-04-22 16:53:19.157599089 +0000 UTC m=+1924.694329801" watchObservedRunningTime="2026-04-22 16:53:19.158272964 +0000 UTC m=+1924.695003678"
Apr 22 16:53:19.459351 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:19.459321 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-44mcg"
Apr 22 16:53:19.521671 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:19.521637 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7jx94\" (UniqueName: \"kubernetes.io/projected/01db457c-10b5-4f7c-b804-5f47a1a8b31e-kube-api-access-7jx94\") pod \"01db457c-10b5-4f7c-b804-5f47a1a8b31e\" (UID: \"01db457c-10b5-4f7c-b804-5f47a1a8b31e\") "
Apr 22 16:53:19.523792 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:19.523760 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01db457c-10b5-4f7c-b804-5f47a1a8b31e-kube-api-access-7jx94" (OuterVolumeSpecName: "kube-api-access-7jx94") pod "01db457c-10b5-4f7c-b804-5f47a1a8b31e" (UID: "01db457c-10b5-4f7c-b804-5f47a1a8b31e"). InnerVolumeSpecName "kube-api-access-7jx94". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:53:19.622898 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:19.622867 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7jx94\" (UniqueName: \"kubernetes.io/projected/01db457c-10b5-4f7c-b804-5f47a1a8b31e-kube-api-access-7jx94\") on node \"ip-10-0-137-144.ec2.internal\" DevicePath \"\""
Apr 22 16:53:20.122310 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:20.122269 2578 generic.go:358] "Generic (PLEG): container finished" podID="01db457c-10b5-4f7c-b804-5f47a1a8b31e" containerID="d3270ce7ca15cc3cfa1e2a2a893590349aa54e437e4450adf3dbbf6d7548f30a" exitCode=0
Apr 22 16:53:20.122762 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:20.122334 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d4c8f55f9-44mcg"
Apr 22 16:53:20.122762 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:20.122355 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-44mcg" event={"ID":"01db457c-10b5-4f7c-b804-5f47a1a8b31e","Type":"ContainerDied","Data":"d3270ce7ca15cc3cfa1e2a2a893590349aa54e437e4450adf3dbbf6d7548f30a"}
Apr 22 16:53:20.122762 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:20.122398 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d4c8f55f9-44mcg" event={"ID":"01db457c-10b5-4f7c-b804-5f47a1a8b31e","Type":"ContainerDied","Data":"c3690ce20fe8c80db796a1091065378baaf315b35b6e81327047a6d3a423419f"}
Apr 22 16:53:20.122762 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:20.122414 2578 scope.go:117] "RemoveContainer" containerID="d3270ce7ca15cc3cfa1e2a2a893590349aa54e437e4450adf3dbbf6d7548f30a"
Apr 22 16:53:20.130158 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:20.130138 2578 scope.go:117] "RemoveContainer" containerID="d3270ce7ca15cc3cfa1e2a2a893590349aa54e437e4450adf3dbbf6d7548f30a"
Apr 22 16:53:20.130424 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:53:20.130396 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3270ce7ca15cc3cfa1e2a2a893590349aa54e437e4450adf3dbbf6d7548f30a\": container with ID starting with d3270ce7ca15cc3cfa1e2a2a893590349aa54e437e4450adf3dbbf6d7548f30a not found: ID does not exist" containerID="d3270ce7ca15cc3cfa1e2a2a893590349aa54e437e4450adf3dbbf6d7548f30a"
Apr 22 16:53:20.130499 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:20.130435 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3270ce7ca15cc3cfa1e2a2a893590349aa54e437e4450adf3dbbf6d7548f30a"} err="failed to get container status \"d3270ce7ca15cc3cfa1e2a2a893590349aa54e437e4450adf3dbbf6d7548f30a\": rpc error: code = NotFound desc = could not find container \"d3270ce7ca15cc3cfa1e2a2a893590349aa54e437e4450adf3dbbf6d7548f30a\": container with ID starting with d3270ce7ca15cc3cfa1e2a2a893590349aa54e437e4450adf3dbbf6d7548f30a not found: ID does not exist"
Apr 22 16:53:20.144234 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:20.144209 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-44mcg"]
Apr 22 16:53:20.148183 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:20.148161 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d4c8f55f9-44mcg"]
Apr 22 16:53:21.020760 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:21.020720 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01db457c-10b5-4f7c-b804-5f47a1a8b31e" path="/var/lib/kubelet/pods/01db457c-10b5-4f7c-b804-5f47a1a8b31e/volumes"
Apr 22 16:53:30.128345 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:30.128313 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-844cfb9854-82djs"
Apr 22 16:53:30.128770 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:30.128367 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-6d67ffbc94-h8zch"
Apr 22 16:53:30.189686 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:30.189650 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d67ffbc94-h8zch"]
Apr 22 16:53:30.189950 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:30.189892 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-6d67ffbc94-h8zch" podUID="b4348053-04df-4546-a7ac-cfa4af7500c9" containerName="manager" containerID="cri-o://8a0bf26afb017908d6243898ea53214779502c5d22389b9abe63436abba80429" gracePeriod=10
Apr 22 16:53:30.430133 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:30.430109 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d67ffbc94-h8zch"
Apr 22 16:53:30.477516 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:30.477483 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-controller-66c9db867c-vk4lw"]
Apr 22 16:53:30.477820 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:30.477808 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ec410859-46dd-499c-a8a7-266651afbe7d" containerName="authorino"
Apr 22 16:53:30.477895 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:30.477822 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ec410859-46dd-499c-a8a7-266651afbe7d" containerName="authorino"
Apr 22 16:53:30.477895 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:30.477832 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b4348053-04df-4546-a7ac-cfa4af7500c9" containerName="manager"
Apr 22 16:53:30.477895 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:30.477838 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b4348053-04df-4546-a7ac-cfa4af7500c9" containerName="manager"
Apr 22 16:53:30.477895 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:30.477860 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01db457c-10b5-4f7c-b804-5f47a1a8b31e" containerName="manager"
Apr 22 16:53:30.477895 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:30.477868 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="01db457c-10b5-4f7c-b804-5f47a1a8b31e" containerName="manager"
Apr 22 16:53:30.478043 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:30.477927 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="01db457c-10b5-4f7c-b804-5f47a1a8b31e" containerName="manager"
Apr 22 16:53:30.478043 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:30.477940 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ec410859-46dd-499c-a8a7-266651afbe7d" containerName="authorino"
Apr 22 16:53:30.478043 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:30.477950 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b4348053-04df-4546-a7ac-cfa4af7500c9" containerName="manager"
Apr 22 16:53:30.481191 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:30.481168 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-66c9db867c-vk4lw"
Apr 22 16:53:30.489631 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:30.489601 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-66c9db867c-vk4lw"]
Apr 22 16:53:30.518130 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:30.518102 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8x8kw\" (UniqueName: \"kubernetes.io/projected/b4348053-04df-4546-a7ac-cfa4af7500c9-kube-api-access-8x8kw\") pod \"b4348053-04df-4546-a7ac-cfa4af7500c9\" (UID: \"b4348053-04df-4546-a7ac-cfa4af7500c9\") "
Apr 22 16:53:30.520145 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:30.520117 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b4348053-04df-4546-a7ac-cfa4af7500c9-kube-api-access-8x8kw" (OuterVolumeSpecName: "kube-api-access-8x8kw") pod "b4348053-04df-4546-a7ac-cfa4af7500c9" (UID: "b4348053-04df-4546-a7ac-cfa4af7500c9"). InnerVolumeSpecName "kube-api-access-8x8kw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:53:30.619376 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:30.619325 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2c5t\" (UniqueName: \"kubernetes.io/projected/892a6c54-a84b-486c-a5b1-2f3dae4c3e1d-kube-api-access-l2c5t\") pod \"maas-controller-66c9db867c-vk4lw\" (UID: \"892a6c54-a84b-486c-a5b1-2f3dae4c3e1d\") " pod="opendatahub/maas-controller-66c9db867c-vk4lw"
Apr 22 16:53:30.619548 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:30.619477 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8x8kw\" (UniqueName: \"kubernetes.io/projected/b4348053-04df-4546-a7ac-cfa4af7500c9-kube-api-access-8x8kw\") on node \"ip-10-0-137-144.ec2.internal\" DevicePath \"\""
Apr 22 16:53:30.720602 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:30.720512 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l2c5t\" (UniqueName: \"kubernetes.io/projected/892a6c54-a84b-486c-a5b1-2f3dae4c3e1d-kube-api-access-l2c5t\") pod \"maas-controller-66c9db867c-vk4lw\" (UID: \"892a6c54-a84b-486c-a5b1-2f3dae4c3e1d\") " pod="opendatahub/maas-controller-66c9db867c-vk4lw"
Apr 22 16:53:30.729218 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:30.729186 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2c5t\" (UniqueName: \"kubernetes.io/projected/892a6c54-a84b-486c-a5b1-2f3dae4c3e1d-kube-api-access-l2c5t\") pod \"maas-controller-66c9db867c-vk4lw\" (UID: \"892a6c54-a84b-486c-a5b1-2f3dae4c3e1d\") " pod="opendatahub/maas-controller-66c9db867c-vk4lw"
Apr 22 16:53:30.791601 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:30.791549 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-66c9db867c-vk4lw"
Apr 22 16:53:30.912797 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:30.912770 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-controller-66c9db867c-vk4lw"]
Apr 22 16:53:30.915026 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:53:30.914999 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod892a6c54_a84b_486c_a5b1_2f3dae4c3e1d.slice/crio-8b46e3f1e0d8c71a2c15f5c5a887dc90ae79d8a8c747be7b99dad7caee2f7db1 WatchSource:0}: Error finding container 8b46e3f1e0d8c71a2c15f5c5a887dc90ae79d8a8c747be7b99dad7caee2f7db1: Status 404 returned error can't find the container with id 8b46e3f1e0d8c71a2c15f5c5a887dc90ae79d8a8c747be7b99dad7caee2f7db1
Apr 22 16:53:31.160943 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:31.160910 2578 generic.go:358] "Generic (PLEG): container finished" podID="b4348053-04df-4546-a7ac-cfa4af7500c9" containerID="8a0bf26afb017908d6243898ea53214779502c5d22389b9abe63436abba80429" exitCode=0
Apr 22 16:53:31.161352 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:31.160970 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-6d67ffbc94-h8zch"
Apr 22 16:53:31.161352 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:31.160991 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d67ffbc94-h8zch" event={"ID":"b4348053-04df-4546-a7ac-cfa4af7500c9","Type":"ContainerDied","Data":"8a0bf26afb017908d6243898ea53214779502c5d22389b9abe63436abba80429"}
Apr 22 16:53:31.161352 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:31.161035 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-6d67ffbc94-h8zch" event={"ID":"b4348053-04df-4546-a7ac-cfa4af7500c9","Type":"ContainerDied","Data":"d8d6ce0f2e72d19eecb94b320aa4bcca2d6efe7ac015482c666cf7525fda4e6f"}
Apr 22 16:53:31.161352 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:31.161060 2578 scope.go:117] "RemoveContainer" containerID="8a0bf26afb017908d6243898ea53214779502c5d22389b9abe63436abba80429"
Apr 22 16:53:31.162192 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:31.162161 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-66c9db867c-vk4lw" event={"ID":"892a6c54-a84b-486c-a5b1-2f3dae4c3e1d","Type":"ContainerStarted","Data":"8b46e3f1e0d8c71a2c15f5c5a887dc90ae79d8a8c747be7b99dad7caee2f7db1"}
Apr 22 16:53:31.168613 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:31.168597 2578 scope.go:117] "RemoveContainer" containerID="8a0bf26afb017908d6243898ea53214779502c5d22389b9abe63436abba80429"
Apr 22 16:53:31.168893 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:53:31.168876 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8a0bf26afb017908d6243898ea53214779502c5d22389b9abe63436abba80429\": container with ID starting with 8a0bf26afb017908d6243898ea53214779502c5d22389b9abe63436abba80429 not found: ID does not exist" containerID="8a0bf26afb017908d6243898ea53214779502c5d22389b9abe63436abba80429"
Apr 22 16:53:31.168950 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:31.168902 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8a0bf26afb017908d6243898ea53214779502c5d22389b9abe63436abba80429"} err="failed to get container status \"8a0bf26afb017908d6243898ea53214779502c5d22389b9abe63436abba80429\": rpc error: code = NotFound desc = could not find container \"8a0bf26afb017908d6243898ea53214779502c5d22389b9abe63436abba80429\": container with ID starting with 8a0bf26afb017908d6243898ea53214779502c5d22389b9abe63436abba80429 not found: ID does not exist"
Apr 22 16:53:31.175097 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:31.175074 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-6d67ffbc94-h8zch"]
Apr 22 16:53:31.178678 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:31.178655 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-6d67ffbc94-h8zch"]
Apr 22 16:53:32.166546 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:32.166510 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-66c9db867c-vk4lw" event={"ID":"892a6c54-a84b-486c-a5b1-2f3dae4c3e1d","Type":"ContainerStarted","Data":"dc9332b6dbeca4cd7eb58b187d3b92dfbf90e9c3025166ddafd6cfe85389e260"}
Apr 22 16:53:32.166919 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:32.166658 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-controller-66c9db867c-vk4lw"
Apr 22 16:53:32.184202 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:32.184046 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-controller-66c9db867c-vk4lw" podStartSLOduration=1.848856721 podStartE2EDuration="2.184027039s" podCreationTimestamp="2026-04-22 16:53:30 +0000 UTC" firstStartedPulling="2026-04-22 16:53:30.916281416 +0000 UTC m=+1936.453012106" lastFinishedPulling="2026-04-22 16:53:31.251451722 +0000 UTC m=+1936.788182424" observedRunningTime="2026-04-22 16:53:32.183763348 +0000 UTC m=+1937.720494066" watchObservedRunningTime="2026-04-22 16:53:32.184027039 +0000 UTC m=+1937.720757753"
Apr 22 16:53:33.020821 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:33.020777 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b4348053-04df-4546-a7ac-cfa4af7500c9" path="/var/lib/kubelet/pods/b4348053-04df-4546-a7ac-cfa4af7500c9/volumes"
Apr 22 16:53:39.032766 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:39.032732 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-55b6989d99-gt7n2"]
Apr 22 16:53:39.037107 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:39.037084 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-55b6989d99-gt7n2"
Apr 22 16:53:39.039371 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:39.039346 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"opendatahub\"/\"maas-parameters\""
Apr 22 16:53:39.040245 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:39.040225 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-serving-cert\""
Apr 22 16:53:39.040334 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:39.040225 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"opendatahub\"/\"maas-api-dockercfg-bgvcr\""
Apr 22 16:53:39.046474 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:39.046452 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-55b6989d99-gt7n2"]
Apr 22 16:53:39.193552 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:39.193515 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggfw2\" (UniqueName: \"kubernetes.io/projected/27b80be1-eb39-445a-8282-8d4691c8c5f2-kube-api-access-ggfw2\") pod \"maas-api-55b6989d99-gt7n2\" (UID: \"27b80be1-eb39-445a-8282-8d4691c8c5f2\") " pod="opendatahub/maas-api-55b6989d99-gt7n2"
Apr 22 16:53:39.193700 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:39.193567 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/27b80be1-eb39-445a-8282-8d4691c8c5f2-maas-api-tls\") pod \"maas-api-55b6989d99-gt7n2\" (UID: \"27b80be1-eb39-445a-8282-8d4691c8c5f2\") " pod="opendatahub/maas-api-55b6989d99-gt7n2"
Apr 22 16:53:39.294062 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:39.293979 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ggfw2\" (UniqueName: \"kubernetes.io/projected/27b80be1-eb39-445a-8282-8d4691c8c5f2-kube-api-access-ggfw2\") pod \"maas-api-55b6989d99-gt7n2\" (UID: \"27b80be1-eb39-445a-8282-8d4691c8c5f2\") " pod="opendatahub/maas-api-55b6989d99-gt7n2"
Apr 22 16:53:39.294062 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:39.294026 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/27b80be1-eb39-445a-8282-8d4691c8c5f2-maas-api-tls\") pod \"maas-api-55b6989d99-gt7n2\" (UID: \"27b80be1-eb39-445a-8282-8d4691c8c5f2\") " pod="opendatahub/maas-api-55b6989d99-gt7n2"
Apr 22 16:53:39.302234 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:39.302202 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/27b80be1-eb39-445a-8282-8d4691c8c5f2-maas-api-tls\") pod \"maas-api-55b6989d99-gt7n2\" (UID: \"27b80be1-eb39-445a-8282-8d4691c8c5f2\") " pod="opendatahub/maas-api-55b6989d99-gt7n2"
Apr 22 16:53:39.306693 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:39.306668 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggfw2\" (UniqueName: \"kubernetes.io/projected/27b80be1-eb39-445a-8282-8d4691c8c5f2-kube-api-access-ggfw2\") pod \"maas-api-55b6989d99-gt7n2\" (UID: \"27b80be1-eb39-445a-8282-8d4691c8c5f2\") " pod="opendatahub/maas-api-55b6989d99-gt7n2"
Apr 22 16:53:39.349692 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:39.349652 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-55b6989d99-gt7n2"
Apr 22 16:53:39.482992 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:39.482791 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-55b6989d99-gt7n2"]
Apr 22 16:53:39.485959 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:53:39.485926 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27b80be1_eb39_445a_8282_8d4691c8c5f2.slice/crio-bb68ca74ae9e42e10aac79272154ab430eaeb379c0ce21d44ff1fdfcb825ec68 WatchSource:0}: Error finding container bb68ca74ae9e42e10aac79272154ab430eaeb379c0ce21d44ff1fdfcb825ec68: Status 404 returned error can't find the container with id bb68ca74ae9e42e10aac79272154ab430eaeb379c0ce21d44ff1fdfcb825ec68
Apr 22 16:53:40.194630 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:40.194583 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-55b6989d99-gt7n2" event={"ID":"27b80be1-eb39-445a-8282-8d4691c8c5f2","Type":"ContainerStarted","Data":"bb68ca74ae9e42e10aac79272154ab430eaeb379c0ce21d44ff1fdfcb825ec68"}
Apr 22 16:53:41.200183 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:41.200150 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-55b6989d99-gt7n2" event={"ID":"27b80be1-eb39-445a-8282-8d4691c8c5f2","Type":"ContainerStarted","Data":"fdb540036af28227d227c986bbafafdd8141f03338284828b15a93b95a49fcfb"}
Apr 22 16:53:41.200634 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:41.200314 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-55b6989d99-gt7n2"
Apr 22 16:53:41.217070 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:41.217014 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-55b6989d99-gt7n2" podStartSLOduration=1.142150001 podStartE2EDuration="2.216999234s" podCreationTimestamp="2026-04-22 16:53:39 +0000 UTC" firstStartedPulling="2026-04-22 16:53:39.487921641 +0000 UTC m=+1945.024652331" lastFinishedPulling="2026-04-22 16:53:40.562770857 +0000 UTC m=+1946.099501564" observedRunningTime="2026-04-22 16:53:41.215263186 +0000 UTC m=+1946.751993898" watchObservedRunningTime="2026-04-22 16:53:41.216999234 +0000 UTC m=+1946.753729947"
Apr 22 16:53:43.175657 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:43.175621 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-controller-66c9db867c-vk4lw"
Apr 22 16:53:43.216224 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:43.216189 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-844cfb9854-82djs"]
Apr 22 16:53:43.216454 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:43.216424 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-844cfb9854-82djs" podUID="a9e01385-9d4f-49a5-9a5d-9db7249b9698" containerName="manager" containerID="cri-o://80832f4b6c12a1446cef26f149a13d44374398769181a378b71c756a83622422" gracePeriod=10
Apr 22 16:53:43.455042 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:43.455016 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-844cfb9854-82djs"
Apr 22 16:53:43.631306 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:43.631266 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x5778\" (UniqueName: \"kubernetes.io/projected/a9e01385-9d4f-49a5-9a5d-9db7249b9698-kube-api-access-x5778\") pod \"a9e01385-9d4f-49a5-9a5d-9db7249b9698\" (UID: \"a9e01385-9d4f-49a5-9a5d-9db7249b9698\") "
Apr 22 16:53:43.633345 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:43.633321 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9e01385-9d4f-49a5-9a5d-9db7249b9698-kube-api-access-x5778" (OuterVolumeSpecName: "kube-api-access-x5778") pod "a9e01385-9d4f-49a5-9a5d-9db7249b9698" (UID: "a9e01385-9d4f-49a5-9a5d-9db7249b9698"). InnerVolumeSpecName "kube-api-access-x5778". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 16:53:43.732773 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:43.732683 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x5778\" (UniqueName: \"kubernetes.io/projected/a9e01385-9d4f-49a5-9a5d-9db7249b9698-kube-api-access-x5778\") on node \"ip-10-0-137-144.ec2.internal\" DevicePath \"\""
Apr 22 16:53:44.212143 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:44.212109 2578 generic.go:358] "Generic (PLEG): container finished" podID="a9e01385-9d4f-49a5-9a5d-9db7249b9698" containerID="80832f4b6c12a1446cef26f149a13d44374398769181a378b71c756a83622422" exitCode=0
Apr 22 16:53:44.212614 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:44.212152 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-844cfb9854-82djs" event={"ID":"a9e01385-9d4f-49a5-9a5d-9db7249b9698","Type":"ContainerDied","Data":"80832f4b6c12a1446cef26f149a13d44374398769181a378b71c756a83622422"}
Apr 22 16:53:44.212614 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:44.212174 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-844cfb9854-82djs" event={"ID":"a9e01385-9d4f-49a5-9a5d-9db7249b9698","Type":"ContainerDied","Data":"de17b4d33063a852fcaee5462c0d04e6e47d7f7464f4d9d5402442c69e1b54ed"}
Apr 22 16:53:44.212614 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:44.212172 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-844cfb9854-82djs"
Apr 22 16:53:44.212614 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:44.212195 2578 scope.go:117] "RemoveContainer" containerID="80832f4b6c12a1446cef26f149a13d44374398769181a378b71c756a83622422"
Apr 22 16:53:44.220328 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:44.220172 2578 scope.go:117] "RemoveContainer" containerID="80832f4b6c12a1446cef26f149a13d44374398769181a378b71c756a83622422"
Apr 22 16:53:44.220465 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:53:44.220447 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80832f4b6c12a1446cef26f149a13d44374398769181a378b71c756a83622422\": container with ID starting with 80832f4b6c12a1446cef26f149a13d44374398769181a378b71c756a83622422 not found: ID does not exist" containerID="80832f4b6c12a1446cef26f149a13d44374398769181a378b71c756a83622422"
Apr 22 16:53:44.220515 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:44.220473 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80832f4b6c12a1446cef26f149a13d44374398769181a378b71c756a83622422"} err="failed to get container status \"80832f4b6c12a1446cef26f149a13d44374398769181a378b71c756a83622422\": rpc error: code = NotFound desc = could not find container \"80832f4b6c12a1446cef26f149a13d44374398769181a378b71c756a83622422\": container with ID starting with 80832f4b6c12a1446cef26f149a13d44374398769181a378b71c756a83622422 not found: ID does not exist"
Apr 22 16:53:44.234221 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:44.234188 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-844cfb9854-82djs"]
Apr 22 16:53:44.237233 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:44.237208 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-844cfb9854-82djs"]
Apr 22 16:53:45.020669 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:45.020638 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9e01385-9d4f-49a5-9a5d-9db7249b9698" path="/var/lib/kubelet/pods/a9e01385-9d4f-49a5-9a5d-9db7249b9698/volumes"
Apr 22 16:53:47.208801 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:53:47.208773 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-55b6989d99-gt7n2"
Apr 22 16:54:03.430085 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:03.430045 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["opendatahub/maas-api-85c57ff49c-jk2c9"]
Apr 22 16:54:03.430478 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:03.430393 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a9e01385-9d4f-49a5-9a5d-9db7249b9698" containerName="manager"
Apr 22 16:54:03.430478 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:03.430405 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9e01385-9d4f-49a5-9a5d-9db7249b9698" containerName="manager"
Apr 22 16:54:03.430478 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:03.430472 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="a9e01385-9d4f-49a5-9a5d-9db7249b9698" containerName="manager"
Apr 22 16:54:03.438519 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:03.438492 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-85c57ff49c-jk2c9"
Apr 22 16:54:03.441804 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:03.441775 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-85c57ff49c-jk2c9"]
Apr 22 16:54:03.486497 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:03.486459 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pxb4q\" (UniqueName: \"kubernetes.io/projected/19d1ed1b-53db-4333-888d-2da49f00475c-kube-api-access-pxb4q\") pod \"maas-api-85c57ff49c-jk2c9\" (UID: \"19d1ed1b-53db-4333-888d-2da49f00475c\") " pod="opendatahub/maas-api-85c57ff49c-jk2c9"
Apr 22 16:54:03.486497 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:03.486502 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/19d1ed1b-53db-4333-888d-2da49f00475c-maas-api-tls\") pod \"maas-api-85c57ff49c-jk2c9\" (UID: \"19d1ed1b-53db-4333-888d-2da49f00475c\") " pod="opendatahub/maas-api-85c57ff49c-jk2c9"
Apr 22 16:54:03.587691 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:03.587652 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pxb4q\" (UniqueName: \"kubernetes.io/projected/19d1ed1b-53db-4333-888d-2da49f00475c-kube-api-access-pxb4q\") pod \"maas-api-85c57ff49c-jk2c9\" (UID: \"19d1ed1b-53db-4333-888d-2da49f00475c\") " pod="opendatahub/maas-api-85c57ff49c-jk2c9"
Apr 22 16:54:03.587917 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:03.587711 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/19d1ed1b-53db-4333-888d-2da49f00475c-maas-api-tls\") pod \"maas-api-85c57ff49c-jk2c9\" (UID: \"19d1ed1b-53db-4333-888d-2da49f00475c\") " pod="opendatahub/maas-api-85c57ff49c-jk2c9"
Apr 22 16:54:03.590284 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:03.590258 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/19d1ed1b-53db-4333-888d-2da49f00475c-maas-api-tls\") pod \"maas-api-85c57ff49c-jk2c9\" (UID: \"19d1ed1b-53db-4333-888d-2da49f00475c\") " pod="opendatahub/maas-api-85c57ff49c-jk2c9"
Apr 22 16:54:03.597117 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:03.597087 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pxb4q\" (UniqueName: \"kubernetes.io/projected/19d1ed1b-53db-4333-888d-2da49f00475c-kube-api-access-pxb4q\") pod \"maas-api-85c57ff49c-jk2c9\" (UID: \"19d1ed1b-53db-4333-888d-2da49f00475c\") " pod="opendatahub/maas-api-85c57ff49c-jk2c9"
Apr 22 16:54:03.750640 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:03.750543 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-85c57ff49c-jk2c9"
Apr 22 16:54:03.874540 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:03.874516 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["opendatahub/maas-api-85c57ff49c-jk2c9"]
Apr 22 16:54:03.877161 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:54:03.877132 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19d1ed1b_53db_4333_888d_2da49f00475c.slice/crio-7967f065f02fdcbc49b192c4ea2e143d7e0f66f42bdda9ad32de60f14cc88175 WatchSource:0}: Error finding container 7967f065f02fdcbc49b192c4ea2e143d7e0f66f42bdda9ad32de60f14cc88175: Status 404 returned error can't find the container with id 7967f065f02fdcbc49b192c4ea2e143d7e0f66f42bdda9ad32de60f14cc88175
Apr 22 16:54:04.281587 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:04.281551 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-85c57ff49c-jk2c9" event={"ID":"19d1ed1b-53db-4333-888d-2da49f00475c","Type":"ContainerStarted","Data":"7967f065f02fdcbc49b192c4ea2e143d7e0f66f42bdda9ad32de60f14cc88175"}
Apr 22 16:54:06.156218 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.156178 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm"]
Apr 22 16:54:06.159714 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.159691 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm"
Apr 22 16:54:06.162387 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.162309 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"kube-root-ca.crt\""
Apr 22 16:54:06.164257 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.163584 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"default-dockercfg-tjbqg\""
Apr 22 16:54:06.164257 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.163741 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"facebook-opt-125m-simulated-kserve-self-signed-certs\""
Apr 22 16:54:06.164257 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.164010 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"llm\"/\"openshift-service-ca.crt\""
Apr 22 16:54:06.170093 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.170070 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm"]
Apr 22 16:54:06.209869 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.209809 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7jkhd\" (UniqueName: \"kubernetes.io/projected/dcc9d8ce-f795-4f82-871d-9f48e896da4e-kube-api-access-7jkhd\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm\" (UID: \"dcc9d8ce-f795-4f82-871d-9f48e896da4e\")
" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm" Apr 22 16:54:06.210051 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.209887 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dcc9d8ce-f795-4f82-871d-9f48e896da4e-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm\" (UID: \"dcc9d8ce-f795-4f82-871d-9f48e896da4e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm" Apr 22 16:54:06.210051 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.209907 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dcc9d8ce-f795-4f82-871d-9f48e896da4e-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm\" (UID: \"dcc9d8ce-f795-4f82-871d-9f48e896da4e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm" Apr 22 16:54:06.210051 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.209945 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dcc9d8ce-f795-4f82-871d-9f48e896da4e-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm\" (UID: \"dcc9d8ce-f795-4f82-871d-9f48e896da4e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm" Apr 22 16:54:06.210051 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.210010 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dcc9d8ce-f795-4f82-871d-9f48e896da4e-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm\" (UID: \"dcc9d8ce-f795-4f82-871d-9f48e896da4e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm" Apr 22 16:54:06.210051 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.210049 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dcc9d8ce-f795-4f82-871d-9f48e896da4e-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm\" (UID: \"dcc9d8ce-f795-4f82-871d-9f48e896da4e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm" Apr 22 16:54:06.289580 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.289538 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-85c57ff49c-jk2c9" event={"ID":"19d1ed1b-53db-4333-888d-2da49f00475c","Type":"ContainerStarted","Data":"09f58aa8ef48002cfc47caa4650db180eedcaa1a11a8fdd408d0f2f254524057"} Apr 22 16:54:06.289763 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.289660 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="opendatahub/maas-api-85c57ff49c-jk2c9" Apr 22 16:54:06.307131 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.307086 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="opendatahub/maas-api-85c57ff49c-jk2c9" podStartSLOduration=1.459773936 podStartE2EDuration="3.307062989s" podCreationTimestamp="2026-04-22 16:54:03 +0000 UTC" firstStartedPulling="2026-04-22 16:54:03.8787637 +0000 UTC m=+1969.415494390" lastFinishedPulling="2026-04-22 16:54:05.726052753 +0000 UTC m=+1971.262783443" observedRunningTime="2026-04-22 16:54:06.305654613 +0000 UTC m=+1971.842385331" watchObservedRunningTime="2026-04-22 16:54:06.307062989 +0000 UTC m=+1971.843793700" Apr 22 16:54:06.310531 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.310505 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dcc9d8ce-f795-4f82-871d-9f48e896da4e-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm\" (UID: \"dcc9d8ce-f795-4f82-871d-9f48e896da4e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm" Apr 22 16:54:06.310659 
ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.310541 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dcc9d8ce-f795-4f82-871d-9f48e896da4e-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm\" (UID: \"dcc9d8ce-f795-4f82-871d-9f48e896da4e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm" Apr 22 16:54:06.310659 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.310625 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dcc9d8ce-f795-4f82-871d-9f48e896da4e-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm\" (UID: \"dcc9d8ce-f795-4f82-871d-9f48e896da4e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm" Apr 22 16:54:06.310767 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.310674 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dcc9d8ce-f795-4f82-871d-9f48e896da4e-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm\" (UID: \"dcc9d8ce-f795-4f82-871d-9f48e896da4e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm" Apr 22 16:54:06.310767 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.310703 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dcc9d8ce-f795-4f82-871d-9f48e896da4e-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm\" (UID: \"dcc9d8ce-f795-4f82-871d-9f48e896da4e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm" Apr 22 16:54:06.310767 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.310759 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7jkhd\" (UniqueName: 
\"kubernetes.io/projected/dcc9d8ce-f795-4f82-871d-9f48e896da4e-kube-api-access-7jkhd\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm\" (UID: \"dcc9d8ce-f795-4f82-871d-9f48e896da4e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm" Apr 22 16:54:06.310989 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.310959 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dcc9d8ce-f795-4f82-871d-9f48e896da4e-home\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm\" (UID: \"dcc9d8ce-f795-4f82-871d-9f48e896da4e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm" Apr 22 16:54:06.311165 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.311131 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dcc9d8ce-f795-4f82-871d-9f48e896da4e-kserve-provision-location\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm\" (UID: \"dcc9d8ce-f795-4f82-871d-9f48e896da4e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm" Apr 22 16:54:06.311165 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.311156 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dcc9d8ce-f795-4f82-871d-9f48e896da4e-model-cache\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm\" (UID: \"dcc9d8ce-f795-4f82-871d-9f48e896da4e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm" Apr 22 16:54:06.313040 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.313020 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dcc9d8ce-f795-4f82-871d-9f48e896da4e-dshm\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm\" (UID: \"dcc9d8ce-f795-4f82-871d-9f48e896da4e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm" Apr 22 
16:54:06.313323 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.313303 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dcc9d8ce-f795-4f82-871d-9f48e896da4e-tls-certs\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm\" (UID: \"dcc9d8ce-f795-4f82-871d-9f48e896da4e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm" Apr 22 16:54:06.318774 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.318756 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7jkhd\" (UniqueName: \"kubernetes.io/projected/dcc9d8ce-f795-4f82-871d-9f48e896da4e-kube-api-access-7jkhd\") pod \"facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm\" (UID: \"dcc9d8ce-f795-4f82-871d-9f48e896da4e\") " pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm" Apr 22 16:54:06.472870 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.472764 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm" Apr 22 16:54:06.599708 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:06.599670 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm"] Apr 22 16:54:06.604240 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:54:06.604207 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcc9d8ce_f795_4f82_871d_9f48e896da4e.slice/crio-9be722db2cc7e3b02c94fec8da3754e93376dd4bf63704df2fd49248fa01f9b2 WatchSource:0}: Error finding container 9be722db2cc7e3b02c94fec8da3754e93376dd4bf63704df2fd49248fa01f9b2: Status 404 returned error can't find the container with id 9be722db2cc7e3b02c94fec8da3754e93376dd4bf63704df2fd49248fa01f9b2 Apr 22 16:54:07.294343 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:07.294302 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm" event={"ID":"dcc9d8ce-f795-4f82-871d-9f48e896da4e","Type":"ContainerStarted","Data":"9be722db2cc7e3b02c94fec8da3754e93376dd4bf63704df2fd49248fa01f9b2"} Apr 22 16:54:12.301257 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:12.301222 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="opendatahub/maas-api-85c57ff49c-jk2c9" Apr 22 16:54:12.348348 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:12.348309 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-55b6989d99-gt7n2"] Apr 22 16:54:12.348654 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:12.348623 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-api-55b6989d99-gt7n2" podUID="27b80be1-eb39-445a-8282-8d4691c8c5f2" containerName="maas-api" containerID="cri-o://fdb540036af28227d227c986bbafafdd8141f03338284828b15a93b95a49fcfb" gracePeriod=30 Apr 22 16:54:13.690422 ip-10-0-137-144 
kubenswrapper[2578]: I0422 16:54:13.690398 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-api-55b6989d99-gt7n2" Apr 22 16:54:13.784073 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:13.784042 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggfw2\" (UniqueName: \"kubernetes.io/projected/27b80be1-eb39-445a-8282-8d4691c8c5f2-kube-api-access-ggfw2\") pod \"27b80be1-eb39-445a-8282-8d4691c8c5f2\" (UID: \"27b80be1-eb39-445a-8282-8d4691c8c5f2\") " Apr 22 16:54:13.784258 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:13.784147 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/27b80be1-eb39-445a-8282-8d4691c8c5f2-maas-api-tls\") pod \"27b80be1-eb39-445a-8282-8d4691c8c5f2\" (UID: \"27b80be1-eb39-445a-8282-8d4691c8c5f2\") " Apr 22 16:54:13.786187 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:13.786127 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/27b80be1-eb39-445a-8282-8d4691c8c5f2-maas-api-tls" (OuterVolumeSpecName: "maas-api-tls") pod "27b80be1-eb39-445a-8282-8d4691c8c5f2" (UID: "27b80be1-eb39-445a-8282-8d4691c8c5f2"). InnerVolumeSpecName "maas-api-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 16:54:13.786284 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:13.786263 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27b80be1-eb39-445a-8282-8d4691c8c5f2-kube-api-access-ggfw2" (OuterVolumeSpecName: "kube-api-access-ggfw2") pod "27b80be1-eb39-445a-8282-8d4691c8c5f2" (UID: "27b80be1-eb39-445a-8282-8d4691c8c5f2"). InnerVolumeSpecName "kube-api-access-ggfw2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:54:13.885623 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:13.885585 2578 reconciler_common.go:299] "Volume detached for volume \"maas-api-tls\" (UniqueName: \"kubernetes.io/secret/27b80be1-eb39-445a-8282-8d4691c8c5f2-maas-api-tls\") on node \"ip-10-0-137-144.ec2.internal\" DevicePath \"\"" Apr 22 16:54:13.885623 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:13.885618 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ggfw2\" (UniqueName: \"kubernetes.io/projected/27b80be1-eb39-445a-8282-8d4691c8c5f2-kube-api-access-ggfw2\") on node \"ip-10-0-137-144.ec2.internal\" DevicePath \"\"" Apr 22 16:54:13.957097 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:13.957062 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2"] Apr 22 16:54:13.957385 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:13.957373 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="27b80be1-eb39-445a-8282-8d4691c8c5f2" containerName="maas-api" Apr 22 16:54:13.957434 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:13.957387 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="27b80be1-eb39-445a-8282-8d4691c8c5f2" containerName="maas-api" Apr 22 16:54:13.957476 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:13.957445 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="27b80be1-eb39-445a-8282-8d4691c8c5f2" containerName="maas-api" Apr 22 16:54:13.962935 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:13.962906 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2" Apr 22 16:54:13.965641 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:13.965480 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-simulated-kserve-self-signed-certs\"" Apr 22 16:54:13.973909 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:13.973884 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2"] Apr 22 16:54:14.087159 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.087123 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1324db02-b05b-4630-94b8-4d288ba0a670-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2\" (UID: \"1324db02-b05b-4630-94b8-4d288ba0a670\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2" Apr 22 16:54:14.087360 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.087169 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1324db02-b05b-4630-94b8-4d288ba0a670-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2\" (UID: \"1324db02-b05b-4630-94b8-4d288ba0a670\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2" Apr 22 16:54:14.087360 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.087189 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6hff\" (UniqueName: \"kubernetes.io/projected/1324db02-b05b-4630-94b8-4d288ba0a670-kube-api-access-n6hff\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2\" (UID: \"1324db02-b05b-4630-94b8-4d288ba0a670\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2" Apr 22 16:54:14.087360 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.087251 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1324db02-b05b-4630-94b8-4d288ba0a670-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2\" (UID: \"1324db02-b05b-4630-94b8-4d288ba0a670\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2" Apr 22 16:54:14.087360 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.087268 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1324db02-b05b-4630-94b8-4d288ba0a670-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2\" (UID: \"1324db02-b05b-4630-94b8-4d288ba0a670\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2" Apr 22 16:54:14.087360 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.087333 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1324db02-b05b-4630-94b8-4d288ba0a670-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2\" (UID: \"1324db02-b05b-4630-94b8-4d288ba0a670\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2" Apr 22 16:54:14.188565 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.188524 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1324db02-b05b-4630-94b8-4d288ba0a670-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2\" (UID: \"1324db02-b05b-4630-94b8-4d288ba0a670\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2" Apr 22 16:54:14.188565 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.188568 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1324db02-b05b-4630-94b8-4d288ba0a670-kserve-provision-location\") pod 
\"e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2\" (UID: \"1324db02-b05b-4630-94b8-4d288ba0a670\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2" Apr 22 16:54:14.188833 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.188603 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1324db02-b05b-4630-94b8-4d288ba0a670-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2\" (UID: \"1324db02-b05b-4630-94b8-4d288ba0a670\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2" Apr 22 16:54:14.188833 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.188652 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1324db02-b05b-4630-94b8-4d288ba0a670-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2\" (UID: \"1324db02-b05b-4630-94b8-4d288ba0a670\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2" Apr 22 16:54:14.188833 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.188690 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1324db02-b05b-4630-94b8-4d288ba0a670-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2\" (UID: \"1324db02-b05b-4630-94b8-4d288ba0a670\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2" Apr 22 16:54:14.188833 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.188715 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n6hff\" (UniqueName: \"kubernetes.io/projected/1324db02-b05b-4630-94b8-4d288ba0a670-kube-api-access-n6hff\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2\" (UID: \"1324db02-b05b-4630-94b8-4d288ba0a670\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2" Apr 22 16:54:14.189064 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.188974 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1324db02-b05b-4630-94b8-4d288ba0a670-model-cache\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2\" (UID: \"1324db02-b05b-4630-94b8-4d288ba0a670\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2" Apr 22 16:54:14.189120 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.189054 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1324db02-b05b-4630-94b8-4d288ba0a670-kserve-provision-location\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2\" (UID: \"1324db02-b05b-4630-94b8-4d288ba0a670\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2" Apr 22 16:54:14.189213 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.189191 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1324db02-b05b-4630-94b8-4d288ba0a670-home\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2\" (UID: \"1324db02-b05b-4630-94b8-4d288ba0a670\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2" Apr 22 16:54:14.191008 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.190990 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1324db02-b05b-4630-94b8-4d288ba0a670-dshm\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2\" (UID: \"1324db02-b05b-4630-94b8-4d288ba0a670\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2" Apr 22 16:54:14.191316 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.191293 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1324db02-b05b-4630-94b8-4d288ba0a670-tls-certs\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2\" (UID: \"1324db02-b05b-4630-94b8-4d288ba0a670\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2" 
Apr 22 16:54:14.200025 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.199989 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6hff\" (UniqueName: \"kubernetes.io/projected/1324db02-b05b-4630-94b8-4d288ba0a670-kube-api-access-n6hff\") pod \"e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2\" (UID: \"1324db02-b05b-4630-94b8-4d288ba0a670\") " pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2" Apr 22 16:54:14.275072 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.275037 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2" Apr 22 16:54:14.322462 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.322279 2578 generic.go:358] "Generic (PLEG): container finished" podID="27b80be1-eb39-445a-8282-8d4691c8c5f2" containerID="fdb540036af28227d227c986bbafafdd8141f03338284828b15a93b95a49fcfb" exitCode=0 Apr 22 16:54:14.322462 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.322374 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-api-55b6989d99-gt7n2" Apr 22 16:54:14.322462 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.322412 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-55b6989d99-gt7n2" event={"ID":"27b80be1-eb39-445a-8282-8d4691c8c5f2","Type":"ContainerDied","Data":"fdb540036af28227d227c986bbafafdd8141f03338284828b15a93b95a49fcfb"} Apr 22 16:54:14.322462 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.322446 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-api-55b6989d99-gt7n2" event={"ID":"27b80be1-eb39-445a-8282-8d4691c8c5f2","Type":"ContainerDied","Data":"bb68ca74ae9e42e10aac79272154ab430eaeb379c0ce21d44ff1fdfcb825ec68"} Apr 22 16:54:14.322462 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.322469 2578 scope.go:117] "RemoveContainer" containerID="fdb540036af28227d227c986bbafafdd8141f03338284828b15a93b95a49fcfb" Apr 22 16:54:14.327835 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.326235 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm" event={"ID":"dcc9d8ce-f795-4f82-871d-9f48e896da4e","Type":"ContainerStarted","Data":"32695610655b230157c5c37d430b5a0f7c419804f2aeec85abc0505d9d8967fd"} Apr 22 16:54:14.336138 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.336110 2578 scope.go:117] "RemoveContainer" containerID="fdb540036af28227d227c986bbafafdd8141f03338284828b15a93b95a49fcfb" Apr 22 16:54:14.336518 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:54:14.336440 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdb540036af28227d227c986bbafafdd8141f03338284828b15a93b95a49fcfb\": container with ID starting with fdb540036af28227d227c986bbafafdd8141f03338284828b15a93b95a49fcfb not found: ID does not exist" containerID="fdb540036af28227d227c986bbafafdd8141f03338284828b15a93b95a49fcfb" Apr 22 16:54:14.336518 ip-10-0-137-144 
kubenswrapper[2578]: I0422 16:54:14.336476 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdb540036af28227d227c986bbafafdd8141f03338284828b15a93b95a49fcfb"} err="failed to get container status \"fdb540036af28227d227c986bbafafdd8141f03338284828b15a93b95a49fcfb\": rpc error: code = NotFound desc = could not find container \"fdb540036af28227d227c986bbafafdd8141f03338284828b15a93b95a49fcfb\": container with ID starting with fdb540036af28227d227c986bbafafdd8141f03338284828b15a93b95a49fcfb not found: ID does not exist"
Apr 22 16:54:14.390057 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.390021 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-api-55b6989d99-gt7n2"]
Apr 22 16:54:14.392245 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.392216 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-api-55b6989d99-gt7n2"]
Apr 22 16:54:14.427453 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:14.427424 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2"]
Apr 22 16:54:14.429562 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:54:14.429524 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1324db02_b05b_4630_94b8_4d288ba0a670.slice/crio-fd5bf6089842ead4757be898aa4e2080af7cd21d368ac8c9ffe59f4e09275383 WatchSource:0}: Error finding container fd5bf6089842ead4757be898aa4e2080af7cd21d368ac8c9ffe59f4e09275383: Status 404 returned error can't find the container with id fd5bf6089842ead4757be898aa4e2080af7cd21d368ac8c9ffe59f4e09275383
Apr 22 16:54:15.022316 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:15.022275 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27b80be1-eb39-445a-8282-8d4691c8c5f2" path="/var/lib/kubelet/pods/27b80be1-eb39-445a-8282-8d4691c8c5f2/volumes"
Apr 22 16:54:15.330726 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:15.330545 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2" event={"ID":"1324db02-b05b-4630-94b8-4d288ba0a670","Type":"ContainerStarted","Data":"e5110fa646c3be4cfe3990d19cfe736b72e76cbc382e315f2aae776c0e918f6c"}
Apr 22 16:54:15.330933 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:15.330735 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2" event={"ID":"1324db02-b05b-4630-94b8-4d288ba0a670","Type":"ContainerStarted","Data":"fd5bf6089842ead4757be898aa4e2080af7cd21d368ac8c9ffe59f4e09275383"}
Apr 22 16:54:20.349520 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:20.349484 2578 generic.go:358] "Generic (PLEG): container finished" podID="1324db02-b05b-4630-94b8-4d288ba0a670" containerID="e5110fa646c3be4cfe3990d19cfe736b72e76cbc382e315f2aae776c0e918f6c" exitCode=0
Apr 22 16:54:20.350042 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:20.349537 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2" event={"ID":"1324db02-b05b-4630-94b8-4d288ba0a670","Type":"ContainerDied","Data":"e5110fa646c3be4cfe3990d19cfe736b72e76cbc382e315f2aae776c0e918f6c"}
Apr 22 16:54:20.351165 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:20.351145 2578 generic.go:358] "Generic (PLEG): container finished" podID="dcc9d8ce-f795-4f82-871d-9f48e896da4e" containerID="32695610655b230157c5c37d430b5a0f7c419804f2aeec85abc0505d9d8967fd" exitCode=0
Apr 22 16:54:20.351289 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:20.351200 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm" event={"ID":"dcc9d8ce-f795-4f82-871d-9f48e896da4e","Type":"ContainerDied","Data":"32695610655b230157c5c37d430b5a0f7c419804f2aeec85abc0505d9d8967fd"}
Apr 22 16:54:22.361371 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:22.361286 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2" event={"ID":"1324db02-b05b-4630-94b8-4d288ba0a670","Type":"ContainerStarted","Data":"915a9b61759fb4dfab4100ded1f6d9b6c550f4d4f7850b1b895275ae19713a92"}
Apr 22 16:54:22.362005 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:22.361969 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2"
Apr 22 16:54:22.363963 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:22.363937 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm" event={"ID":"dcc9d8ce-f795-4f82-871d-9f48e896da4e","Type":"ContainerStarted","Data":"fa4f2d10101c101503bc804c6f9e8e73f9b9803c6c89c0f5c301a158acaab4b2"}
Apr 22 16:54:22.364454 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:22.364438 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm"
Apr 22 16:54:22.386115 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:22.386061 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2" podStartSLOduration=8.076287506 podStartE2EDuration="9.386048331s" podCreationTimestamp="2026-04-22 16:54:13 +0000 UTC" firstStartedPulling="2026-04-22 16:54:20.350322402 +0000 UTC m=+1985.887053093" lastFinishedPulling="2026-04-22 16:54:21.660083222 +0000 UTC m=+1987.196813918" observedRunningTime="2026-04-22 16:54:22.385713918 +0000 UTC m=+1987.922444632" watchObservedRunningTime="2026-04-22 16:54:22.386048331 +0000 UTC m=+1987.922779043"
Apr 22 16:54:22.405260 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:22.405203 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm" podStartSLOduration=1.355834964 podStartE2EDuration="16.405188623s" podCreationTimestamp="2026-04-22 16:54:06 +0000 UTC" firstStartedPulling="2026-04-22 16:54:06.606040763 +0000 UTC m=+1972.142771452" lastFinishedPulling="2026-04-22 16:54:21.655394407 +0000 UTC m=+1987.192125111" observedRunningTime="2026-04-22 16:54:22.40375769 +0000 UTC m=+1987.940488403" watchObservedRunningTime="2026-04-22 16:54:22.405188623 +0000 UTC m=+1987.941919334"
Apr 22 16:54:33.381410 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:33.381377 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm"
Apr 22 16:54:33.382329 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:33.382304 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2"
Apr 22 16:54:33.611611 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:33.611575 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb"]
Apr 22 16:54:33.616020 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:33.616002 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb"
Apr 22 16:54:33.618479 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:33.618458 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"premium-simulated-simulated-premium-kserve-self-signed-certs\""
Apr 22 16:54:33.624899 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:33.624825 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb"]
Apr 22 16:54:33.755063 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:33.754981 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3b01e8df-84a6-4e7e-b822-639a88e25d51-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb\" (UID: \"3b01e8df-84a6-4e7e-b822-639a88e25d51\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb"
Apr 22 16:54:33.755063 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:33.755020 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b01e8df-84a6-4e7e-b822-639a88e25d51-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb\" (UID: \"3b01e8df-84a6-4e7e-b822-639a88e25d51\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb"
Apr 22 16:54:33.755306 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:33.755138 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3b01e8df-84a6-4e7e-b822-639a88e25d51-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb\" (UID: \"3b01e8df-84a6-4e7e-b822-639a88e25d51\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb"
Apr 22 16:54:33.755306 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:33.755184 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3b01e8df-84a6-4e7e-b822-639a88e25d51-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb\" (UID: \"3b01e8df-84a6-4e7e-b822-639a88e25d51\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb"
Apr 22 16:54:33.755306 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:33.755211 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgcwz\" (UniqueName: \"kubernetes.io/projected/3b01e8df-84a6-4e7e-b822-639a88e25d51-kube-api-access-fgcwz\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb\" (UID: \"3b01e8df-84a6-4e7e-b822-639a88e25d51\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb"
Apr 22 16:54:33.755306 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:33.755245 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3b01e8df-84a6-4e7e-b822-639a88e25d51-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb\" (UID: \"3b01e8df-84a6-4e7e-b822-639a88e25d51\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb"
Apr 22 16:54:33.855875 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:33.855810 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3b01e8df-84a6-4e7e-b822-639a88e25d51-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb\" (UID: \"3b01e8df-84a6-4e7e-b822-639a88e25d51\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb"
Apr 22 16:54:33.855875 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:33.855863 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fgcwz\" (UniqueName: \"kubernetes.io/projected/3b01e8df-84a6-4e7e-b822-639a88e25d51-kube-api-access-fgcwz\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb\" (UID: \"3b01e8df-84a6-4e7e-b822-639a88e25d51\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb"
Apr 22 16:54:33.856132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:33.855890 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3b01e8df-84a6-4e7e-b822-639a88e25d51-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb\" (UID: \"3b01e8df-84a6-4e7e-b822-639a88e25d51\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb"
Apr 22 16:54:33.856132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:33.855933 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3b01e8df-84a6-4e7e-b822-639a88e25d51-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb\" (UID: \"3b01e8df-84a6-4e7e-b822-639a88e25d51\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb"
Apr 22 16:54:33.856132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:33.855956 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b01e8df-84a6-4e7e-b822-639a88e25d51-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb\" (UID: \"3b01e8df-84a6-4e7e-b822-639a88e25d51\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb"
Apr 22 16:54:33.856132 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:33.856013 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3b01e8df-84a6-4e7e-b822-639a88e25d51-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb\" (UID: \"3b01e8df-84a6-4e7e-b822-639a88e25d51\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb"
Apr 22 16:54:33.856365 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:33.856333 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3b01e8df-84a6-4e7e-b822-639a88e25d51-model-cache\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb\" (UID: \"3b01e8df-84a6-4e7e-b822-639a88e25d51\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb"
Apr 22 16:54:33.856478 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:33.856360 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3b01e8df-84a6-4e7e-b822-639a88e25d51-kserve-provision-location\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb\" (UID: \"3b01e8df-84a6-4e7e-b822-639a88e25d51\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb"
Apr 22 16:54:33.856478 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:33.856344 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3b01e8df-84a6-4e7e-b822-639a88e25d51-home\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb\" (UID: \"3b01e8df-84a6-4e7e-b822-639a88e25d51\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb"
Apr 22 16:54:33.858103 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:33.858078 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3b01e8df-84a6-4e7e-b822-639a88e25d51-dshm\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb\" (UID: \"3b01e8df-84a6-4e7e-b822-639a88e25d51\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb"
Apr 22 16:54:33.858356 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:33.858337 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3b01e8df-84a6-4e7e-b822-639a88e25d51-tls-certs\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb\" (UID: \"3b01e8df-84a6-4e7e-b822-639a88e25d51\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb"
Apr 22 16:54:33.865523 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:33.865493 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgcwz\" (UniqueName: \"kubernetes.io/projected/3b01e8df-84a6-4e7e-b822-639a88e25d51-kube-api-access-fgcwz\") pod \"premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb\" (UID: \"3b01e8df-84a6-4e7e-b822-639a88e25d51\") " pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb"
Apr 22 16:54:33.926528 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:33.926483 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb"
Apr 22 16:54:34.063024 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:34.062548 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb"]
Apr 22 16:54:34.068192 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:54:34.068144 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b01e8df_84a6_4e7e_b822_639a88e25d51.slice/crio-b95c68ec2472d2b07f2f49749cbd904470b0e364595c4a1bcc61a1aca588138e WatchSource:0}: Error finding container b95c68ec2472d2b07f2f49749cbd904470b0e364595c4a1bcc61a1aca588138e: Status 404 returned error can't find the container with id b95c68ec2472d2b07f2f49749cbd904470b0e364595c4a1bcc61a1aca588138e
Apr 22 16:54:34.069481 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:34.069462 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 16:54:34.409096 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:34.409056 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb" event={"ID":"3b01e8df-84a6-4e7e-b822-639a88e25d51","Type":"ContainerStarted","Data":"abcd29c307ab19152e6cb78e565ad3b8eac34a11cbfd875334f30b0435054979"}
Apr 22 16:54:34.409096 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:34.409095 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb" event={"ID":"3b01e8df-84a6-4e7e-b822-639a88e25d51","Type":"ContainerStarted","Data":"b95c68ec2472d2b07f2f49749cbd904470b0e364595c4a1bcc61a1aca588138e"}
Apr 22 16:54:39.120502 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:39.120468 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb"]
Apr 22 16:54:39.124887 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:39.124870 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb"
Apr 22 16:54:39.127416 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:39.127393 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-unab60ef4d3a239b5143b412cab04acac3-kserve-self-signed-certs\""
Apr 22 16:54:39.135695 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:39.135672 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb"]
Apr 22 16:54:39.303256 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:39.303210 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7a26cf43-cd8b-4a4c-9776-417eb13cf1a1-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb\" (UID: \"7a26cf43-cd8b-4a4c-9776-417eb13cf1a1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb"
Apr 22 16:54:39.303256 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:39.303287 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7a26cf43-cd8b-4a4c-9776-417eb13cf1a1-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb\" (UID: \"7a26cf43-cd8b-4a4c-9776-417eb13cf1a1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb"
Apr 22 16:54:39.303808 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:39.303363 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkz25\" (UniqueName: \"kubernetes.io/projected/7a26cf43-cd8b-4a4c-9776-417eb13cf1a1-kube-api-access-dkz25\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb\" (UID: \"7a26cf43-cd8b-4a4c-9776-417eb13cf1a1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb"
Apr 22 16:54:39.303808 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:39.303577 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7a26cf43-cd8b-4a4c-9776-417eb13cf1a1-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb\" (UID: \"7a26cf43-cd8b-4a4c-9776-417eb13cf1a1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb"
Apr 22 16:54:39.303808 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:39.303689 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7a26cf43-cd8b-4a4c-9776-417eb13cf1a1-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb\" (UID: \"7a26cf43-cd8b-4a4c-9776-417eb13cf1a1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb"
Apr 22 16:54:39.303808 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:39.303729 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7a26cf43-cd8b-4a4c-9776-417eb13cf1a1-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb\" (UID: \"7a26cf43-cd8b-4a4c-9776-417eb13cf1a1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb"
Apr 22 16:54:39.404308 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:39.404226 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7a26cf43-cd8b-4a4c-9776-417eb13cf1a1-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb\" (UID: \"7a26cf43-cd8b-4a4c-9776-417eb13cf1a1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb"
Apr 22 16:54:39.404308 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:39.404271 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkz25\" (UniqueName: \"kubernetes.io/projected/7a26cf43-cd8b-4a4c-9776-417eb13cf1a1-kube-api-access-dkz25\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb\" (UID: \"7a26cf43-cd8b-4a4c-9776-417eb13cf1a1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb"
Apr 22 16:54:39.404534 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:39.404320 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7a26cf43-cd8b-4a4c-9776-417eb13cf1a1-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb\" (UID: \"7a26cf43-cd8b-4a4c-9776-417eb13cf1a1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb"
Apr 22 16:54:39.404534 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:39.404378 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7a26cf43-cd8b-4a4c-9776-417eb13cf1a1-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb\" (UID: \"7a26cf43-cd8b-4a4c-9776-417eb13cf1a1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb"
Apr 22 16:54:39.404534 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:39.404407 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7a26cf43-cd8b-4a4c-9776-417eb13cf1a1-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb\" (UID: \"7a26cf43-cd8b-4a4c-9776-417eb13cf1a1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb"
Apr 22 16:54:39.404728 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:39.404570 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7a26cf43-cd8b-4a4c-9776-417eb13cf1a1-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb\" (UID: \"7a26cf43-cd8b-4a4c-9776-417eb13cf1a1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb"
Apr 22 16:54:39.404728 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:39.404624 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7a26cf43-cd8b-4a4c-9776-417eb13cf1a1-kserve-provision-location\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb\" (UID: \"7a26cf43-cd8b-4a4c-9776-417eb13cf1a1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb"
Apr 22 16:54:39.404728 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:39.404707 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7a26cf43-cd8b-4a4c-9776-417eb13cf1a1-home\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb\" (UID: \"7a26cf43-cd8b-4a4c-9776-417eb13cf1a1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb"
Apr 22 16:54:39.404991 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:39.404966 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7a26cf43-cd8b-4a4c-9776-417eb13cf1a1-model-cache\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb\" (UID: \"7a26cf43-cd8b-4a4c-9776-417eb13cf1a1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb"
Apr 22 16:54:39.406746 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:39.406702 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7a26cf43-cd8b-4a4c-9776-417eb13cf1a1-dshm\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb\" (UID: \"7a26cf43-cd8b-4a4c-9776-417eb13cf1a1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb"
Apr 22 16:54:39.407284 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:39.407017 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7a26cf43-cd8b-4a4c-9776-417eb13cf1a1-tls-certs\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb\" (UID: \"7a26cf43-cd8b-4a4c-9776-417eb13cf1a1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb"
Apr 22 16:54:39.412384 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:39.412357 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkz25\" (UniqueName: \"kubernetes.io/projected/7a26cf43-cd8b-4a4c-9776-417eb13cf1a1-kube-api-access-dkz25\") pod \"e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb\" (UID: \"7a26cf43-cd8b-4a4c-9776-417eb13cf1a1\") " pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb"
Apr 22 16:54:39.435976 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:39.435945 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb"
Apr 22 16:54:39.588202 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:39.588168 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb"]
Apr 22 16:54:39.594041 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:54:39.594006 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a26cf43_cd8b_4a4c_9776_417eb13cf1a1.slice/crio-82ef432be3a4d3cb4525f4f89230905fa7ae6bffbf6467f4bc31344912c37eda WatchSource:0}: Error finding container 82ef432be3a4d3cb4525f4f89230905fa7ae6bffbf6467f4bc31344912c37eda: Status 404 returned error can't find the container with id 82ef432be3a4d3cb4525f4f89230905fa7ae6bffbf6467f4bc31344912c37eda
Apr 22 16:54:40.431142 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:40.431091 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb" event={"ID":"7a26cf43-cd8b-4a4c-9776-417eb13cf1a1","Type":"ContainerStarted","Data":"c563ca90db5c414785c66d1b37d64b8e5a92c887ac6cea780d6c0f77c6092a97"}
Apr 22 16:54:40.431142 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:40.431147 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb" event={"ID":"7a26cf43-cd8b-4a4c-9776-417eb13cf1a1","Type":"ContainerStarted","Data":"82ef432be3a4d3cb4525f4f89230905fa7ae6bffbf6467f4bc31344912c37eda"}
Apr 22 16:54:41.809658 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:41.809601 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d"]
Apr 22 16:54:41.816904 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:41.816873 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d"
Apr 22 16:54:41.819612 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:41.819562 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-trlp-test-simulated-kserve-self-signed-certs\""
Apr 22 16:54:41.825118 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:41.825082 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d"]
Apr 22 16:54:41.928828 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:41.928785 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a690eff3-c68d-409d-b98f-56f546e296a2-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-zjf2d\" (UID: \"a690eff3-c68d-409d-b98f-56f546e296a2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d"
Apr 22 16:54:41.929011 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:41.928838 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a690eff3-c68d-409d-b98f-56f546e296a2-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-zjf2d\" (UID: \"a690eff3-c68d-409d-b98f-56f546e296a2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d"
Apr 22 16:54:41.929011 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:41.928921 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ttvg\" (UniqueName: \"kubernetes.io/projected/a690eff3-c68d-409d-b98f-56f546e296a2-kube-api-access-2ttvg\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-zjf2d\" (UID: \"a690eff3-c68d-409d-b98f-56f546e296a2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d"
Apr 22 16:54:41.929011 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:41.928950 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a690eff3-c68d-409d-b98f-56f546e296a2-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-zjf2d\" (UID: \"a690eff3-c68d-409d-b98f-56f546e296a2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d"
Apr 22 16:54:41.929155 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:41.929030 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a690eff3-c68d-409d-b98f-56f546e296a2-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-zjf2d\" (UID: \"a690eff3-c68d-409d-b98f-56f546e296a2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d"
Apr 22 16:54:41.929155 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:41.929071 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a690eff3-c68d-409d-b98f-56f546e296a2-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-zjf2d\" (UID: \"a690eff3-c68d-409d-b98f-56f546e296a2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d"
Apr 22 16:54:42.030069 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:42.030021 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a690eff3-c68d-409d-b98f-56f546e296a2-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-zjf2d\" (UID: \"a690eff3-c68d-409d-b98f-56f546e296a2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d"
Apr 22 16:54:42.030268 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:42.030079 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a690eff3-c68d-409d-b98f-56f546e296a2-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-zjf2d\" (UID: \"a690eff3-c68d-409d-b98f-56f546e296a2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d"
Apr 22 16:54:42.030268 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:42.030150 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a690eff3-c68d-409d-b98f-56f546e296a2-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-zjf2d\" (UID: \"a690eff3-c68d-409d-b98f-56f546e296a2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d"
Apr 22 16:54:42.030268 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:42.030174 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a690eff3-c68d-409d-b98f-56f546e296a2-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-zjf2d\" (UID: \"a690eff3-c68d-409d-b98f-56f546e296a2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d"
Apr 22 16:54:42.030268 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:42.030223 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2ttvg\" (UniqueName: \"kubernetes.io/projected/a690eff3-c68d-409d-b98f-56f546e296a2-kube-api-access-2ttvg\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-zjf2d\" (UID: \"a690eff3-c68d-409d-b98f-56f546e296a2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d"
Apr 22 16:54:42.030471 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:42.030329 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a690eff3-c68d-409d-b98f-56f546e296a2-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-zjf2d\" (UID: \"a690eff3-c68d-409d-b98f-56f546e296a2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d"
Apr 22 16:54:42.030959 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:42.030928 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a690eff3-c68d-409d-b98f-56f546e296a2-model-cache\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-zjf2d\" (UID: \"a690eff3-c68d-409d-b98f-56f546e296a2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d"
Apr 22 16:54:42.031068 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:42.030998 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a690eff3-c68d-409d-b98f-56f546e296a2-kserve-provision-location\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-zjf2d\" (UID: \"a690eff3-c68d-409d-b98f-56f546e296a2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d"
Apr 22 16:54:42.031343 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:42.031314 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a690eff3-c68d-409d-b98f-56f546e296a2-home\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-zjf2d\" (UID: \"a690eff3-c68d-409d-b98f-56f546e296a2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d"
Apr 22 16:54:42.033343 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:42.033320 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a690eff3-c68d-409d-b98f-56f546e296a2-tls-certs\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-zjf2d\" (UID: \"a690eff3-c68d-409d-b98f-56f546e296a2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d"
Apr 22 16:54:42.033904 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:42.033884 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a690eff3-c68d-409d-b98f-56f546e296a2-dshm\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-zjf2d\" (UID: \"a690eff3-c68d-409d-b98f-56f546e296a2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d"
Apr 22 16:54:42.040912 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:42.040879 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ttvg\" (UniqueName: \"kubernetes.io/projected/a690eff3-c68d-409d-b98f-56f546e296a2-kube-api-access-2ttvg\") pod \"e2e-trlp-test-simulated-kserve-84db68679b-zjf2d\" (UID: \"a690eff3-c68d-409d-b98f-56f546e296a2\") " pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d"
Apr 22 16:54:42.133532 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:42.133496 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d"
Apr 22 16:54:42.313289 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:42.313066 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d"]
Apr 22 16:54:42.315700 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:54:42.315635 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda690eff3_c68d_409d_b98f_56f546e296a2.slice/crio-d7e8a3e3611642501a8c74ce76f9abf50726b995e12f053f17716dbb2d1b9064 WatchSource:0}: Error finding container d7e8a3e3611642501a8c74ce76f9abf50726b995e12f053f17716dbb2d1b9064: Status 404 returned error can't find the container with id d7e8a3e3611642501a8c74ce76f9abf50726b995e12f053f17716dbb2d1b9064
Apr 22 16:54:42.439688 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:42.439613 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d" event={"ID":"a690eff3-c68d-409d-b98f-56f546e296a2","Type":"ContainerStarted","Data":"63ce8ddf1e42d55611136beee0066e7e1f6cee9423191fa475909b18243d9dbe"}
Apr 22 16:54:42.439688 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:42.439648 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d" event={"ID":"a690eff3-c68d-409d-b98f-56f546e296a2","Type":"ContainerStarted","Data":"d7e8a3e3611642501a8c74ce76f9abf50726b995e12f053f17716dbb2d1b9064"}
Apr 22 16:54:43.445246 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:43.445141 2578 generic.go:358] "Generic (PLEG): container finished" podID="3b01e8df-84a6-4e7e-b822-639a88e25d51" containerID="abcd29c307ab19152e6cb78e565ad3b8eac34a11cbfd875334f30b0435054979" exitCode=0
Apr 22 16:54:43.445777 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:43.445230 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb" event={"ID":"3b01e8df-84a6-4e7e-b822-639a88e25d51","Type":"ContainerDied","Data":"abcd29c307ab19152e6cb78e565ad3b8eac34a11cbfd875334f30b0435054979"}
Apr 22 16:54:44.451773 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:44.451729 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb" event={"ID":"3b01e8df-84a6-4e7e-b822-639a88e25d51","Type":"ContainerStarted","Data":"6efc96b6233e8607dc5b1368c3c41b54a9554a99565cd3dea4fc43e9e0aee37d"}
Apr 22 16:54:44.452285 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:44.451959 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb"
Apr 22 16:54:44.473488 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:44.473425 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb" podStartSLOduration=11.24238892 podStartE2EDuration="11.473405424s" podCreationTimestamp="2026-04-22 16:54:33 +0000 UTC" firstStartedPulling="2026-04-22 16:54:43.446200878 +0000 UTC m=+2008.982931587" lastFinishedPulling="2026-04-22 16:54:43.677217387 +0000 UTC m=+2009.213948091" observedRunningTime="2026-04-22 16:54:44.472324673 +0000 UTC
m=+2010.009055386" watchObservedRunningTime="2026-04-22 16:54:44.473405424 +0000 UTC m=+2010.010136137" Apr 22 16:54:46.459904 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:46.459868 2578 generic.go:358] "Generic (PLEG): container finished" podID="7a26cf43-cd8b-4a4c-9776-417eb13cf1a1" containerID="c563ca90db5c414785c66d1b37d64b8e5a92c887ac6cea780d6c0f77c6092a97" exitCode=0 Apr 22 16:54:46.460381 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:46.459935 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb" event={"ID":"7a26cf43-cd8b-4a4c-9776-417eb13cf1a1","Type":"ContainerDied","Data":"c563ca90db5c414785c66d1b37d64b8e5a92c887ac6cea780d6c0f77c6092a97"} Apr 22 16:54:47.467179 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:47.467142 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb" event={"ID":"7a26cf43-cd8b-4a4c-9776-417eb13cf1a1","Type":"ContainerStarted","Data":"24f36d5659135937b02924aa7afa236aea3bb0084d481bce1698c9f4872537e5"} Apr 22 16:54:47.467882 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:47.467830 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb" Apr 22 16:54:47.488551 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:47.488498 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb" podStartSLOduration=8.112276799 podStartE2EDuration="8.488469577s" podCreationTimestamp="2026-04-22 16:54:39 +0000 UTC" firstStartedPulling="2026-04-22 16:54:46.460678227 +0000 UTC m=+2011.997408917" lastFinishedPulling="2026-04-22 16:54:46.836870991 +0000 UTC m=+2012.373601695" observedRunningTime="2026-04-22 16:54:47.487257195 +0000 UTC m=+2013.023987928" watchObservedRunningTime="2026-04-22 16:54:47.488469577 +0000 UTC 
m=+2013.025200289" Apr 22 16:54:48.471948 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:48.471856 2578 generic.go:358] "Generic (PLEG): container finished" podID="a690eff3-c68d-409d-b98f-56f546e296a2" containerID="63ce8ddf1e42d55611136beee0066e7e1f6cee9423191fa475909b18243d9dbe" exitCode=0 Apr 22 16:54:48.471948 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:48.471922 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d" event={"ID":"a690eff3-c68d-409d-b98f-56f546e296a2","Type":"ContainerDied","Data":"63ce8ddf1e42d55611136beee0066e7e1f6cee9423191fa475909b18243d9dbe"} Apr 22 16:54:49.477207 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:49.477167 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d" event={"ID":"a690eff3-c68d-409d-b98f-56f546e296a2","Type":"ContainerStarted","Data":"73683a80a78c539564a36ab473a147543c291dde5fd80ecb0632df6517932ee7"} Apr 22 16:54:49.477625 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:49.477396 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d" Apr 22 16:54:49.497552 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:49.497500 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d" podStartSLOduration=8.244116582 podStartE2EDuration="8.497484226s" podCreationTimestamp="2026-04-22 16:54:41 +0000 UTC" firstStartedPulling="2026-04-22 16:54:48.472759743 +0000 UTC m=+2014.009490438" lastFinishedPulling="2026-04-22 16:54:48.726127374 +0000 UTC m=+2014.262858082" observedRunningTime="2026-04-22 16:54:49.495095202 +0000 UTC m=+2015.031825937" watchObservedRunningTime="2026-04-22 16:54:49.497484226 +0000 UTC m=+2015.034214938" Apr 22 16:54:55.473746 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:55.473711 2578 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="llm/premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb" Apr 22 16:54:58.484728 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:54:58.484691 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb" Apr 22 16:55:00.493882 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:00.493825 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-trlp-test-simulated-kserve-84db68679b-zjf2d" Apr 22 16:55:10.307201 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:10.307161 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt"] Apr 22 16:55:10.312543 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:10.312519 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt" Apr 22 16:55:10.314944 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:10.314923 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"llm\"/\"e2e-distinct-2-simulated-kserve-self-signed-certs\"" Apr 22 16:55:10.321453 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:10.321429 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt"] Apr 22 16:55:10.377677 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:10.377644 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3f3a8f38-df67-4483-856e-9fe303820c63-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt\" (UID: \"3f3a8f38-df67-4483-856e-9fe303820c63\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt" Apr 22 16:55:10.377837 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:10.377675 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-546gj\" (UniqueName: \"kubernetes.io/projected/3f3a8f38-df67-4483-856e-9fe303820c63-kube-api-access-546gj\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt\" (UID: \"3f3a8f38-df67-4483-856e-9fe303820c63\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt" Apr 22 16:55:10.377837 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:10.377751 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3f3a8f38-df67-4483-856e-9fe303820c63-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt\" (UID: \"3f3a8f38-df67-4483-856e-9fe303820c63\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt" Apr 22 16:55:10.377837 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:10.377781 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3f3a8f38-df67-4483-856e-9fe303820c63-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt\" (UID: \"3f3a8f38-df67-4483-856e-9fe303820c63\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt" Apr 22 16:55:10.377837 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:10.377804 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f3a8f38-df67-4483-856e-9fe303820c63-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt\" (UID: \"3f3a8f38-df67-4483-856e-9fe303820c63\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt" Apr 22 16:55:10.378066 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:10.377875 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3f3a8f38-df67-4483-856e-9fe303820c63-home\") pod 
\"e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt\" (UID: \"3f3a8f38-df67-4483-856e-9fe303820c63\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt" Apr 22 16:55:10.479180 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:10.479137 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3f3a8f38-df67-4483-856e-9fe303820c63-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt\" (UID: \"3f3a8f38-df67-4483-856e-9fe303820c63\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt" Apr 22 16:55:10.479180 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:10.479183 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-546gj\" (UniqueName: \"kubernetes.io/projected/3f3a8f38-df67-4483-856e-9fe303820c63-kube-api-access-546gj\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt\" (UID: \"3f3a8f38-df67-4483-856e-9fe303820c63\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt" Apr 22 16:55:10.479420 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:10.479234 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3f3a8f38-df67-4483-856e-9fe303820c63-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt\" (UID: \"3f3a8f38-df67-4483-856e-9fe303820c63\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt" Apr 22 16:55:10.479420 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:10.479257 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3f3a8f38-df67-4483-856e-9fe303820c63-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt\" (UID: \"3f3a8f38-df67-4483-856e-9fe303820c63\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt" Apr 22 16:55:10.479420 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:10.479285 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f3a8f38-df67-4483-856e-9fe303820c63-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt\" (UID: \"3f3a8f38-df67-4483-856e-9fe303820c63\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt" Apr 22 16:55:10.479420 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:10.479352 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3f3a8f38-df67-4483-856e-9fe303820c63-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt\" (UID: \"3f3a8f38-df67-4483-856e-9fe303820c63\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt" Apr 22 16:55:10.479621 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:10.479533 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3f3a8f38-df67-4483-856e-9fe303820c63-model-cache\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt\" (UID: \"3f3a8f38-df67-4483-856e-9fe303820c63\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt" Apr 22 16:55:10.479740 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:10.479719 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3f3a8f38-df67-4483-856e-9fe303820c63-kserve-provision-location\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt\" (UID: \"3f3a8f38-df67-4483-856e-9fe303820c63\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt" Apr 22 16:55:10.479804 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:10.479755 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3f3a8f38-df67-4483-856e-9fe303820c63-home\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt\" (UID: 
\"3f3a8f38-df67-4483-856e-9fe303820c63\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt" Apr 22 16:55:10.482077 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:10.482056 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3f3a8f38-df67-4483-856e-9fe303820c63-dshm\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt\" (UID: \"3f3a8f38-df67-4483-856e-9fe303820c63\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt" Apr 22 16:55:10.482451 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:10.482432 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3f3a8f38-df67-4483-856e-9fe303820c63-tls-certs\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt\" (UID: \"3f3a8f38-df67-4483-856e-9fe303820c63\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt" Apr 22 16:55:10.504311 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:10.504280 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-546gj\" (UniqueName: \"kubernetes.io/projected/3f3a8f38-df67-4483-856e-9fe303820c63-kube-api-access-546gj\") pod \"e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt\" (UID: \"3f3a8f38-df67-4483-856e-9fe303820c63\") " pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt" Apr 22 16:55:10.623570 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:10.623525 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt" Apr 22 16:55:10.758912 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:10.758884 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt"] Apr 22 16:55:10.761421 ip-10-0-137-144 kubenswrapper[2578]: W0422 16:55:10.761383 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f3a8f38_df67_4483_856e_9fe303820c63.slice/crio-1cafe77906cd139f4fa718a4e8b31e50fca917dee294bc84a68d2a3341168963 WatchSource:0}: Error finding container 1cafe77906cd139f4fa718a4e8b31e50fca917dee294bc84a68d2a3341168963: Status 404 returned error can't find the container with id 1cafe77906cd139f4fa718a4e8b31e50fca917dee294bc84a68d2a3341168963 Apr 22 16:55:11.559486 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:11.559446 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt" event={"ID":"3f3a8f38-df67-4483-856e-9fe303820c63","Type":"ContainerStarted","Data":"af2d5d65de322faededb2f29bf9bc1df2fa6576d8693a5c599f0d1775a44f8fd"} Apr 22 16:55:11.559486 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:11.559490 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt" event={"ID":"3f3a8f38-df67-4483-856e-9fe303820c63","Type":"ContainerStarted","Data":"1cafe77906cd139f4fa718a4e8b31e50fca917dee294bc84a68d2a3341168963"} Apr 22 16:55:16.581502 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:16.581464 2578 generic.go:358] "Generic (PLEG): container finished" podID="3f3a8f38-df67-4483-856e-9fe303820c63" containerID="af2d5d65de322faededb2f29bf9bc1df2fa6576d8693a5c599f0d1775a44f8fd" exitCode=0 Apr 22 16:55:16.581883 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:16.581540 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt" event={"ID":"3f3a8f38-df67-4483-856e-9fe303820c63","Type":"ContainerDied","Data":"af2d5d65de322faededb2f29bf9bc1df2fa6576d8693a5c599f0d1775a44f8fd"} Apr 22 16:55:17.586184 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:17.586142 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt" event={"ID":"3f3a8f38-df67-4483-856e-9fe303820c63","Type":"ContainerStarted","Data":"287068bcc0521ef2b8eb7f1c8038093347136532adf97d27a864658432a73693"} Apr 22 16:55:17.586565 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:17.586355 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt" Apr 22 16:55:17.606085 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:17.606041 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt" podStartSLOduration=7.328703323 podStartE2EDuration="7.606027506s" podCreationTimestamp="2026-04-22 16:55:10 +0000 UTC" firstStartedPulling="2026-04-22 16:55:16.582200197 +0000 UTC m=+2042.118930888" lastFinishedPulling="2026-04-22 16:55:16.859524382 +0000 UTC m=+2042.396255071" observedRunningTime="2026-04-22 16:55:17.604718269 +0000 UTC m=+2043.141448985" watchObservedRunningTime="2026-04-22 16:55:17.606027506 +0000 UTC m=+2043.142758217" Apr 22 16:55:28.602860 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:55:28.602805 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="llm/e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt" Apr 22 16:56:15.029940 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:56:15.029820 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/0.log" Apr 22 16:56:15.032410 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:56:15.032390 2578 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/0.log" Apr 22 16:57:04.605620 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:57:04.605530 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-66c9db867c-vk4lw"] Apr 22 16:57:04.606159 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:57:04.605785 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="opendatahub/maas-controller-66c9db867c-vk4lw" podUID="892a6c54-a84b-486c-a5b1-2f3dae4c3e1d" containerName="manager" containerID="cri-o://dc9332b6dbeca4cd7eb58b187d3b92dfbf90e9c3025166ddafd6cfe85389e260" gracePeriod=10 Apr 22 16:57:04.850558 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:57:04.850535 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="opendatahub/maas-controller-66c9db867c-vk4lw" Apr 22 16:57:04.965497 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:57:04.965407 2578 generic.go:358] "Generic (PLEG): container finished" podID="892a6c54-a84b-486c-a5b1-2f3dae4c3e1d" containerID="dc9332b6dbeca4cd7eb58b187d3b92dfbf90e9c3025166ddafd6cfe85389e260" exitCode=0 Apr 22 16:57:04.965497 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:57:04.965465 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="opendatahub/maas-controller-66c9db867c-vk4lw" Apr 22 16:57:04.965697 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:57:04.965492 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-66c9db867c-vk4lw" event={"ID":"892a6c54-a84b-486c-a5b1-2f3dae4c3e1d","Type":"ContainerDied","Data":"dc9332b6dbeca4cd7eb58b187d3b92dfbf90e9c3025166ddafd6cfe85389e260"} Apr 22 16:57:04.965697 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:57:04.965525 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="opendatahub/maas-controller-66c9db867c-vk4lw" event={"ID":"892a6c54-a84b-486c-a5b1-2f3dae4c3e1d","Type":"ContainerDied","Data":"8b46e3f1e0d8c71a2c15f5c5a887dc90ae79d8a8c747be7b99dad7caee2f7db1"} Apr 22 16:57:04.965697 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:57:04.965540 2578 scope.go:117] "RemoveContainer" containerID="dc9332b6dbeca4cd7eb58b187d3b92dfbf90e9c3025166ddafd6cfe85389e260" Apr 22 16:57:04.973470 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:57:04.973450 2578 scope.go:117] "RemoveContainer" containerID="dc9332b6dbeca4cd7eb58b187d3b92dfbf90e9c3025166ddafd6cfe85389e260" Apr 22 16:57:04.973728 ip-10-0-137-144 kubenswrapper[2578]: E0422 16:57:04.973710 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc9332b6dbeca4cd7eb58b187d3b92dfbf90e9c3025166ddafd6cfe85389e260\": container with ID starting with dc9332b6dbeca4cd7eb58b187d3b92dfbf90e9c3025166ddafd6cfe85389e260 not found: ID does not exist" containerID="dc9332b6dbeca4cd7eb58b187d3b92dfbf90e9c3025166ddafd6cfe85389e260" Apr 22 16:57:04.973788 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:57:04.973740 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc9332b6dbeca4cd7eb58b187d3b92dfbf90e9c3025166ddafd6cfe85389e260"} err="failed to get container status \"dc9332b6dbeca4cd7eb58b187d3b92dfbf90e9c3025166ddafd6cfe85389e260\": rpc error: 
code = NotFound desc = could not find container \"dc9332b6dbeca4cd7eb58b187d3b92dfbf90e9c3025166ddafd6cfe85389e260\": container with ID starting with dc9332b6dbeca4cd7eb58b187d3b92dfbf90e9c3025166ddafd6cfe85389e260 not found: ID does not exist" Apr 22 16:57:04.986548 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:57:04.986528 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2c5t\" (UniqueName: \"kubernetes.io/projected/892a6c54-a84b-486c-a5b1-2f3dae4c3e1d-kube-api-access-l2c5t\") pod \"892a6c54-a84b-486c-a5b1-2f3dae4c3e1d\" (UID: \"892a6c54-a84b-486c-a5b1-2f3dae4c3e1d\") " Apr 22 16:57:04.988419 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:57:04.988395 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/892a6c54-a84b-486c-a5b1-2f3dae4c3e1d-kube-api-access-l2c5t" (OuterVolumeSpecName: "kube-api-access-l2c5t") pod "892a6c54-a84b-486c-a5b1-2f3dae4c3e1d" (UID: "892a6c54-a84b-486c-a5b1-2f3dae4c3e1d"). InnerVolumeSpecName "kube-api-access-l2c5t". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 16:57:05.087154 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:57:05.087121 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l2c5t\" (UniqueName: \"kubernetes.io/projected/892a6c54-a84b-486c-a5b1-2f3dae4c3e1d-kube-api-access-l2c5t\") on node \"ip-10-0-137-144.ec2.internal\" DevicePath \"\"" Apr 22 16:57:05.280781 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:57:05.280696 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["opendatahub/maas-controller-66c9db867c-vk4lw"] Apr 22 16:57:05.283978 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:57:05.283947 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["opendatahub/maas-controller-66c9db867c-vk4lw"] Apr 22 16:57:07.021140 ip-10-0-137-144 kubenswrapper[2578]: I0422 16:57:07.021103 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="892a6c54-a84b-486c-a5b1-2f3dae4c3e1d" path="/var/lib/kubelet/pods/892a6c54-a84b-486c-a5b1-2f3dae4c3e1d/volumes" Apr 22 17:01:15.053372 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:01:15.053246 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/0.log" Apr 22 17:01:15.057582 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:01:15.057562 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/0.log" Apr 22 17:06:15.078020 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:06:15.077991 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/0.log" Apr 22 17:06:15.083824 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:06:15.083807 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/0.log" Apr 22 17:11:15.102562 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:11:15.102286 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/0.log" Apr 22 17:11:15.108406 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:11:15.108387 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/0.log" Apr 22 17:16:15.126578 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:16:15.126478 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/0.log" Apr 22 17:16:15.132729 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:16:15.132710 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/0.log" Apr 22 17:18:08.402872 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:08.402739 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-n77l2_f82494b1-74f4-4de6-92d9-33d72db1cb2c/manager/0.log" Apr 22 17:18:08.523281 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:08.523249 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-85c57ff49c-jk2c9_19d1ed1b-53db-4333-888d-2da49f00475c/maas-api/0.log" Apr 22 17:18:08.970735 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:08.970687 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-cl7hc_10306e03-001a-4b79-9b69-6d29e3cca8d3/manager/2.log" Apr 22 17:18:09.305155 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:09.305059 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-57c8d5d679-tbq2t_9b4f3977-b9fd-4410-bffc-359c193b411d/manager/0.log" Apr 22 17:18:09.407172 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:09.407135 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-d29b2_af1ddb96-1d08-448b-92f2-3ac60dc191d9/postgres/0.log" Apr 22 17:18:10.712911 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:10.712876 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-68hkw_726ec92c-3d20-4f47-b117-123c1c25d656/manager/0.log" Apr 22 17:18:10.821955 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:10.821921 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-gw5k5_4e99c593-4805-4eb9-9a96-92377e26f40c/manager/0.log" Apr 22 17:18:11.823997 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:11.823963 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-krwf6_91629e04-1b11-48e1-bd19-5f3bcc2d3cc7/discovery/0.log" Apr 22 17:18:11.930489 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:11.930463 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_kube-auth-proxy-644d48748b-22xlm_a4d5a411-7bea-4aa0-8513-5f299420fc6a/kube-auth-proxy/0.log" Apr 22 17:18:12.589204 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:12.589171 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt_3f3a8f38-df67-4483-856e-9fe303820c63/storage-initializer/0.log" Apr 22 17:18:12.596576 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:12.596547 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-2-simulated-kserve-7f849f6b56-g5hkt_3f3a8f38-df67-4483-856e-9fe303820c63/main/0.log" Apr 22 17:18:12.709738 ip-10-0-137-144 kubenswrapper[2578]: I0422 
17:18:12.709702 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2_1324db02-b05b-4630-94b8-4d288ba0a670/main/0.log" Apr 22 17:18:12.729628 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:12.729598 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-distinct-simulated-kserve-7bb4cdb4d7-z7kv2_1324db02-b05b-4630-94b8-4d288ba0a670/storage-initializer/0.log" Apr 22 17:18:12.843372 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:12.843268 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-zjf2d_a690eff3-c68d-409d-b98f-56f546e296a2/storage-initializer/0.log" Apr 22 17:18:12.850975 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:12.850949 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-trlp-test-simulated-kserve-84db68679b-zjf2d_a690eff3-c68d-409d-b98f-56f546e296a2/main/0.log" Apr 22 17:18:12.964354 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:12.964323 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb_7a26cf43-cd8b-4a4c-9776-417eb13cf1a1/main/0.log" Apr 22 17:18:12.974532 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:12.974502 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_e2e-unconfigured-facebook-opt-125m-simulated-kserve-75cdcc29fvb_7a26cf43-cd8b-4a4c-9776-417eb13cf1a1/storage-initializer/0.log" Apr 22 17:18:13.078323 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:13.078290 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm_dcc9d8ce-f795-4f82-871d-9f48e896da4e/storage-initializer/0.log" Apr 22 17:18:13.085428 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:13.085398 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/llm_facebook-opt-125m-simulated-kserve-8f8dc67b7-s57sm_dcc9d8ce-f795-4f82-871d-9f48e896da4e/main/0.log" Apr 22 17:18:13.190017 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:13.189929 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb_3b01e8df-84a6-4e7e-b822-639a88e25d51/storage-initializer/0.log" Apr 22 17:18:13.197229 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:13.197194 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/llm_premium-simulated-simulated-premium-kserve-6b97b89985-9j7vb_3b01e8df-84a6-4e7e-b822-639a88e25d51/main/0.log" Apr 22 17:18:20.113261 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:20.113231 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-vzf4h_05520239-9b93-4ae7-abd6-fd7042ec092f/global-pull-secret-syncer/0.log" Apr 22 17:18:20.242949 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:20.242912 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-vphs5_4098aef1-2fba-4928-ac90-0a2b8fdc8510/konnectivity-agent/0.log" Apr 22 17:18:20.267474 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:20.267445 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-144.ec2.internal_ec42b4252b8124bf29765450697dabd4/haproxy/0.log" Apr 22 17:18:24.430279 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:24.430240 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-657f44b778-68hkw_726ec92c-3d20-4f47-b117-123c1c25d656/manager/0.log" Apr 22 17:18:24.454986 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:24.454950 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_dns-operator-controller-manager-648d5c98bc-gw5k5_4e99c593-4805-4eb9-9a96-92377e26f40c/manager/0.log" Apr 22 17:18:26.596402 ip-10-0-137-144 kubenswrapper[2578]: I0422 
17:18:26.596375 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-c6xcj_cfd67519-226c-41ad-a582-51e6b68d30cc/node-exporter/0.log" Apr 22 17:18:26.621855 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:26.621810 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-c6xcj_cfd67519-226c-41ad-a582-51e6b68d30cc/kube-rbac-proxy/0.log" Apr 22 17:18:26.646710 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:26.646687 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-c6xcj_cfd67519-226c-41ad-a582-51e6b68d30cc/init-textfile/0.log" Apr 22 17:18:28.360458 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:28.360422 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-6mkdh_de2e09dd-d655-4750-a773-af55bcb94210/networking-console-plugin/0.log" Apr 22 17:18:29.059153 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:29.059118 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-lqwg2/perf-node-gather-daemonset-lq6st"] Apr 22 17:18:29.059477 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:29.059465 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="892a6c54-a84b-486c-a5b1-2f3dae4c3e1d" containerName="manager" Apr 22 17:18:29.059521 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:29.059479 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="892a6c54-a84b-486c-a5b1-2f3dae4c3e1d" containerName="manager" Apr 22 17:18:29.059560 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:29.059548 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="892a6c54-a84b-486c-a5b1-2f3dae4c3e1d" containerName="manager" Apr 22 17:18:29.062791 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:29.062771 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-lq6st" Apr 22 17:18:29.065323 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:29.065301 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lqwg2\"/\"openshift-service-ca.crt\"" Apr 22 17:18:29.066231 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:29.066213 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-lqwg2\"/\"default-dockercfg-w4hxb\"" Apr 22 17:18:29.066333 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:29.066272 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-lqwg2\"/\"kube-root-ca.crt\"" Apr 22 17:18:29.072594 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:29.072571 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lqwg2/perf-node-gather-daemonset-lq6st"] Apr 22 17:18:29.130056 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:29.130011 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42crh\" (UniqueName: \"kubernetes.io/projected/9408aa03-43a5-4d0d-9f15-1554aadd3981-kube-api-access-42crh\") pod \"perf-node-gather-daemonset-lq6st\" (UID: \"9408aa03-43a5-4d0d-9f15-1554aadd3981\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-lq6st" Apr 22 17:18:29.130235 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:29.130081 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9408aa03-43a5-4d0d-9f15-1554aadd3981-lib-modules\") pod \"perf-node-gather-daemonset-lq6st\" (UID: \"9408aa03-43a5-4d0d-9f15-1554aadd3981\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-lq6st" Apr 22 17:18:29.130235 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:29.130112 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9408aa03-43a5-4d0d-9f15-1554aadd3981-proc\") pod \"perf-node-gather-daemonset-lq6st\" (UID: \"9408aa03-43a5-4d0d-9f15-1554aadd3981\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-lq6st" Apr 22 17:18:29.130235 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:29.130143 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9408aa03-43a5-4d0d-9f15-1554aadd3981-sys\") pod \"perf-node-gather-daemonset-lq6st\" (UID: \"9408aa03-43a5-4d0d-9f15-1554aadd3981\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-lq6st" Apr 22 17:18:29.130354 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:29.130258 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9408aa03-43a5-4d0d-9f15-1554aadd3981-podres\") pod \"perf-node-gather-daemonset-lq6st\" (UID: \"9408aa03-43a5-4d0d-9f15-1554aadd3981\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-lq6st" Apr 22 17:18:29.230715 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:29.230677 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9408aa03-43a5-4d0d-9f15-1554aadd3981-podres\") pod \"perf-node-gather-daemonset-lq6st\" (UID: \"9408aa03-43a5-4d0d-9f15-1554aadd3981\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-lq6st" Apr 22 17:18:29.230944 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:29.230719 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-42crh\" (UniqueName: \"kubernetes.io/projected/9408aa03-43a5-4d0d-9f15-1554aadd3981-kube-api-access-42crh\") pod \"perf-node-gather-daemonset-lq6st\" (UID: \"9408aa03-43a5-4d0d-9f15-1554aadd3981\") " 
pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-lq6st" Apr 22 17:18:29.230944 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:29.230838 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9408aa03-43a5-4d0d-9f15-1554aadd3981-lib-modules\") pod \"perf-node-gather-daemonset-lq6st\" (UID: \"9408aa03-43a5-4d0d-9f15-1554aadd3981\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-lq6st" Apr 22 17:18:29.230944 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:29.230906 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9408aa03-43a5-4d0d-9f15-1554aadd3981-proc\") pod \"perf-node-gather-daemonset-lq6st\" (UID: \"9408aa03-43a5-4d0d-9f15-1554aadd3981\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-lq6st" Apr 22 17:18:29.230944 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:29.230933 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9408aa03-43a5-4d0d-9f15-1554aadd3981-lib-modules\") pod \"perf-node-gather-daemonset-lq6st\" (UID: \"9408aa03-43a5-4d0d-9f15-1554aadd3981\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-lq6st" Apr 22 17:18:29.230944 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:29.230839 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/9408aa03-43a5-4d0d-9f15-1554aadd3981-podres\") pod \"perf-node-gather-daemonset-lq6st\" (UID: \"9408aa03-43a5-4d0d-9f15-1554aadd3981\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-lq6st" Apr 22 17:18:29.231163 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:29.230934 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9408aa03-43a5-4d0d-9f15-1554aadd3981-sys\") pod 
\"perf-node-gather-daemonset-lq6st\" (UID: \"9408aa03-43a5-4d0d-9f15-1554aadd3981\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-lq6st" Apr 22 17:18:29.231163 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:29.230979 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9408aa03-43a5-4d0d-9f15-1554aadd3981-sys\") pod \"perf-node-gather-daemonset-lq6st\" (UID: \"9408aa03-43a5-4d0d-9f15-1554aadd3981\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-lq6st" Apr 22 17:18:29.231163 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:29.231012 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/9408aa03-43a5-4d0d-9f15-1554aadd3981-proc\") pod \"perf-node-gather-daemonset-lq6st\" (UID: \"9408aa03-43a5-4d0d-9f15-1554aadd3981\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-lq6st" Apr 22 17:18:29.239731 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:29.239705 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-42crh\" (UniqueName: \"kubernetes.io/projected/9408aa03-43a5-4d0d-9f15-1554aadd3981-kube-api-access-42crh\") pod \"perf-node-gather-daemonset-lq6st\" (UID: \"9408aa03-43a5-4d0d-9f15-1554aadd3981\") " pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-lq6st" Apr 22 17:18:29.374533 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:29.374499 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-lq6st" Apr 22 17:18:29.497692 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:29.497668 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-lqwg2/perf-node-gather-daemonset-lq6st"] Apr 22 17:18:29.500053 ip-10-0-137-144 kubenswrapper[2578]: W0422 17:18:29.500027 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod9408aa03_43a5_4d0d_9f15_1554aadd3981.slice/crio-93b72dee9f5de76f9e2d4cb6462525497499a259fb2839c289760237848cf09b WatchSource:0}: Error finding container 93b72dee9f5de76f9e2d4cb6462525497499a259fb2839c289760237848cf09b: Status 404 returned error can't find the container with id 93b72dee9f5de76f9e2d4cb6462525497499a259fb2839c289760237848cf09b Apr 22 17:18:29.502000 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:29.501983 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 17:18:30.407987 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:30.407951 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-lq6st" event={"ID":"9408aa03-43a5-4d0d-9f15-1554aadd3981","Type":"ContainerStarted","Data":"079ec6dc38906659b19626d39e8570330c4e8c19b5fb3f028f4f7e02f03e9c00"} Apr 22 17:18:30.407987 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:30.407989 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-lq6st" event={"ID":"9408aa03-43a5-4d0d-9f15-1554aadd3981","Type":"ContainerStarted","Data":"93b72dee9f5de76f9e2d4cb6462525497499a259fb2839c289760237848cf09b"} Apr 22 17:18:30.408444 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:30.408072 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-lq6st" Apr 22 17:18:30.424229 ip-10-0-137-144 
kubenswrapper[2578]: I0422 17:18:30.424180 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-lq6st" podStartSLOduration=1.424164708 podStartE2EDuration="1.424164708s" podCreationTimestamp="2026-04-22 17:18:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 17:18:30.423265535 +0000 UTC m=+3435.959996272" watchObservedRunningTime="2026-04-22 17:18:30.424164708 +0000 UTC m=+3435.960895420" Apr 22 17:18:30.747309 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:30.747224 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2fjmx_b427d2da-7345-4266-8029-a5e4953ca8db/dns/0.log" Apr 22 17:18:30.802153 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:30.802127 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-2fjmx_b427d2da-7345-4266-8029-a5e4953ca8db/kube-rbac-proxy/0.log" Apr 22 17:18:30.983114 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:30.983083 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-dl86z_b918f41f-7884-40cb-ac36-9c716d27d92f/dns-node-resolver/0.log" Apr 22 17:18:31.547192 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:31.547159 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-xzk9d_3e6d4c51-d843-4eca-9406-7639d52380a0/node-ca/0.log" Apr 22 17:18:32.513235 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:32.513192 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-krwf6_91629e04-1b11-48e1-bd19-5f3bcc2d3cc7/discovery/0.log" Apr 22 17:18:32.534123 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:32.534088 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-ingress_kube-auth-proxy-644d48748b-22xlm_a4d5a411-7bea-4aa0-8513-5f299420fc6a/kube-auth-proxy/0.log" Apr 22 17:18:33.198519 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:33.198485 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-wt7t4_a17fc401-3295-45ad-8e4e-b5c7a99047fd/serve-healthcheck-canary/0.log" Apr 22 17:18:33.715351 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:33.715317 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hjb78_a657956e-d988-4229-8de7-4484bebd1818/kube-rbac-proxy/0.log" Apr 22 17:18:33.733254 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:33.733214 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hjb78_a657956e-d988-4229-8de7-4484bebd1818/exporter/0.log" Apr 22 17:18:33.753801 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:33.753775 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-hjb78_a657956e-d988-4229-8de7-4484bebd1818/extractor/0.log" Apr 22 17:18:35.799816 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:35.799781 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_kserve-controller-manager-856948b99f-n77l2_f82494b1-74f4-4de6-92d9-33d72db1cb2c/manager/0.log" Apr 22 17:18:35.840610 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:35.840575 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_maas-api-85c57ff49c-jk2c9_19d1ed1b-53db-4333-888d-2da49f00475c/maas-api/0.log" Apr 22 17:18:35.972860 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:35.972782 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-cl7hc_10306e03-001a-4b79-9b69-6d29e3cca8d3/manager/1.log" Apr 22 17:18:35.994624 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:35.994592 2578 log.go:25] "Finished 
parsing log file" path="/var/log/pods/opendatahub_odh-model-controller-858dbf95b8-cl7hc_10306e03-001a-4b79-9b69-6d29e3cca8d3/manager/2.log" Apr 22 17:18:36.075542 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:36.075484 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_opendatahub-operator-controller-manager-57c8d5d679-tbq2t_9b4f3977-b9fd-4410-bffc-359c193b411d/manager/0.log" Apr 22 17:18:36.096244 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:36.096209 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/opendatahub_postgres-868db5846d-d29b2_af1ddb96-1d08-448b-92f2-3ac60dc191d9/postgres/0.log" Apr 22 17:18:36.420495 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:36.420419 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-lqwg2/perf-node-gather-daemonset-lq6st" Apr 22 17:18:43.150810 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:43.150774 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s5rnp_308b982f-cb30-4a62-92f1-e88ca12b210e/kube-multus-additional-cni-plugins/0.log" Apr 22 17:18:43.170927 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:43.170894 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s5rnp_308b982f-cb30-4a62-92f1-e88ca12b210e/egress-router-binary-copy/0.log" Apr 22 17:18:43.190221 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:43.190187 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s5rnp_308b982f-cb30-4a62-92f1-e88ca12b210e/cni-plugins/0.log" Apr 22 17:18:43.208679 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:43.208649 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s5rnp_308b982f-cb30-4a62-92f1-e88ca12b210e/bond-cni-plugin/0.log" Apr 22 17:18:43.228509 ip-10-0-137-144 kubenswrapper[2578]: I0422 
17:18:43.228482 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s5rnp_308b982f-cb30-4a62-92f1-e88ca12b210e/routeoverride-cni/0.log" Apr 22 17:18:43.247297 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:43.247269 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s5rnp_308b982f-cb30-4a62-92f1-e88ca12b210e/whereabouts-cni-bincopy/0.log" Apr 22 17:18:43.266747 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:43.266721 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-s5rnp_308b982f-cb30-4a62-92f1-e88ca12b210e/whereabouts-cni/0.log" Apr 22 17:18:43.327989 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:43.327959 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-jjbms_7d4f23e4-772f-4bf2-86a3-25eb1a3cc274/kube-multus/0.log" Apr 22 17:18:43.463692 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:43.463594 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zzttm_01d73bcf-a30e-4dfb-ab2d-863123f999c7/network-metrics-daemon/0.log" Apr 22 17:18:43.481325 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:43.481298 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-zzttm_01d73bcf-a30e-4dfb-ab2d-863123f999c7/kube-rbac-proxy/0.log" Apr 22 17:18:44.503981 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:44.503952 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-controller/0.log" Apr 22 17:18:44.522264 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:44.522239 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/0.log" Apr 22 17:18:44.552590 
ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:44.552555 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovn-acl-logging/1.log" Apr 22 17:18:44.582487 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:44.582464 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/kube-rbac-proxy-node/0.log" Apr 22 17:18:44.605146 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:44.605099 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/kube-rbac-proxy-ovn-metrics/0.log" Apr 22 17:18:44.621321 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:44.621293 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/northd/0.log" Apr 22 17:18:44.642690 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:44.642660 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/nbdb/0.log" Apr 22 17:18:44.665324 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:44.665272 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/sbdb/0.log" Apr 22 17:18:44.831418 ip-10-0-137-144 kubenswrapper[2578]: I0422 17:18:44.831387 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-glprs_8a355c35-cd73-4888-9d7b-1841477e589c/ovnkube-controller/0.log"